Oct 08 21:48:19 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 08 21:48:19 crc restorecon[4678]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 08 21:48:19 crc restorecon[4678]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc 
restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 21:48:19 crc 
restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 
21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 
crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 
21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 21:48:19 crc 
restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc 
restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 21:48:19 crc restorecon[4678]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc 
restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 
crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc 
restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc 
restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc 
restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc 
restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:19 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:20 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 21:48:20 crc restorecon[4678]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 21:48:20 crc 
restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 21:48:20 crc restorecon[4678]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 21:48:20 crc restorecon[4678]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 21:48:20 crc restorecon[4678]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 08 21:48:21 crc kubenswrapper[4739]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 08 21:48:21 crc kubenswrapper[4739]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 08 21:48:21 crc kubenswrapper[4739]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 08 21:48:21 crc kubenswrapper[4739]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 08 21:48:21 crc kubenswrapper[4739]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 08 21:48:21 crc kubenswrapper[4739]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.512521 4739 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515211 4739 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515226 4739 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515231 4739 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515236 4739 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515240 4739 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515244 4739 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515247 4739 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515251 4739 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515255 4739 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 
21:48:21.515259 4739 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515262 4739 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515266 4739 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515270 4739 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515273 4739 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515276 4739 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515280 4739 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515283 4739 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515287 4739 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515290 4739 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515294 4739 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515298 4739 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515302 4739 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515305 4739 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515310 4739 feature_gate.go:353] Setting GA feature gate 
ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515314 4739 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515319 4739 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515322 4739 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515326 4739 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515330 4739 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515333 4739 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515337 4739 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515340 4739 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515343 4739 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515347 4739 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515350 4739 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515354 4739 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515357 4739 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515361 4739 
feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515365 4739 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515369 4739 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515372 4739 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515376 4739 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515379 4739 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515383 4739 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515386 4739 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515390 4739 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515393 4739 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515397 4739 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515400 4739 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515405 4739 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515409 4739 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515413 4739 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. 
It will be removed in a future release. Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515418 4739 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515422 4739 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515427 4739 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515431 4739 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515434 4739 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515438 4739 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515442 4739 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515446 4739 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515450 4739 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515454 4739 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515457 4739 feature_gate.go:330] unrecognized feature gate: Example Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515462 4739 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515467 4739 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515471 4739 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515475 4739 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515479 4739 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515483 4739 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515486 4739 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.515490 4739 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516289 4739 flags.go:64] FLAG: --address="0.0.0.0" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516303 4739 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516312 4739 flags.go:64] FLAG: --anonymous-auth="true" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516324 4739 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516332 4739 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516337 4739 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516345 4739 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516351 4739 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516356 4739 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516362 4739 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 08 21:48:21 crc 
kubenswrapper[4739]: I1008 21:48:21.516367 4739 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516373 4739 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516378 4739 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516383 4739 flags.go:64] FLAG: --cgroup-root="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516387 4739 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516392 4739 flags.go:64] FLAG: --client-ca-file="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516397 4739 flags.go:64] FLAG: --cloud-config="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516404 4739 flags.go:64] FLAG: --cloud-provider="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516408 4739 flags.go:64] FLAG: --cluster-dns="[]" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516415 4739 flags.go:64] FLAG: --cluster-domain="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516419 4739 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516423 4739 flags.go:64] FLAG: --config-dir="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516427 4739 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516431 4739 flags.go:64] FLAG: --container-log-max-files="5" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516437 4739 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516441 4739 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516446 4739 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 08 
21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516450 4739 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516454 4739 flags.go:64] FLAG: --contention-profiling="false" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516458 4739 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516463 4739 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516468 4739 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516472 4739 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516478 4739 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516482 4739 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516486 4739 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516491 4739 flags.go:64] FLAG: --enable-load-reader="false" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516496 4739 flags.go:64] FLAG: --enable-server="true" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516501 4739 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516508 4739 flags.go:64] FLAG: --event-burst="100" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516513 4739 flags.go:64] FLAG: --event-qps="50" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516518 4739 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516523 4739 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516528 4739 flags.go:64] FLAG: --eviction-hard="" Oct 08 21:48:21 
crc kubenswrapper[4739]: I1008 21:48:21.516535 4739 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516541 4739 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516546 4739 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516551 4739 flags.go:64] FLAG: --eviction-soft="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516557 4739 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516562 4739 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516566 4739 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516570 4739 flags.go:64] FLAG: --experimental-mounter-path="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516574 4739 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516578 4739 flags.go:64] FLAG: --fail-swap-on="true" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516582 4739 flags.go:64] FLAG: --feature-gates="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516588 4739 flags.go:64] FLAG: --file-check-frequency="20s" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516593 4739 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516597 4739 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516601 4739 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516606 4739 flags.go:64] FLAG: --healthz-port="10248" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516611 4739 flags.go:64] FLAG: --help="false" Oct 08 21:48:21 
crc kubenswrapper[4739]: I1008 21:48:21.516617 4739 flags.go:64] FLAG: --hostname-override="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516622 4739 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516627 4739 flags.go:64] FLAG: --http-check-frequency="20s" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516632 4739 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516637 4739 flags.go:64] FLAG: --image-credential-provider-config="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516642 4739 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516648 4739 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516654 4739 flags.go:64] FLAG: --image-service-endpoint="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516659 4739 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516664 4739 flags.go:64] FLAG: --kube-api-burst="100" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516669 4739 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516674 4739 flags.go:64] FLAG: --kube-api-qps="50" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516678 4739 flags.go:64] FLAG: --kube-reserved="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516682 4739 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516686 4739 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516691 4739 flags.go:64] FLAG: --kubelet-cgroups="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516696 4739 flags.go:64] FLAG: 
--local-storage-capacity-isolation="true" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516701 4739 flags.go:64] FLAG: --lock-file="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516706 4739 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516711 4739 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516716 4739 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516725 4739 flags.go:64] FLAG: --log-json-split-stream="false" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516730 4739 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516735 4739 flags.go:64] FLAG: --log-text-split-stream="false" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516739 4739 flags.go:64] FLAG: --logging-format="text" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516744 4739 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516750 4739 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516755 4739 flags.go:64] FLAG: --manifest-url="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516760 4739 flags.go:64] FLAG: --manifest-url-header="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516767 4739 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516773 4739 flags.go:64] FLAG: --max-open-files="1000000" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516780 4739 flags.go:64] FLAG: --max-pods="110" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516785 4739 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516791 4739 flags.go:64] FLAG: 
--maximum-dead-containers-per-container="1" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516796 4739 flags.go:64] FLAG: --memory-manager-policy="None" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516801 4739 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516807 4739 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516812 4739 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516817 4739 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516874 4739 flags.go:64] FLAG: --node-status-max-images="50" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516881 4739 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516887 4739 flags.go:64] FLAG: --oom-score-adj="-999" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516892 4739 flags.go:64] FLAG: --pod-cidr="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516897 4739 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516905 4739 flags.go:64] FLAG: --pod-manifest-path="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516910 4739 flags.go:64] FLAG: --pod-max-pids="-1" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516916 4739 flags.go:64] FLAG: --pods-per-core="0" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516921 4739 flags.go:64] FLAG: --port="10250" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516926 4739 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 08 21:48:21 crc 
kubenswrapper[4739]: I1008 21:48:21.516931 4739 flags.go:64] FLAG: --provider-id="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516936 4739 flags.go:64] FLAG: --qos-reserved="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516941 4739 flags.go:64] FLAG: --read-only-port="10255" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516946 4739 flags.go:64] FLAG: --register-node="true" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516952 4739 flags.go:64] FLAG: --register-schedulable="true" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516957 4739 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516966 4739 flags.go:64] FLAG: --registry-burst="10" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516971 4739 flags.go:64] FLAG: --registry-qps="5" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516976 4739 flags.go:64] FLAG: --reserved-cpus="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516981 4739 flags.go:64] FLAG: --reserved-memory="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516987 4739 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516992 4739 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.516997 4739 flags.go:64] FLAG: --rotate-certificates="false" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.517002 4739 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.517007 4739 flags.go:64] FLAG: --runonce="false" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.517012 4739 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.517017 4739 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 
21:48:21.517022 4739 flags.go:64] FLAG: --seccomp-default="false" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.517026 4739 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.517031 4739 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.517036 4739 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.517040 4739 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.517044 4739 flags.go:64] FLAG: --storage-driver-password="root" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.517048 4739 flags.go:64] FLAG: --storage-driver-secure="false" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.517052 4739 flags.go:64] FLAG: --storage-driver-table="stats" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.517057 4739 flags.go:64] FLAG: --storage-driver-user="root" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.517061 4739 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.517065 4739 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.517069 4739 flags.go:64] FLAG: --system-cgroups="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.517072 4739 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.517079 4739 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.517083 4739 flags.go:64] FLAG: --tls-cert-file="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.517087 4739 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.517093 4739 flags.go:64] FLAG: --tls-min-version="" Oct 08 21:48:21 crc 
kubenswrapper[4739]: I1008 21:48:21.517097 4739 flags.go:64] FLAG: --tls-private-key-file="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.517101 4739 flags.go:64] FLAG: --topology-manager-policy="none" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.517105 4739 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.517109 4739 flags.go:64] FLAG: --topology-manager-scope="container" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.517113 4739 flags.go:64] FLAG: --v="2" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.517119 4739 flags.go:64] FLAG: --version="false" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.517125 4739 flags.go:64] FLAG: --vmodule="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.517130 4739 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.517134 4739 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517241 4739 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517248 4739 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517253 4739 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517257 4739 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517263 4739 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517268 4739 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517272 4739 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517283 4739 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517286 4739 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517290 4739 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517295 4739 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517302 4739 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517308 4739 feature_gate.go:330] unrecognized feature gate: Example Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517313 4739 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517318 4739 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517322 4739 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517327 4739 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517332 4739 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517336 4739 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517339 4739 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517344 4739 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517347 4739 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517351 4739 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517355 4739 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517358 4739 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517362 4739 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517365 4739 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517369 4739 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517372 4739 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517378 4739 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517381 4739 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517385 4739 
feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517388 4739 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517392 4739 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517395 4739 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517399 4739 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517403 4739 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517406 4739 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517410 4739 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517416 4739 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517419 4739 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517423 4739 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517428 4739 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517432 4739 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517437 4739 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517441 4739 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 08 
21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517446 4739 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517450 4739 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517453 4739 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517457 4739 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517461 4739 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517465 4739 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517470 4739 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517473 4739 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517477 4739 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517480 4739 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517484 4739 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517487 4739 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517491 4739 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517494 4739 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517497 4739 
feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517503 4739 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517506 4739 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517510 4739 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517513 4739 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517517 4739 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517520 4739 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517524 4739 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517527 4739 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517531 4739 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.517535 4739 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.517551 4739 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 08 
21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.537472 4739 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.537521 4739 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537634 4739 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537645 4739 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537651 4739 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537657 4739 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537663 4739 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537670 4739 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537676 4739 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537681 4739 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537686 4739 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537692 4739 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537698 4739 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537703 4739 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537709 4739 feature_gate.go:330] 
unrecognized feature gate: VSphereStaticIPs Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537714 4739 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537720 4739 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537726 4739 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537733 4739 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537738 4739 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537743 4739 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537749 4739 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537754 4739 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537760 4739 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537765 4739 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537771 4739 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537778 4739 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537786 4739 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537793 4739 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537801 4739 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537807 4739 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537813 4739 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537819 4739 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537826 4739 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537831 4739 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537837 4739 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537843 4739 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537849 4739 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537854 4739 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537859 4739 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537864 4739 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537870 
4739 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537875 4739 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537881 4739 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537887 4739 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537893 4739 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537898 4739 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537903 4739 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537909 4739 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537914 4739 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537921 4739 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537929 4739 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537934 4739 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537940 4739 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537945 4739 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537951 4739 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537956 4739 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537962 4739 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537967 4739 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537975 4739 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537981 4739 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537986 4739 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537993 4739 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.537999 4739 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538006 4739 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538015 4739 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538021 4739 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538027 4739 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538034 4739 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538041 4739 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538047 4739 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538053 4739 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538059 4739 feature_gate.go:330] unrecognized feature gate: Example Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.538070 4739 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538293 4739 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538306 4739 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. 
It will be removed in a future release. Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538315 4739 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538322 4739 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538328 4739 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538334 4739 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538340 4739 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538347 4739 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538352 4739 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538358 4739 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538364 4739 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538369 4739 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538375 4739 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538381 4739 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538387 4739 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538393 4739 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538398 4739 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538404 4739 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538409 4739 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538414 4739 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538420 4739 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538425 4739 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538430 4739 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538435 4739 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538440 4739 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538448 4739 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538455 4739 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538461 4739 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538467 4739 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538473 4739 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538479 4739 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538485 4739 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538490 4739 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538495 4739 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538500 4739 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538505 4739 feature_gate.go:330] unrecognized feature gate: Example Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538511 4739 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538517 4739 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538522 4739 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538527 4739 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538533 4739 feature_gate.go:330] unrecognized feature gate: 
MultiArchInstallAzure Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538538 4739 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538543 4739 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538549 4739 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538554 4739 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538560 4739 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538565 4739 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538571 4739 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538576 4739 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538582 4739 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538587 4739 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538592 4739 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538598 4739 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538603 4739 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538608 4739 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538613 4739 
feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538618 4739 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538624 4739 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538629 4739 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538634 4739 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538639 4739 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538644 4739 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538650 4739 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538656 4739 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538664 4739 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538670 4739 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538676 4739 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538682 4739 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538688 4739 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538694 4739 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.538700 4739 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.538708 4739 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.540016 4739 server.go:940] "Client rotation is on, will bootstrap in background" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.545238 4739 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.545355 4739 certificate_store.go:130] Loading cert/key pair from 
"/var/lib/kubelet/pki/kubelet-client-current.pem". Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.547225 4739 server.go:997] "Starting client certificate rotation" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.547263 4739 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.548550 4739 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-05 06:25:42.534233431 +0000 UTC Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.548727 4739 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1376h37m20.985512353s for next certificate rotation Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.578698 4739 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.582011 4739 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.608700 4739 log.go:25] "Validated CRI v1 runtime API" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.659912 4739 log.go:25] "Validated CRI v1 image API" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.662694 4739 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.669245 4739 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-08-20-51-52-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.669314 4739 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} 
/dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.700731 4739 manager.go:217] Machine: {Timestamp:2025-10-08 21:48:21.695781674 +0000 UTC m=+1.521167444 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:e102a271-0593-45af-a6b3-5b473c7eebd3 BootID:06df6f1f-7503-4cee-a52c-383dcfb4609d Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:3c:d6:af Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 
Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:3c:d6:af Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:f7:1b:9e Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:da:42:0b Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:23:53:da Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:85:d9:85 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:22:3c:82:f1:e1:71 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ce:90:b3:99:d8:d1 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] 
SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.701003 4739 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.701275 4739 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.701844 4739 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.702075 4739 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.702124 4739 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.702398 4739 topology_manager.go:138] "Creating topology manager with none policy" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.702407 4739 container_manager_linux.go:303] "Creating device plugin manager" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.713885 4739 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.713926 4739 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.718107 4739 state_mem.go:36] "Initialized new in-memory state store" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.718385 4739 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.731241 4739 kubelet.go:418] "Attempting to sync node with API server" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.731351 4739 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.731496 4739 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.731517 4739 kubelet.go:324] "Adding apiserver pod source" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.731534 4739 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 
21:48:21.739041 4739 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.741284 4739 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.743454 4739 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Oct 08 21:48:21 crc kubenswrapper[4739]: E1008 21:48:21.743575 4739 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.743553 4739 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Oct 08 21:48:21 crc kubenswrapper[4739]: E1008 21:48:21.743669 4739 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.743818 4739 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 08 21:48:21 crc 
kubenswrapper[4739]: I1008 21:48:21.748663 4739 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.748685 4739 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.748693 4739 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.748699 4739 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.748711 4739 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.748718 4739 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.748726 4739 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.748738 4739 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.748746 4739 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.748754 4739 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.748766 4739 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.748774 4739 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.748795 4739 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.749353 4739 server.go:1280] "Started kubelet" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 
21:48:21.749446 4739 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.749516 4739 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.749696 4739 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.750277 4739 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.751078 4739 server.go:460] "Adding debug handlers to kubelet server" Oct 08 21:48:21 crc systemd[1]: Started Kubernetes Kubelet. Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.757920 4739 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.758205 4739 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.758271 4739 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.758292 4739 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.758271 4739 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 16:39:54.421866976 +0000 UTC Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.758313 4739 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1050h51m32.663556857s for next certificate rotation Oct 08 21:48:21 crc kubenswrapper[4739]: E1008 21:48:21.758372 4739 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.758775 4739 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.759350 4739 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Oct 08 21:48:21 crc kubenswrapper[4739]: E1008 21:48:21.759412 4739 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.759509 4739 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.759585 4739 factory.go:55] Registering systemd factory Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.759598 4739 factory.go:221] Registration of the systemd container factory successfully Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.765493 4739 factory.go:153] Registering CRI-O factory Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.765769 4739 factory.go:221] Registration of the crio container factory successfully Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.765856 4739 factory.go:103] Registering Raw factory Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.765939 4739 manager.go:1196] Started watching for new ooms 
in manager Oct 08 21:48:21 crc kubenswrapper[4739]: E1008 21:48:21.765664 4739 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" interval="200ms" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.766789 4739 manager.go:319] Starting recovery of all containers Oct 08 21:48:21 crc kubenswrapper[4739]: E1008 21:48:21.775969 4739 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.5:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186ca2634ba0d8e0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-08 21:48:21.749315808 +0000 UTC m=+1.574701558,LastTimestamp:2025-10-08 21:48:21.749315808 +0000 UTC m=+1.574701558,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.781401 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.781498 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: 
I1008 21:48:21.781531 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.781559 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.781588 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.781613 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.781640 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.781666 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.781696 4739 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.781723 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.781749 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.781826 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.781855 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.781887 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.782087 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.782119 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.782211 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.782243 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.782269 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.782296 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.782323 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.782350 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.782377 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.782447 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.782488 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.782521 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.782553 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.782583 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.782616 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.782647 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.782679 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.782786 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.782819 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" 
seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.782846 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.782874 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.782902 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.782927 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.782952 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.782979 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: 
I1008 21:48:21.783004 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.783035 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.783063 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.783089 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.783118 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.783178 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.783217 4739 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.783245 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.783271 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.783301 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.783325 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.783362 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.783397 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.783431 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.783467 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.783497 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.783526 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.783565 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.783591 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.783617 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.783643 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.783705 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.783741 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.783766 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.783794 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.783824 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.783853 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.783883 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.783910 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.783936 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.783964 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.783994 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.784019 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.784043 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.784068 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.784095 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.784118 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.784141 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.784201 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.784225 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.784251 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.784279 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.784301 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.784327 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.784351 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.784376 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.784400 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.784424 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.784450 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.784484 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.784512 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.784539 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.784565 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.784594 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.784754 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.784795 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.784821 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.784852 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.784878 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.784908 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.784936 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.784990 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.785018 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.785052 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.785079 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.785118 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.785234 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.785280 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.785314 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.785352 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.785378 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.785408 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.785438 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.785471 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.785523 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.785553 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.785589 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.785617 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.785643 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.785669 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.785698 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.785729 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.785754 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.785781 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.785810 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.785836 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.785862 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.785888 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.785915 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.785944 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.785971 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.785997 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786023 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786052 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786076 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786100 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786128 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786241 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786273 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786301 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786326 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786357 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786386 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786412 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786439 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786466 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786491 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786517 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786545 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786574 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786603 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786630 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786660 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786691 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786719 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786746 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786774 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786803 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786832 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786860 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786886 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786918 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786946 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.786973 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787003 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787032 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787059 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787089 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787114 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787138 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787210 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787237 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787267 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787297 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787325 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787352 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787379 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787412 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787439 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787471 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787508 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787534 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787561 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787587 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787611 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787637 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787665 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787693 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787719 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787744 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787772 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787798 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787825 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5"
volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787853 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787879 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787903 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787929 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787955 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.787978 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" 
seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.788001 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.788025 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.790278 4739 manager.go:324] Recovery completed Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.794392 4739 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.794457 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.794481 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.794495 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.794511 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.794525 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.794537 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.794550 4739 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.794562 4739 reconstruct.go:97] "Volume reconstruction finished" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.794569 4739 reconciler.go:26] "Reconciler: start to sync state" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.800776 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.802492 4739 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.802531 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.802543 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.803256 4739 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.803275 4739 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.803298 4739 state_mem.go:36] "Initialized new in-memory state store" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.818840 4739 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.820397 4739 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.820433 4739 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.820459 4739 kubelet.go:2335] "Starting kubelet main sync loop" Oct 08 21:48:21 crc kubenswrapper[4739]: E1008 21:48:21.820505 4739 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 08 21:48:21 crc kubenswrapper[4739]: W1008 21:48:21.834062 4739 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Oct 08 21:48:21 crc kubenswrapper[4739]: E1008 21:48:21.834167 4739 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.835253 4739 policy_none.go:49] "None policy: Start" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.836699 4739 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 08 21:48:21 crc kubenswrapper[4739]: I1008 21:48:21.836725 4739 state_mem.go:35] "Initializing new in-memory state store" Oct 08 21:48:21 crc kubenswrapper[4739]: E1008 21:48:21.859253 4739 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 08 21:48:21 crc kubenswrapper[4739]: E1008 21:48:21.921462 4739 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Oct 08 21:48:21 crc kubenswrapper[4739]: E1008 
21:48:21.959813 4739 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 08 21:48:21 crc kubenswrapper[4739]: E1008 21:48:21.968514 4739 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" interval="400ms" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.006464 4739 manager.go:334] "Starting Device Plugin manager" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.006654 4739 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.006722 4739 server.go:79] "Starting device plugin registration server" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.007254 4739 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.007338 4739 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.007522 4739 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.007755 4739 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.007771 4739 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 08 21:48:22 crc kubenswrapper[4739]: E1008 21:48:22.016181 4739 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.107884 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 
21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.110073 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.110226 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.110290 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.110375 4739 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 21:48:22 crc kubenswrapper[4739]: E1008 21:48:22.111235 4739 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.5:6443: connect: connection refused" node="crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.122311 4739 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.122508 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.123772 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.123830 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.123845 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:22 crc 
kubenswrapper[4739]: I1008 21:48:22.123978 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.124888 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.124948 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.124960 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.139137 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.139249 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.139312 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.139461 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.139569 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.141022 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.141060 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.141107 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.141128 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.141064 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.141194 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.141132 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.141201 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.141421 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.141212 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:22 crc kubenswrapper[4739]: 
I1008 21:48:22.141542 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.141665 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.142850 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.142867 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.142882 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.142891 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.142899 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.142905 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.143053 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.143261 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.143298 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.144040 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.144074 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.144086 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.144449 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.144633 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.144781 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.145406 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.145684 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.147326 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.147359 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.147372 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.199600 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.199660 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.199697 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.199723 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.199752 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.199779 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.199920 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.199990 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.200030 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.200068 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.200139 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.200219 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.200256 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.200285 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.200325 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.301598 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.301653 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.301677 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.301698 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.301719 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.301745 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.301764 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.301764 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.301808 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.301822 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.301776 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.301785 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.301904 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.301922 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.301927 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.301904 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.301953 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.301958 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.301967 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.302017 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.302037 4739 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.302062 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.302076 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.302100 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.302081 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.302131 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.302181 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.302184 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.302225 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.302338 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.312349 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.313374 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.313402 4739 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.313412 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.313437 4739 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 21:48:22 crc kubenswrapper[4739]: E1008 21:48:22.313734 4739 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.5:6443: connect: connection refused" node="crc" Oct 08 21:48:22 crc kubenswrapper[4739]: E1008 21:48:22.369751 4739 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" interval="800ms" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.493238 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.500375 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.521913 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.539540 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.545356 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 21:48:22 crc kubenswrapper[4739]: W1008 21:48:22.550625 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-456ffb5b0f18224a6cbb92335f7d0ceb9503cd846e2ea952f237506b14eebfde WatchSource:0}: Error finding container 456ffb5b0f18224a6cbb92335f7d0ceb9503cd846e2ea952f237506b14eebfde: Status 404 returned error can't find the container with id 456ffb5b0f18224a6cbb92335f7d0ceb9503cd846e2ea952f237506b14eebfde Oct 08 21:48:22 crc kubenswrapper[4739]: W1008 21:48:22.551452 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-c397089ad3eb1beecd182c4d346507212164b75815eaac003c0f136ba281522c WatchSource:0}: Error finding container c397089ad3eb1beecd182c4d346507212164b75815eaac003c0f136ba281522c: Status 404 returned error can't find the container with id c397089ad3eb1beecd182c4d346507212164b75815eaac003c0f136ba281522c Oct 08 21:48:22 crc kubenswrapper[4739]: W1008 21:48:22.559886 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-4174d1ecc8a3f62d280fbbb10ef4679695bf7058eca6df9188acc6ea1106a5e9 WatchSource:0}: Error finding container 4174d1ecc8a3f62d280fbbb10ef4679695bf7058eca6df9188acc6ea1106a5e9: Status 404 returned error can't find the container with id 4174d1ecc8a3f62d280fbbb10ef4679695bf7058eca6df9188acc6ea1106a5e9 Oct 08 21:48:22 crc kubenswrapper[4739]: W1008 21:48:22.566901 4739 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: 
connection refused Oct 08 21:48:22 crc kubenswrapper[4739]: E1008 21:48:22.567029 4739 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Oct 08 21:48:22 crc kubenswrapper[4739]: W1008 21:48:22.567942 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-e0ae5cb264eb74592ff95164e86f4b44d8662f7727a5012d0f77361420e75025 WatchSource:0}: Error finding container e0ae5cb264eb74592ff95164e86f4b44d8662f7727a5012d0f77361420e75025: Status 404 returned error can't find the container with id e0ae5cb264eb74592ff95164e86f4b44d8662f7727a5012d0f77361420e75025 Oct 08 21:48:22 crc kubenswrapper[4739]: W1008 21:48:22.571741 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-3193efcfec5d92688b39af28acf7cee60c1b441fe7e8141a160bddb5831b480e WatchSource:0}: Error finding container 3193efcfec5d92688b39af28acf7cee60c1b441fe7e8141a160bddb5831b480e: Status 404 returned error can't find the container with id 3193efcfec5d92688b39af28acf7cee60c1b441fe7e8141a160bddb5831b480e Oct 08 21:48:22 crc kubenswrapper[4739]: W1008 21:48:22.574237 4739 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Oct 08 21:48:22 crc kubenswrapper[4739]: E1008 21:48:22.574329 4739 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.713816 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.716535 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.716579 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.716591 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.716618 4739 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 21:48:22 crc kubenswrapper[4739]: E1008 21:48:22.716950 4739 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.5:6443: connect: connection refused" node="crc" Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.750201 4739 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.825188 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4174d1ecc8a3f62d280fbbb10ef4679695bf7058eca6df9188acc6ea1106a5e9"} Oct 08 21:48:22 crc 
kubenswrapper[4739]: I1008 21:48:22.825930 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"456ffb5b0f18224a6cbb92335f7d0ceb9503cd846e2ea952f237506b14eebfde"} Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.826779 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c397089ad3eb1beecd182c4d346507212164b75815eaac003c0f136ba281522c"} Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.827615 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3193efcfec5d92688b39af28acf7cee60c1b441fe7e8141a160bddb5831b480e"} Oct 08 21:48:22 crc kubenswrapper[4739]: I1008 21:48:22.828347 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"e0ae5cb264eb74592ff95164e86f4b44d8662f7727a5012d0f77361420e75025"} Oct 08 21:48:22 crc kubenswrapper[4739]: W1008 21:48:22.838209 4739 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Oct 08 21:48:22 crc kubenswrapper[4739]: E1008 21:48:22.838312 4739 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Oct 08 21:48:23 crc 
kubenswrapper[4739]: W1008 21:48:23.015763 4739 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Oct 08 21:48:23 crc kubenswrapper[4739]: E1008 21:48:23.015889 4739 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Oct 08 21:48:23 crc kubenswrapper[4739]: E1008 21:48:23.171225 4739 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" interval="1.6s" Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.518007 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.519928 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.519965 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.519978 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.520005 4739 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 21:48:23 crc kubenswrapper[4739]: E1008 21:48:23.520271 4739 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.5:6443: connect: connection refused" node="crc" Oct 08 21:48:23 crc kubenswrapper[4739]: E1008 21:48:23.658218 4739 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.5:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186ca2634ba0d8e0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-08 21:48:21.749315808 +0000 UTC m=+1.574701558,LastTimestamp:2025-10-08 21:48:21.749315808 +0000 UTC m=+1.574701558,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.750858 4739 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.832379 4739 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="31424d88f9ebc081daff8ca0f9cbca6a1619d0db761a634eeb973b36cf7abd67" exitCode=0 Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.832464 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"31424d88f9ebc081daff8ca0f9cbca6a1619d0db761a634eeb973b36cf7abd67"} Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.832571 4739 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.834573 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.834625 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.834639 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.836657 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab"} Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.836701 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa"} Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.838825 4739 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922" exitCode=0 Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.838912 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922"} Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.839104 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 
21:48:23.841758 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.841791 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.841811 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.843976 4739 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6" exitCode=0 Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.844071 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6"} Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.844117 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.845102 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.845208 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.845292 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.845706 4739 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="3e0d8d10d62c8724a2940487ae2e6ec8e2260c313b9f2286730b563051af4ba9" exitCode=0 Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 
21:48:23.845718 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.845804 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.846236 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"3e0d8d10d62c8724a2940487ae2e6ec8e2260c313b9f2286730b563051af4ba9"} Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.846970 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.846996 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.847007 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.847133 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.847242 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:23 crc kubenswrapper[4739]: I1008 21:48:23.847270 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:24 crc kubenswrapper[4739]: W1008 21:48:24.217023 4739 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Oct 08 21:48:24 crc 
kubenswrapper[4739]: E1008 21:48:24.217137 4739 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError" Oct 08 21:48:24 crc kubenswrapper[4739]: I1008 21:48:24.750796 4739 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused Oct 08 21:48:24 crc kubenswrapper[4739]: E1008 21:48:24.772133 4739 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.5:6443: connect: connection refused" interval="3.2s" Oct 08 21:48:24 crc kubenswrapper[4739]: I1008 21:48:24.859648 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0"} Oct 08 21:48:24 crc kubenswrapper[4739]: I1008 21:48:24.859916 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9"} Oct 08 21:48:24 crc kubenswrapper[4739]: I1008 21:48:24.859997 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111"} Oct 08 21:48:24 crc 
kubenswrapper[4739]: I1008 21:48:24.860077 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3"}
Oct 08 21:48:24 crc kubenswrapper[4739]: I1008 21:48:24.861700 4739 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281" exitCode=0
Oct 08 21:48:24 crc kubenswrapper[4739]: I1008 21:48:24.861794 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281"}
Oct 08 21:48:24 crc kubenswrapper[4739]: I1008 21:48:24.861869 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 21:48:24 crc kubenswrapper[4739]: I1008 21:48:24.862789 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:48:24 crc kubenswrapper[4739]: I1008 21:48:24.862831 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:48:24 crc kubenswrapper[4739]: I1008 21:48:24.862842 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:48:24 crc kubenswrapper[4739]: I1008 21:48:24.864341 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"89bfa18678c8c402652687bafbeda4ee94a4c581be3aac3286dee2fa1555c3d6"}
Oct 08 21:48:24 crc kubenswrapper[4739]: I1008 21:48:24.864368 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e3a7577ad29caeb9249292640d53bb0a8206e4cc6859c49b973f64aca1ae98ed"}
Oct 08 21:48:24 crc kubenswrapper[4739]: I1008 21:48:24.864383 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"53812375b387a71344e95f90d4f961e9e9f97f4999db1b49ff9dd111c813a69d"}
Oct 08 21:48:24 crc kubenswrapper[4739]: I1008 21:48:24.864523 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 21:48:24 crc kubenswrapper[4739]: I1008 21:48:24.866809 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"a278298e9f47abde7822f1b805c6a3158710983862b6d47258e943c5dc02b6d8"}
Oct 08 21:48:24 crc kubenswrapper[4739]: I1008 21:48:24.866857 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 21:48:24 crc kubenswrapper[4739]: I1008 21:48:24.867598 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:48:24 crc kubenswrapper[4739]: I1008 21:48:24.867766 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:48:24 crc kubenswrapper[4739]: I1008 21:48:24.867782 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:48:24 crc kubenswrapper[4739]: I1008 21:48:24.869345 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:48:24 crc kubenswrapper[4739]: I1008 21:48:24.869373 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:48:24 crc kubenswrapper[4739]: I1008 21:48:24.869384 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:48:24 crc kubenswrapper[4739]: I1008 21:48:24.871266 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17"}
Oct 08 21:48:24 crc kubenswrapper[4739]: I1008 21:48:24.871301 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b"}
Oct 08 21:48:24 crc kubenswrapper[4739]: I1008 21:48:24.871385 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 21:48:24 crc kubenswrapper[4739]: I1008 21:48:24.872769 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:48:24 crc kubenswrapper[4739]: I1008 21:48:24.872819 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:48:24 crc kubenswrapper[4739]: I1008 21:48:24.872840 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:48:24 crc kubenswrapper[4739]: W1008 21:48:24.953508 4739 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused
Oct 08 21:48:24 crc kubenswrapper[4739]: E1008 21:48:24.953605 4739 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError"
Oct 08 21:48:25 crc kubenswrapper[4739]: I1008 21:48:25.121138 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 21:48:25 crc kubenswrapper[4739]: I1008 21:48:25.122685 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:48:25 crc kubenswrapper[4739]: I1008 21:48:25.122731 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:48:25 crc kubenswrapper[4739]: I1008 21:48:25.122741 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:48:25 crc kubenswrapper[4739]: I1008 21:48:25.122772 4739 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 08 21:48:25 crc kubenswrapper[4739]: E1008 21:48:25.123317 4739 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.5:6443: connect: connection refused" node="crc"
Oct 08 21:48:25 crc kubenswrapper[4739]: W1008 21:48:25.355802 4739 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.5:6443: connect: connection refused
Oct 08 21:48:25 crc kubenswrapper[4739]: E1008 21:48:25.355899 4739 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.5:6443: connect: connection refused" logger="UnhandledError"
Oct 08 21:48:25 crc kubenswrapper[4739]: I1008 21:48:25.878353 4739 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60" exitCode=0
Oct 08 21:48:25 crc kubenswrapper[4739]: I1008 21:48:25.878472 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60"}
Oct 08 21:48:25 crc kubenswrapper[4739]: I1008 21:48:25.878542 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 21:48:25 crc kubenswrapper[4739]: I1008 21:48:25.880552 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:48:25 crc kubenswrapper[4739]: I1008 21:48:25.880826 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:48:25 crc kubenswrapper[4739]: I1008 21:48:25.880938 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:48:25 crc kubenswrapper[4739]: I1008 21:48:25.885492 4739 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 08 21:48:25 crc kubenswrapper[4739]: I1008 21:48:25.885540 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 21:48:25 crc kubenswrapper[4739]: I1008 21:48:25.885559 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 21:48:25 crc kubenswrapper[4739]: I1008 21:48:25.885577 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 21:48:25 crc kubenswrapper[4739]: I1008 21:48:25.885542 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 21:48:25 crc kubenswrapper[4739]: I1008 21:48:25.885560 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b"}
Oct 08 21:48:25 crc kubenswrapper[4739]: I1008 21:48:25.887407 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:48:25 crc kubenswrapper[4739]: I1008 21:48:25.887450 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:48:25 crc kubenswrapper[4739]: I1008 21:48:25.887468 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:48:25 crc kubenswrapper[4739]: I1008 21:48:25.887578 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:48:25 crc kubenswrapper[4739]: I1008 21:48:25.887641 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:48:25 crc kubenswrapper[4739]: I1008 21:48:25.887666 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:48:25 crc kubenswrapper[4739]: I1008 21:48:25.887886 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:48:25 crc kubenswrapper[4739]: I1008 21:48:25.887989 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:48:25 crc kubenswrapper[4739]: I1008 21:48:25.888049 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:48:25 crc kubenswrapper[4739]: I1008 21:48:25.888486 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:48:25 crc kubenswrapper[4739]: I1008 21:48:25.888533 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:48:25 crc kubenswrapper[4739]: I1008 21:48:25.888550 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:48:26 crc kubenswrapper[4739]: I1008 21:48:26.623370 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 08 21:48:26 crc kubenswrapper[4739]: I1008 21:48:26.892126 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed"}
Oct 08 21:48:26 crc kubenswrapper[4739]: I1008 21:48:26.892220 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 21:48:26 crc kubenswrapper[4739]: I1008 21:48:26.892255 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c"}
Oct 08 21:48:26 crc kubenswrapper[4739]: I1008 21:48:26.892275 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051"}
Oct 08 21:48:26 crc kubenswrapper[4739]: I1008 21:48:26.892300 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9"}
Oct 08 21:48:26 crc kubenswrapper[4739]: I1008 21:48:26.892314 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad"}
Oct 08 21:48:26 crc kubenswrapper[4739]: I1008 21:48:26.892236 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 21:48:26 crc kubenswrapper[4739]: I1008 21:48:26.892337 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 08 21:48:26 crc kubenswrapper[4739]: I1008 21:48:26.892220 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 21:48:26 crc kubenswrapper[4739]: I1008 21:48:26.893313 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:48:26 crc kubenswrapper[4739]: I1008 21:48:26.893344 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:48:26 crc kubenswrapper[4739]: I1008 21:48:26.893353 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:48:26 crc kubenswrapper[4739]: I1008 21:48:26.893429 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:48:26 crc kubenswrapper[4739]: I1008 21:48:26.893461 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:48:26 crc kubenswrapper[4739]: I1008 21:48:26.893469 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:48:26 crc kubenswrapper[4739]: I1008 21:48:26.893473 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:48:26 crc kubenswrapper[4739]: I1008 21:48:26.893498 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:48:26 crc kubenswrapper[4739]: I1008 21:48:26.893597 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:48:27 crc kubenswrapper[4739]: I1008 21:48:27.149034 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 08 21:48:27 crc kubenswrapper[4739]: I1008 21:48:27.149409 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 21:48:27 crc kubenswrapper[4739]: I1008 21:48:27.150845 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:48:27 crc kubenswrapper[4739]: I1008 21:48:27.150893 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:48:27 crc kubenswrapper[4739]: I1008 21:48:27.150907 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:48:27 crc kubenswrapper[4739]: I1008 21:48:27.161627 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 08 21:48:27 crc kubenswrapper[4739]: I1008 21:48:27.895496 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 21:48:27 crc kubenswrapper[4739]: I1008 21:48:27.895588 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 21:48:27 crc kubenswrapper[4739]: I1008 21:48:27.895588 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 21:48:27 crc kubenswrapper[4739]: I1008 21:48:27.897249 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:48:27 crc kubenswrapper[4739]: I1008 21:48:27.897316 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:48:27 crc kubenswrapper[4739]: I1008 21:48:27.897260 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:48:27 crc kubenswrapper[4739]: I1008 21:48:27.897272 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:48:27 crc kubenswrapper[4739]: I1008 21:48:27.897363 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:48:27 crc kubenswrapper[4739]: I1008 21:48:27.897368 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:48:27 crc kubenswrapper[4739]: I1008 21:48:27.897382 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:48:27 crc kubenswrapper[4739]: I1008 21:48:27.897385 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:48:27 crc kubenswrapper[4739]: I1008 21:48:27.897337 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:48:28 crc kubenswrapper[4739]: I1008 21:48:28.324022 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 21:48:28 crc kubenswrapper[4739]: I1008 21:48:28.326075 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:48:28 crc kubenswrapper[4739]: I1008 21:48:28.326134 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:48:28 crc kubenswrapper[4739]: I1008 21:48:28.326175 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:48:28 crc kubenswrapper[4739]: I1008 21:48:28.326230 4739 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 08 21:48:28 crc kubenswrapper[4739]: I1008 21:48:28.651433 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 08 21:48:28 crc kubenswrapper[4739]: I1008 21:48:28.816979 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 08 21:48:28 crc kubenswrapper[4739]: I1008 21:48:28.898231 4739 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 08 21:48:28 crc kubenswrapper[4739]: I1008 21:48:28.898288 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 21:48:28 crc kubenswrapper[4739]: I1008 21:48:28.898564 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 21:48:28 crc kubenswrapper[4739]: I1008 21:48:28.899266 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:48:28 crc kubenswrapper[4739]: I1008 21:48:28.899319 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:48:28 crc kubenswrapper[4739]: I1008 21:48:28.899337 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:48:28 crc kubenswrapper[4739]: I1008 21:48:28.899994 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:48:28 crc kubenswrapper[4739]: I1008 21:48:28.900030 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:48:28 crc kubenswrapper[4739]: I1008 21:48:28.900040 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:48:29 crc kubenswrapper[4739]: I1008 21:48:29.845598 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 08 21:48:29 crc kubenswrapper[4739]: I1008 21:48:29.900514 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 21:48:29 crc kubenswrapper[4739]: I1008 21:48:29.901706 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:48:29 crc kubenswrapper[4739]: I1008 21:48:29.901754 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:48:29 crc kubenswrapper[4739]: I1008 21:48:29.901768 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:48:29 crc kubenswrapper[4739]: I1008 21:48:29.969004 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Oct 08 21:48:29 crc kubenswrapper[4739]: I1008 21:48:29.969294 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 21:48:29 crc kubenswrapper[4739]: I1008 21:48:29.970801 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:48:29 crc kubenswrapper[4739]: I1008 21:48:29.970843 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:48:29 crc kubenswrapper[4739]: I1008 21:48:29.970853 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:48:30 crc kubenswrapper[4739]: I1008 21:48:30.624658 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 08 21:48:30 crc kubenswrapper[4739]: I1008 21:48:30.624972 4739 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 08 21:48:30 crc kubenswrapper[4739]: I1008 21:48:30.625051 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 21:48:30 crc kubenswrapper[4739]: I1008 21:48:30.626857 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:48:30 crc kubenswrapper[4739]: I1008 21:48:30.626982 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:48:30 crc kubenswrapper[4739]: I1008 21:48:30.627015 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:48:31 crc kubenswrapper[4739]: I1008 21:48:31.055913 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Oct 08 21:48:31 crc kubenswrapper[4739]: I1008 21:48:31.056126 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 21:48:31 crc kubenswrapper[4739]: I1008 21:48:31.057526 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:48:31 crc kubenswrapper[4739]: I1008 21:48:31.057579 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:48:31 crc kubenswrapper[4739]: I1008 21:48:31.057591 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:48:31 crc kubenswrapper[4739]: I1008 21:48:31.674025 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 08 21:48:31 crc kubenswrapper[4739]: I1008 21:48:31.674274 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 21:48:31 crc kubenswrapper[4739]: I1008 21:48:31.675413 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:48:31 crc kubenswrapper[4739]: I1008 21:48:31.675446 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:48:31 crc kubenswrapper[4739]: I1008 21:48:31.675456 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:48:32 crc kubenswrapper[4739]: E1008 21:48:32.016935 4739 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Oct 08 21:48:33 crc kubenswrapper[4739]: I1008 21:48:33.625236 4739 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Oct 08 21:48:33 crc kubenswrapper[4739]: I1008 21:48:33.625319 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Oct 08 21:48:35 crc kubenswrapper[4739]: I1008 21:48:35.360270 4739 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Oct 08 21:48:35 crc kubenswrapper[4739]: I1008 21:48:35.360344 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Oct 08 21:48:35 crc kubenswrapper[4739]: I1008 21:48:35.369159 4739 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Oct 08 21:48:35 crc kubenswrapper[4739]: I1008 21:48:35.369239 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Oct 08 21:48:38 crc kubenswrapper[4739]: I1008 21:48:38.661278 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 08 21:48:38 crc kubenswrapper[4739]: I1008 21:48:38.661482 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 21:48:38 crc kubenswrapper[4739]: I1008 21:48:38.662639 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:48:38 crc kubenswrapper[4739]: I1008 21:48:38.662717 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:48:38 crc kubenswrapper[4739]: I1008 21:48:38.662739 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:48:38 crc kubenswrapper[4739]: I1008 21:48:38.666799 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 08 21:48:38 crc kubenswrapper[4739]: I1008 21:48:38.930858 4739 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 08 21:48:38 crc kubenswrapper[4739]: I1008 21:48:38.930937 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 08 21:48:38 crc kubenswrapper[4739]: I1008 21:48:38.932410 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:48:38 crc kubenswrapper[4739]: I1008 21:48:38.932486 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:48:38 crc kubenswrapper[4739]: I1008 21:48:38.932507 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:48:40 crc kubenswrapper[4739]: E1008 21:48:40.364329 4739 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.366447 4739 trace.go:236] Trace[549032021]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (08-Oct-2025 21:48:28.575) (total time: 11791ms):
Oct 08 21:48:40 crc kubenswrapper[4739]: Trace[549032021]: ---"Objects listed" error: 11791ms (21:48:40.366)
Oct 08 21:48:40 crc kubenswrapper[4739]: Trace[549032021]: [11.791298472s] [11.791298472s] END
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.366498 4739 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.367516 4739 trace.go:236] Trace[2117186042]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (08-Oct-2025 21:48:26.119) (total time: 14247ms):
Oct 08 21:48:40 crc kubenswrapper[4739]: Trace[2117186042]: ---"Objects listed" error: 14247ms (21:48:40.367)
Oct 08 21:48:40 crc kubenswrapper[4739]: Trace[2117186042]: [14.247514682s] [14.247514682s] END
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.367544 4739 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.368272 4739 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.368284 4739 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.368438 4739 trace.go:236] Trace[2106617619]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (08-Oct-2025 21:48:28.558) (total time: 11810ms):
Oct 08 21:48:40 crc kubenswrapper[4739]: Trace[2106617619]: ---"Objects listed" error: 11809ms (21:48:40.368)
Oct 08 21:48:40 crc kubenswrapper[4739]: Trace[2106617619]: [11.810004728s] [11.810004728s] END
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.368455 4739 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Oct 08 21:48:40 crc kubenswrapper[4739]: E1008 21:48:40.370448 4739 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.403297 4739 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:37954->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.403358 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:37954->192.168.126.11:17697: read: connection reset by peer"
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.404164 4739 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.404240 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.745206 4739 apiserver.go:52] "Watching apiserver"
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.772195 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.776550 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.778733 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.788251 4739 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.788665 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"]
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.789605 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.789693 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.789801 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 08 21:48:40 crc kubenswrapper[4739]: E1008 21:48:40.790042 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 08 21:48:40 crc kubenswrapper[4739]: E1008 21:48:40.790123 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.790131 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.790469 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.790477 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 08 21:48:40 crc kubenswrapper[4739]: E1008 21:48:40.790633 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.792593 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.793032 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.793971 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.794019 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.793974 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.794128 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.794171 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.794263 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.794383 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.800458 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"]
Oct 08 21:48:40
crc kubenswrapper[4739]: I1008 21:48:40.822402 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.837290 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.846363 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.854233 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.859571 4739 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.870731 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.870765 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.870787 4739 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.870807 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.870827 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.870847 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.870869 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.870889 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" 
(UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.870913 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.870930 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.870944 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.870964 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.871034 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.871051 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.871066 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.871082 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.871077 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.871113 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.871131 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.871124 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.871168 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.871248 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.871283 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.871310 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.871340 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.871361 4739 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.871384 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.871406 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.871427 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.871448 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.871468 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.871489 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.871509 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.871530 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.871550 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.871571 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.871597 4739 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.871618 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.871645 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.872035 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.872172 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.872675 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.872710 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.872747 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.872774 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.872767 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.872795 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.872824 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.872851 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.872870 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.872994 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.873403 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.873572 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.873617 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.873942 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.874058 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.874097 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.874174 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.874352 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.874419 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.874465 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.874696 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.874713 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.874837 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.874998 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.875213 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.875240 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.875351 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.873224 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.875411 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.875435 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.875458 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.875488 4739 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.875522 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.875517 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.875549 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.875578 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.875608 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.875630 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.875648 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.875663 4739 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.875680 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.875698 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.875714 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.875733 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.875752 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.875860 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.875910 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.875934 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.875978 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.876367 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.876777 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.876802 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.876822 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.876842 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.876861 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.875546 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: 
"49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.875732 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.875808 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.875801 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.875915 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.876388 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.876414 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.876433 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.876440 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.876581 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.876765 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.877076 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.877494 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.877490 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.877512 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.877715 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.877819 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.877992 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.878074 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.877702 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.876782 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.878729 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.878767 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.878792 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.878813 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.878836 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.878908 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.878955 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.879063 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.879091 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.879200 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.879253 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.879273 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.879280 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.879321 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.879342 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.879346 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.879362 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.879372 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.879382 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.879520 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.879548 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.879569 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.879587 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.879625 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.879877 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.879899 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.879916 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.879932 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.879948 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: 
\"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.879966 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.879983 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.880000 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.880019 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.880036 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.880054 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.880092 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.880158 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.880227 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.880570 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.880589 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.880690 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.880867 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.880966 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.880988 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.881118 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.881276 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.881347 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.881387 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.880073 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.881435 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.881453 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.881470 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " 
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.881499 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.881597 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.881631 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.881662 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.881678 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.881701 4739 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.881718 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.881735 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.881858 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.881951 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.881968 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.882112 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.882192 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.882208 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.882225 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 
21:48:40.882240 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.882277 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.882295 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.882311 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.882326 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.882346 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: 
\"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.882367 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.882381 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.882409 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.882427 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.882447 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 08 21:48:40 crc 
kubenswrapper[4739]: I1008 21:48:40.882444 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.882464 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.882607 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.882663 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.882703 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.882714 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.882765 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.882823 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.882872 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.882921 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.882968 4739 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.883020 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.883069 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.883122 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.883227 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.883278 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.883333 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.883523 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.883622 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.883821 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.883888 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.883938 4739 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.883986 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.884040 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.884092 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.884190 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.884242 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" 
(UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.884288 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.884340 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.884391 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.884483 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.884538 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.884588 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.884637 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.884688 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.884739 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.884794 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.884847 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.884896 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.884948 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.884998 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.885047 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.885093 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.885189 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.885242 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.885302 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.885357 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.885405 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.885459 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 21:48:40 crc kubenswrapper[4739]: 
I1008 21:48:40.885514 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.885563 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.885611 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.885664 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.885716 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.885764 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.885817 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.885870 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.885922 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.885972 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.886024 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") 
" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.886076 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.886178 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.886238 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.882715 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.882961 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.883214 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.883425 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.883508 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.886723 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.883553 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.883568 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.883803 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.883840 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.883861 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.883874 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.883894 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.883905 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.883936 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.887125 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.883945 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.884023 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.887513 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.884243 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.884344 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.884577 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.884578 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.884952 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.884981 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.885117 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.885127 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.885208 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.888262 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.885251 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.885267 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.885314 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.885563 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.885821 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.886083 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: E1008 21:48:40.886256 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:48:41.386236175 +0000 UTC m=+21.211622025 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.888363 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.888364 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: 
"1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.888396 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.888418 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.888501 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.888522 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.888530 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.888540 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.888557 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.888576 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.888594 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.888613 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.888630 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.888649 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.888650 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.888691 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.888718 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.888739 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.888757 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.888753 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.888780 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.888791 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.888796 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.886417 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.886417 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.886589 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.886842 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.886875 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.886888 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.886897 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.887055 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.888920 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.887238 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.887272 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.887452 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.887381 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.887937 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.887978 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.889002 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.886061 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.889124 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.889193 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.888801 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.889332 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.889360 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.889374 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.889383 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.889454 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.889382 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.889525 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.889910 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 21:48:40 crc kubenswrapper[4739]: E1008 21:48:40.889475 4739 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.890550 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.889484 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.889498 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.889525 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.889597 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.889636 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.889848 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: E1008 21:48:40.890639 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 21:48:41.390618074 +0000 UTC m=+21.216003824 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.890403 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.890415 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.890588 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: E1008 21:48:40.890660 4739 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 21:48:40 crc kubenswrapper[4739]: E1008 21:48:40.890842 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 21:48:41.390833779 +0000 UTC m=+21.216219529 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.890970 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.891068 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.891198 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.891218 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.891278 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.891288 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.891393 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.891479 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.891529 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.891774 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.891791 4739 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.891800 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.891810 4739 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.891821 4739 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.891831 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.891840 4739 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.891850 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.891859 4739 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.891868 4739 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.891904 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.892639 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.892752 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.892824 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.892932 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.892952 4739 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.893175 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.893910 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.893996 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.893988 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.894028 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.894049 4739 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.894048 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.894064 4739 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895528 4739 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895541 4739 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895552 4739 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895562 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895574 4739 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895584 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895605 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895615 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895624 4739 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895634 4739 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895644 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895653 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895663 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895684 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895694 4739 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895702 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895711 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895720 4739 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895445 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895729 4739 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895773 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895786 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895799 4739 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895808 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895818 4739 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895648 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895662 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895771 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895830 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895865 4739 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895874 4739 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895883 4739 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895891 4739 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895899 4739 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895908 4739 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895919 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895928 4739 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895936 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895945 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895953 4739 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895962 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895971 4739 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895979 4739 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895989 4739 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.895998 4739 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896008 4739 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896016 4739 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896043 4739 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896052 4739 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896061 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896071 4739 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896079 4739 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896087 4739 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896096 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896106 4739 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896114 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896122 4739 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896130 4739 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896153 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896179 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896188 4739 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896196 4739 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896205 4739 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896213 4739 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896221 4739 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896230 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896239 4739 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896249 4739 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896257 4739 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896266 4739 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896274 4739 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896282 4739 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896291 4739 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896299 4739 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896307 4739 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896316 4739 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896325 4739 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896335 4739 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896344 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896354 4739 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896362 4739 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896370 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896378 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896386 4739 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896395 4739 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896406 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896415 4739 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896423 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896431 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896440 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896450 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896479 4739 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896488 4739 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896496 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896504 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896512 4739 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896520 4739 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896528 4739 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896537 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896546 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896555 4739 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896564 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896573 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896580 4739 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896588 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896596 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896604 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896613 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName:
\"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896621 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896629 4739 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896638 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896646 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896654 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896662 4739 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896669 4739 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896679 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896687 4739 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896695 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896703 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896711 4739 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896719 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896726 4739 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node 
\"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896737 4739 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896752 4739 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896760 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896767 4739 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896775 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896785 4739 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896793 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896801 4739 
reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896810 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896819 4739 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896826 4739 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896834 4739 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896842 4739 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.896850 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.901738 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.901829 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.901915 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.901947 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.902554 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.902889 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.903370 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.903389 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.903453 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.903502 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.903709 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.903993 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.904015 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.904132 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.905357 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.905614 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.905787 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.905852 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.905899 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.905976 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.906053 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.906051 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.906085 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.906087 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.906378 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.906486 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.906613 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:48:40 crc kubenswrapper[4739]: E1008 21:48:40.907245 4739 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 21:48:40 crc kubenswrapper[4739]: E1008 21:48:40.907283 4739 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 21:48:40 crc kubenswrapper[4739]: E1008 21:48:40.907301 4739 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.907385 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: 
\"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 21:48:40 crc kubenswrapper[4739]: E1008 21:48:40.907391 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 21:48:41.407365826 +0000 UTC m=+21.232751596 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 21:48:40 crc kubenswrapper[4739]: E1008 21:48:40.909946 4739 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 21:48:40 crc kubenswrapper[4739]: E1008 21:48:40.909969 4739 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 21:48:40 crc kubenswrapper[4739]: E1008 21:48:40.909983 4739 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 21:48:40 crc kubenswrapper[4739]: E1008 21:48:40.910042 4739 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 21:48:41.410023482 +0000 UTC m=+21.235409232 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.914942 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.916136 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 21:48:40 crc kubenswrapper[4739]: E1008 21:48:40.940681 4739 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.998304 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.998390 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.998456 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.998521 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.998545 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.998566 4739 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.998584 4739 reconciler_common.go:293] "Volume detached for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.998601 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.998618 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.998517 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.998640 4739 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.998657 4739 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.998675 4739 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.998692 4739 reconciler_common.go:293] 
"Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.998709 4739 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.998726 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.998745 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.998762 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.998779 4739 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.998796 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.998813 4739 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.998830 4739 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.998846 4739 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.998863 4739 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.998879 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.998895 4739 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.998912 4739 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.998928 4739 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc 
kubenswrapper[4739]: I1008 21:48:40.998945 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.998961 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:40 crc kubenswrapper[4739]: I1008 21:48:40.998979 4739 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:40.998995 4739 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:40.999012 4739 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:40.999028 4739 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:40.999044 4739 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:40.999061 4739 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:40.999080 4739 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:40.999096 4739 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:40.999113 4739 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:40.999129 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:40.999175 4739 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:40.999192 4739 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:40.999208 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: 
\"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:40.999224 4739 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:40.999240 4739 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:40.999258 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:40.999275 4739 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:40.999295 4739 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:40.999312 4739 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:40.999329 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath 
\"\"" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:40.999345 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.008777 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.010513 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.015434 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.030550 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.042877 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.054867 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.056525 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.059478 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.100365 4739 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.100704 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.110503 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.120457 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 21:48:41 crc kubenswrapper[4739]: W1008 21:48:41.123129 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-2eccaff5876586a91d9a2390c38c89e4c5297293aea9485863e9fdd5ddc2c550 WatchSource:0}: Error finding container 2eccaff5876586a91d9a2390c38c89e4c5297293aea9485863e9fdd5ddc2c550: Status 404 returned error can't find the container with id 2eccaff5876586a91d9a2390c38c89e4c5297293aea9485863e9fdd5ddc2c550 Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.130062 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 21:48:41 crc kubenswrapper[4739]: W1008 21:48:41.130625 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-ce70afa55d95e6c5719e43c6f8ada173bda8711e2bf5984d6d86c00bef610617 WatchSource:0}: Error finding container ce70afa55d95e6c5719e43c6f8ada173bda8711e2bf5984d6d86c00bef610617: Status 404 returned error can't find the container with id ce70afa55d95e6c5719e43c6f8ada173bda8711e2bf5984d6d86c00bef610617 Oct 08 21:48:41 crc kubenswrapper[4739]: W1008 21:48:41.189655 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-6a45a9141c1fb1019c82e1ebd0b84b92beb7a90f3cdf89ea7209b50ab8df4835 WatchSource:0}: Error finding container 6a45a9141c1fb1019c82e1ebd0b84b92beb7a90f3cdf89ea7209b50ab8df4835: Status 404 returned error can't find the container with id 6a45a9141c1fb1019c82e1ebd0b84b92beb7a90f3cdf89ea7209b50ab8df4835 Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.290219 4739 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.317088 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.319164 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.319438 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.341691 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.368279 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.387702 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.399454 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.403340 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.403393 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.403419 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:48:41 crc kubenswrapper[4739]: E1008 21:48:41.403483 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:48:42.403464937 +0000 UTC m=+22.228850687 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:48:41 crc kubenswrapper[4739]: E1008 21:48:41.403484 4739 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 21:48:41 crc kubenswrapper[4739]: E1008 21:48:41.403541 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 21:48:42.403532419 +0000 UTC m=+22.228918169 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 21:48:41 crc kubenswrapper[4739]: E1008 21:48:41.403602 4739 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 21:48:41 crc kubenswrapper[4739]: E1008 21:48:41.403735 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-08 21:48:42.403682022 +0000 UTC m=+22.229067822 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.410534 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.423746 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.445454 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\
\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.454926 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.464463 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.473797 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.483095 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.491019 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.500317 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.503983 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.504039 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:48:41 crc kubenswrapper[4739]: E1008 21:48:41.504174 4739 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 21:48:41 crc kubenswrapper[4739]: E1008 21:48:41.504193 4739 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 21:48:41 crc kubenswrapper[4739]: E1008 21:48:41.504204 4739 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 21:48:41 crc kubenswrapper[4739]: E1008 21:48:41.504229 4739 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 21:48:41 crc kubenswrapper[4739]: E1008 21:48:41.504253 4739 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 21:48:41 crc kubenswrapper[4739]: E1008 21:48:41.504268 4739 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 21:48:41 crc kubenswrapper[4739]: E1008 
21:48:41.504253 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 21:48:42.504240454 +0000 UTC m=+22.329626204 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 21:48:41 crc kubenswrapper[4739]: E1008 21:48:41.504339 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 21:48:42.504319386 +0000 UTC m=+22.329705146 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.513290 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.810713 4739 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.810766 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.825386 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.825860 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.826624 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.827386 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.828068 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" 
path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.828538 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.829219 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.829733 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.830310 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.830806 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.831280 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.831879 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.832452 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.832972 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.833518 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.833595 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.834122 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.834761 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.836001 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.837483 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.838434 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.839116 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.839887 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.840505 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.841529 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.842105 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.842951 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.843869 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.843983 4739 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.844539 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.845351 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.845984 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.848500 4739 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" 
path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.848823 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.853346 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.854745 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.857395 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.860937 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.862029 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.863402 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.863750 4739 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef8
35e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.864878 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.866784 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.867808 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.869089 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.870627 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.872175 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.872911 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.873746 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.875048 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.875422 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.876281 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.877708 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.878369 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.879580 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.880342 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.881214 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.882487 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.885966 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the 
pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.906927 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.922304 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.932964 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.940482 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"ce70afa55d95e6c5719e43c6f8ada173bda8711e2bf5984d6d86c00bef610617"} Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.941759 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2eccaff5876586a91d9a2390c38c89e4c5297293aea9485863e9fdd5ddc2c550"} Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.944684 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.947710 4739 
generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b" exitCode=255 Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.947777 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b"} Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.950257 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"6a45a9141c1fb1019c82e1ebd0b84b92beb7a90f3cdf89ea7209b50ab8df4835"} Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.962056 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.962848 4739 scope.go:117] "RemoveContainer" containerID="6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.969320 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.982598 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c
026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\
\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:41 crc kubenswrapper[4739]: I1008 21:48:41.993359 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:42 crc kubenswrapper[4739]: I1008 21:48:42.006567 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:42 crc kubenswrapper[4739]: I1008 21:48:42.020468 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:42 crc kubenswrapper[4739]: I1008 21:48:42.032805 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:42 crc kubenswrapper[4739]: I1008 21:48:42.045701 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 08 21:48:42 crc kubenswrapper[4739]: I1008 21:48:42.054505 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:42 crc kubenswrapper[4739]: I1008 21:48:42.411559 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:48:42 crc kubenswrapper[4739]: E1008 21:48:42.411824 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:48:44.411790046 +0000 UTC m=+24.237175806 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:48:42 crc kubenswrapper[4739]: I1008 21:48:42.411913 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:48:42 crc kubenswrapper[4739]: I1008 21:48:42.411947 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:48:42 crc kubenswrapper[4739]: E1008 21:48:42.412034 4739 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 21:48:42 crc kubenswrapper[4739]: E1008 21:48:42.412050 4739 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 21:48:42 crc kubenswrapper[4739]: E1008 21:48:42.412085 4739 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 21:48:44.412071263 +0000 UTC m=+24.237457013 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 21:48:42 crc kubenswrapper[4739]: E1008 21:48:42.412116 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 21:48:44.412102193 +0000 UTC m=+24.237488033 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 21:48:42 crc kubenswrapper[4739]: I1008 21:48:42.512335 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:48:42 crc kubenswrapper[4739]: I1008 21:48:42.512428 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:48:42 crc kubenswrapper[4739]: E1008 21:48:42.512552 4739 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 21:48:42 crc kubenswrapper[4739]: E1008 21:48:42.512572 4739 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 21:48:42 crc kubenswrapper[4739]: E1008 21:48:42.512586 4739 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 21:48:42 crc kubenswrapper[4739]: E1008 21:48:42.512634 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 21:48:44.512619354 +0000 UTC m=+24.338005124 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 21:48:42 crc kubenswrapper[4739]: E1008 21:48:42.512932 4739 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 21:48:42 crc kubenswrapper[4739]: E1008 21:48:42.512965 4739 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 21:48:42 crc kubenswrapper[4739]: E1008 21:48:42.512978 4739 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 21:48:42 crc kubenswrapper[4739]: E1008 21:48:42.513040 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-10-08 21:48:44.513021213 +0000 UTC m=+24.338406963 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 21:48:42 crc kubenswrapper[4739]: I1008 21:48:42.821060 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:48:42 crc kubenswrapper[4739]: I1008 21:48:42.821108 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:48:42 crc kubenswrapper[4739]: E1008 21:48:42.821196 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:48:42 crc kubenswrapper[4739]: E1008 21:48:42.821368 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:48:42 crc kubenswrapper[4739]: I1008 21:48:42.821508 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:48:42 crc kubenswrapper[4739]: E1008 21:48:42.821612 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:48:42 crc kubenswrapper[4739]: I1008 21:48:42.955294 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 08 21:48:42 crc kubenswrapper[4739]: I1008 21:48:42.958565 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879"} Oct 08 21:48:42 crc kubenswrapper[4739]: I1008 21:48:42.959210 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 21:48:42 crc kubenswrapper[4739]: I1008 21:48:42.960063 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a"} Oct 08 21:48:42 crc kubenswrapper[4739]: I1008 21:48:42.962934 4739 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8"} Oct 08 21:48:42 crc kubenswrapper[4739]: I1008 21:48:42.962958 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c"} Oct 08 21:48:42 crc kubenswrapper[4739]: I1008 21:48:42.982428 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:42 crc kubenswrapper[4739]: I1008 21:48:42.999573 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:43 crc kubenswrapper[4739]: I1008 21:48:43.014370 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:43 crc kubenswrapper[4739]: I1008 21:48:43.047723 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:43 crc kubenswrapper[4739]: I1008 21:48:43.061085 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c
026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\
\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:43 crc kubenswrapper[4739]: I1008 21:48:43.075278 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:43 crc kubenswrapper[4739]: I1008 21:48:43.088201 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:43 crc kubenswrapper[4739]: I1008 21:48:43.105686 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:43 crc kubenswrapper[4739]: I1008 21:48:43.127105 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 
21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:43 crc kubenswrapper[4739]: I1008 21:48:43.147772 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:43 crc kubenswrapper[4739]: I1008 21:48:43.166318 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:43 crc kubenswrapper[4739]: I1008 21:48:43.181483 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:43 crc kubenswrapper[4739]: I1008 21:48:43.197887 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:43 crc kubenswrapper[4739]: I1008 21:48:43.227551 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:43 crc kubenswrapper[4739]: I1008 21:48:43.249768 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c
026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\
\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:43 crc kubenswrapper[4739]: I1008 21:48:43.261673 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Oct 08 21:48:43 crc kubenswrapper[4739]: I1008 21:48:43.277607 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:43 crc kubenswrapper[4739]: I1008 21:48:43.291713 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 21:48:44 crc kubenswrapper[4739]: I1008 21:48:44.430040 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:48:44 crc kubenswrapper[4739]: I1008 21:48:44.430215 4739 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:48:44 crc kubenswrapper[4739]: E1008 21:48:44.430246 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:48:48.430215188 +0000 UTC m=+28.255600948 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:48:44 crc kubenswrapper[4739]: I1008 21:48:44.430365 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:48:44 crc kubenswrapper[4739]: E1008 21:48:44.430373 4739 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 21:48:44 crc kubenswrapper[4739]: E1008 21:48:44.430447 4739 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 21:48:48.430427633 +0000 UTC m=+28.255813403 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 21:48:44 crc kubenswrapper[4739]: E1008 21:48:44.430459 4739 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 21:48:44 crc kubenswrapper[4739]: E1008 21:48:44.430543 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 21:48:48.430533566 +0000 UTC m=+28.255919326 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 21:48:44 crc kubenswrapper[4739]: I1008 21:48:44.530924 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:48:44 crc kubenswrapper[4739]: I1008 21:48:44.531011 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:48:44 crc kubenswrapper[4739]: E1008 21:48:44.531122 4739 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 21:48:44 crc kubenswrapper[4739]: E1008 21:48:44.531170 4739 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 21:48:44 crc kubenswrapper[4739]: E1008 21:48:44.531142 4739 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 21:48:44 crc kubenswrapper[4739]: E1008 21:48:44.531220 4739 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 21:48:44 crc kubenswrapper[4739]: E1008 21:48:44.531247 4739 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 21:48:44 crc kubenswrapper[4739]: E1008 21:48:44.531184 4739 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 21:48:44 crc kubenswrapper[4739]: E1008 21:48:44.531340 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 21:48:48.531312602 +0000 UTC m=+28.356698392 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 21:48:44 crc kubenswrapper[4739]: E1008 21:48:44.531375 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-10-08 21:48:48.531356853 +0000 UTC m=+28.356742643 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 21:48:44 crc kubenswrapper[4739]: I1008 21:48:44.820715 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:48:44 crc kubenswrapper[4739]: I1008 21:48:44.820760 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:48:44 crc kubenswrapper[4739]: I1008 21:48:44.820764 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:48:44 crc kubenswrapper[4739]: E1008 21:48:44.820916 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:48:44 crc kubenswrapper[4739]: E1008 21:48:44.821090 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:48:44 crc kubenswrapper[4739]: E1008 21:48:44.821256 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:48:44 crc kubenswrapper[4739]: I1008 21:48:44.970255 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da"} Oct 08 21:48:44 crc kubenswrapper[4739]: I1008 21:48:44.999277 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:44Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:45 crc kubenswrapper[4739]: I1008 21:48:45.015665 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:
23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:45Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:45 crc kubenswrapper[4739]: I1008 21:48:45.032550 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:45Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:45 crc kubenswrapper[4739]: I1008 21:48:45.054482 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:45Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:45 crc kubenswrapper[4739]: I1008 21:48:45.073132 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:45Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:45 crc kubenswrapper[4739]: I1008 21:48:45.091383 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T21:48:45Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:45 crc kubenswrapper[4739]: I1008 21:48:45.126407 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:45Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:45 crc kubenswrapper[4739]: I1008 21:48:45.158452 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:45Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:45 crc kubenswrapper[4739]: I1008 21:48:45.174640 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:45Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.770966 4739 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.773343 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.773396 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.773416 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.773514 4739 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.783813 4739 kubelet_node_status.go:115] "Node was 
previously registered" node="crc" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.784190 4739 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.785626 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.785672 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.785685 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.785704 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.785790 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:46Z","lastTransitionTime":"2025-10-08T21:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:46 crc kubenswrapper[4739]: E1008 21:48:46.820520 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06df6f1f-7503-4cee-a52c-383dcfb4609d\\\",\\\"systemUUID\\\":\\\"e102a271-0593-45af-a6b3-5b473c7eebd3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:46Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.821128 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.821197 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:48:46 crc kubenswrapper[4739]: E1008 21:48:46.821289 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.821297 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:48:46 crc kubenswrapper[4739]: E1008 21:48:46.821424 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:48:46 crc kubenswrapper[4739]: E1008 21:48:46.821524 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.826486 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.826528 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.826539 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.826556 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.826570 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:46Z","lastTransitionTime":"2025-10-08T21:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:46 crc kubenswrapper[4739]: E1008 21:48:46.851088 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06df6f1f-7503-4cee-a52c-383dcfb4609d\\\",\\\"systemUUID\\\":\\\"e102a271-0593-45af-a6b3-5b473c7eebd3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:46Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.854492 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.854547 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.854558 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.854586 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.854598 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:46Z","lastTransitionTime":"2025-10-08T21:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:46 crc kubenswrapper[4739]: E1008 21:48:46.881356 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06df6f1f-7503-4cee-a52c-383dcfb4609d\\\",\\\"systemUUID\\\":\\\"e102a271-0593-45af-a6b3-5b473c7eebd3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:46Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.889333 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.889376 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.889387 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.889401 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.889411 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:46Z","lastTransitionTime":"2025-10-08T21:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.909491 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.909526 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.909535 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.909550 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.909560 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:46Z","lastTransitionTime":"2025-10-08T21:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:46 crc kubenswrapper[4739]: E1008 21:48:46.957272 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06df6f1f-7503-4cee-a52c-383dcfb4609d\\\",\\\"systemUUID\\\":\\\"e102a271-0593-45af-a6b3-5b473c7eebd3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:46Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:46 crc kubenswrapper[4739]: E1008 21:48:46.957382 4739 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.958789 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.958807 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.958815 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.958828 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:46 crc kubenswrapper[4739]: I1008 21:48:46.958837 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:46Z","lastTransitionTime":"2025-10-08T21:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.061865 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.061891 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.061899 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.061910 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.061920 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:47Z","lastTransitionTime":"2025-10-08T21:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.165021 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.165060 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.165071 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.165086 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.165097 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:47Z","lastTransitionTime":"2025-10-08T21:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.267080 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.267111 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.267118 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.267130 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.267138 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:47Z","lastTransitionTime":"2025-10-08T21:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.369236 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.369270 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.369280 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.369295 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.369306 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:47Z","lastTransitionTime":"2025-10-08T21:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.472326 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.472363 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.472372 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.472411 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.472424 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:47Z","lastTransitionTime":"2025-10-08T21:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.574618 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.574650 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.574660 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.574676 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.574686 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:47Z","lastTransitionTime":"2025-10-08T21:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.676416 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.676452 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.676463 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.676479 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.676488 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:47Z","lastTransitionTime":"2025-10-08T21:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.700977 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-jh2pw"] Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.701325 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-jh2pw" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.703203 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.703337 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.703515 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.730478 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:47Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.747882 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:47Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.760036 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b697a648-053d-4e99-97a9-620dd8397aaf-hosts-file\") pod \"node-resolver-jh2pw\" (UID: \"b697a648-053d-4e99-97a9-620dd8397aaf\") " pod="openshift-dns/node-resolver-jh2pw" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.760262 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxhfr\" (UniqueName: \"kubernetes.io/projected/b697a648-053d-4e99-97a9-620dd8397aaf-kube-api-access-pxhfr\") pod \"node-resolver-jh2pw\" (UID: \"b697a648-053d-4e99-97a9-620dd8397aaf\") " pod="openshift-dns/node-resolver-jh2pw" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.761192 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:47Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.774490 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:47Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.778918 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.778949 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.778975 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.778992 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.779002 4739 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:47Z","lastTransitionTime":"2025-10-08T21:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.789234 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:47Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.812996 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:47Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.832533 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:47Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.846258 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:47Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.861733 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxhfr\" (UniqueName: \"kubernetes.io/projected/b697a648-053d-4e99-97a9-620dd8397aaf-kube-api-access-pxhfr\") pod \"node-resolver-jh2pw\" (UID: \"b697a648-053d-4e99-97a9-620dd8397aaf\") " pod="openshift-dns/node-resolver-jh2pw" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.861773 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b697a648-053d-4e99-97a9-620dd8397aaf-hosts-file\") pod \"node-resolver-jh2pw\" (UID: \"b697a648-053d-4e99-97a9-620dd8397aaf\") " pod="openshift-dns/node-resolver-jh2pw" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.861843 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/b697a648-053d-4e99-97a9-620dd8397aaf-hosts-file\") pod \"node-resolver-jh2pw\" (UID: \"b697a648-053d-4e99-97a9-620dd8397aaf\") " pod="openshift-dns/node-resolver-jh2pw" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.861995 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:47Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.876936 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:47Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.880194 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxhfr\" (UniqueName: \"kubernetes.io/projected/b697a648-053d-4e99-97a9-620dd8397aaf-kube-api-access-pxhfr\") pod \"node-resolver-jh2pw\" (UID: \"b697a648-053d-4e99-97a9-620dd8397aaf\") " pod="openshift-dns/node-resolver-jh2pw" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.880829 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.880854 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.880863 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.880879 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.880891 4739 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:47Z","lastTransitionTime":"2025-10-08T21:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.982253 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.982296 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.982308 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.982326 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:47 crc kubenswrapper[4739]: I1008 21:48:47.982338 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:47Z","lastTransitionTime":"2025-10-08T21:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.018807 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-jh2pw" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.074999 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-dwvs2"] Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.075383 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.075818 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-hjvjs"] Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.076339 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-wwt88"] Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.076419 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.076541 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.078253 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.078867 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.079092 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.080120 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.080800 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.081220 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.081639 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.081682 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.081920 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.082079 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hfhrc"] Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.082124 4739 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.082410 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.083004 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.083854 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 08 21:48:48 crc kubenswrapper[4739]: W1008 21:48:48.084790 4739 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl": failed to list *v1.Secret: secrets "ovn-kubernetes-node-dockercfg-pwtwl" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Oct 08 21:48:48 crc kubenswrapper[4739]: E1008 21:48:48.084821 4739 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pwtwl\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"ovn-kubernetes-node-dockercfg-pwtwl\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 08 21:48:48 crc kubenswrapper[4739]: W1008 21:48:48.085044 4739 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovnkube-script-lib": failed to list *v1.ConfigMap: configmaps "ovnkube-script-lib" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Oct 08 21:48:48 crc kubenswrapper[4739]: E1008 
21:48:48.085069 4739 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"ovnkube-script-lib\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 08 21:48:48 crc kubenswrapper[4739]: W1008 21:48:48.085780 4739 reflector.go:561] object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Oct 08 21:48:48 crc kubenswrapper[4739]: W1008 21:48:48.085817 4739 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovnkube-config": failed to list *v1.ConfigMap: configmaps "ovnkube-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Oct 08 21:48:48 crc kubenswrapper[4739]: E1008 21:48:48.085836 4739 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"ovnkube-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 08 21:48:48 crc kubenswrapper[4739]: E1008 21:48:48.085818 4739 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: 
User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 08 21:48:48 crc kubenswrapper[4739]: W1008 21:48:48.085884 4739 reflector.go:561] object-"openshift-ovn-kubernetes"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Oct 08 21:48:48 crc kubenswrapper[4739]: E1008 21:48:48.085914 4739 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 08 21:48:48 crc kubenswrapper[4739]: W1008 21:48:48.085978 4739 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert": failed to list *v1.Secret: secrets "ovn-node-metrics-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Oct 08 21:48:48 crc kubenswrapper[4739]: E1008 21:48:48.085993 4739 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"ovn-node-metrics-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 08 21:48:48 crc kubenswrapper[4739]: W1008 
21:48:48.086544 4739 reflector.go:561] object-"openshift-ovn-kubernetes"/"env-overrides": failed to list *v1.ConfigMap: configmaps "env-overrides" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Oct 08 21:48:48 crc kubenswrapper[4739]: E1008 21:48:48.086566 4739 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"env-overrides\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"env-overrides\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.089553 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:48Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.090080 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.090172 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.090185 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:48 crc 
kubenswrapper[4739]: I1008 21:48:48.090202 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.090214 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:48Z","lastTransitionTime":"2025-10-08T21:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.101294 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:48Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 
21:48:48.114106 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:48Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.126372 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:48Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.147578 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:48Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.160044 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:48Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.164490 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-system-cni-dir\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.164540 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-multus-socket-dir-parent\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.164790 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-host-var-lib-cni-bin\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.164931 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-hostroot\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.164957 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-run-systemd\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.165125 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/4c6641d9-9ccf-42aa-8a83-c52d850aa766-ovn-node-metrics-cert\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.165190 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-host-run-netns\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.165210 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-multus-conf-dir\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.165439 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-os-release\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.165544 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-cni-netd\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.165596 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/4c6641d9-9ccf-42aa-8a83-c52d850aa766-ovnkube-script-lib\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.165616 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e6074d7a-f433-42bf-8c80-71963ba57484-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hjvjs\" (UID: \"e6074d7a-f433-42bf-8c80-71963ba57484\") " pod="openshift-multus/multus-additional-cni-plugins-hjvjs" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.165846 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9707b708-016c-4e06-86db-0332e2ca37db-rootfs\") pod \"machine-config-daemon-dwvs2\" (UID: \"9707b708-016c-4e06-86db-0332e2ca37db\") " pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.165874 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9707b708-016c-4e06-86db-0332e2ca37db-mcd-auth-proxy-config\") pod \"machine-config-daemon-dwvs2\" (UID: \"9707b708-016c-4e06-86db-0332e2ca37db\") " pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.171056 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e6074d7a-f433-42bf-8c80-71963ba57484-system-cni-dir\") pod \"multus-additional-cni-plugins-hjvjs\" (UID: \"e6074d7a-f433-42bf-8c80-71963ba57484\") " pod="openshift-multus/multus-additional-cni-plugins-hjvjs" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 
21:48:48.171136 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-cni-binary-copy\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.171195 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-run-ovn\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.171224 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.171260 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45972\" (UniqueName: \"kubernetes.io/projected/9707b708-016c-4e06-86db-0332e2ca37db-kube-api-access-45972\") pod \"machine-config-daemon-dwvs2\" (UID: \"9707b708-016c-4e06-86db-0332e2ca37db\") " pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.171430 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-kubelet\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.171502 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9707b708-016c-4e06-86db-0332e2ca37db-proxy-tls\") pod \"machine-config-daemon-dwvs2\" (UID: \"9707b708-016c-4e06-86db-0332e2ca37db\") " pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.171541 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-cni-bin\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.171573 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-host-var-lib-cni-multus\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.173450 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4c6641d9-9ccf-42aa-8a83-c52d850aa766-ovnkube-config\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.173514 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrwfj\" (UniqueName: \"kubernetes.io/projected/4c6641d9-9ccf-42aa-8a83-c52d850aa766-kube-api-access-rrwfj\") pod \"ovnkube-node-hfhrc\" (UID: 
\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.173537 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-run-netns\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.173616 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-log-socket\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.173676 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-run-ovn-kubernetes\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.173707 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e6074d7a-f433-42bf-8c80-71963ba57484-cni-binary-copy\") pod \"multus-additional-cni-plugins-hjvjs\" (UID: \"e6074d7a-f433-42bf-8c80-71963ba57484\") " pod="openshift-multus/multus-additional-cni-plugins-hjvjs" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.173728 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-etc-kubernetes\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.173884 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-multus-cni-dir\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.173984 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-multus-daemon-config\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.174029 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-etc-openvswitch\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.174066 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4c6641d9-9ccf-42aa-8a83-c52d850aa766-env-overrides\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.174138 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/e6074d7a-f433-42bf-8c80-71963ba57484-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hjvjs\" (UID: \"e6074d7a-f433-42bf-8c80-71963ba57484\") " pod="openshift-multus/multus-additional-cni-plugins-hjvjs" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.174246 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-systemd-units\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.174279 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-run-openvswitch\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.174309 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e6074d7a-f433-42bf-8c80-71963ba57484-os-release\") pod \"multus-additional-cni-plugins-hjvjs\" (UID: \"e6074d7a-f433-42bf-8c80-71963ba57484\") " pod="openshift-multus/multus-additional-cni-plugins-hjvjs" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.174341 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkj9s\" (UniqueName: \"kubernetes.io/projected/e6074d7a-f433-42bf-8c80-71963ba57484-kube-api-access-xkj9s\") pod \"multus-additional-cni-plugins-hjvjs\" (UID: \"e6074d7a-f433-42bf-8c80-71963ba57484\") " pod="openshift-multus/multus-additional-cni-plugins-hjvjs" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.174369 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-host-run-k8s-cni-cncf-io\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.174395 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-host-var-lib-kubelet\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.174422 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-host-run-multus-certs\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.174449 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95gzq\" (UniqueName: \"kubernetes.io/projected/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-kube-api-access-95gzq\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.174470 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e6074d7a-f433-42bf-8c80-71963ba57484-cnibin\") pod \"multus-additional-cni-plugins-hjvjs\" (UID: \"e6074d7a-f433-42bf-8c80-71963ba57484\") " pod="openshift-multus/multus-additional-cni-plugins-hjvjs" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.174495 4739 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-slash\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.174533 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-cnibin\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.174558 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-var-lib-openvswitch\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.174583 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-node-log\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.186529 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:48Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.192830 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.192860 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.192869 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.192883 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.192893 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:48Z","lastTransitionTime":"2025-10-08T21:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.199124 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:48Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.210883 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:48Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.220900 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:48Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.237935 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:48Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.250976 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:48Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.271984 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:48Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.275482 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-run-ovn\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.275531 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.275557 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45972\" (UniqueName: \"kubernetes.io/projected/9707b708-016c-4e06-86db-0332e2ca37db-kube-api-access-45972\") pod \"machine-config-daemon-dwvs2\" (UID: \"9707b708-016c-4e06-86db-0332e2ca37db\") " pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.275595 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-cni-binary-copy\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.275606 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.275606 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-run-ovn\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.275616 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-kubelet\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.275664 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-kubelet\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.275678 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/9707b708-016c-4e06-86db-0332e2ca37db-proxy-tls\") pod \"machine-config-daemon-dwvs2\" (UID: \"9707b708-016c-4e06-86db-0332e2ca37db\") " pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.275700 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-host-var-lib-cni-multus\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.275717 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-cni-bin\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.275735 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4c6641d9-9ccf-42aa-8a83-c52d850aa766-ovnkube-config\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.275750 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrwfj\" (UniqueName: \"kubernetes.io/projected/4c6641d9-9ccf-42aa-8a83-c52d850aa766-kube-api-access-rrwfj\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.275766 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/e6074d7a-f433-42bf-8c80-71963ba57484-cni-binary-copy\") pod \"multus-additional-cni-plugins-hjvjs\" (UID: \"e6074d7a-f433-42bf-8c80-71963ba57484\") " pod="openshift-multus/multus-additional-cni-plugins-hjvjs" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.275780 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-etc-kubernetes\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.275790 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-cni-bin\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.275794 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-run-netns\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.275810 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-run-netns\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.275824 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-log-socket\") pod 
\"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.275840 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-run-ovn-kubernetes\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.275867 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-etc-openvswitch\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.275884 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4c6641d9-9ccf-42aa-8a83-c52d850aa766-env-overrides\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.275898 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e6074d7a-f433-42bf-8c80-71963ba57484-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hjvjs\" (UID: \"e6074d7a-f433-42bf-8c80-71963ba57484\") " pod="openshift-multus/multus-additional-cni-plugins-hjvjs" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.275918 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-multus-cni-dir\") pod \"multus-wwt88\" (UID: 
\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.275932 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-multus-daemon-config\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.275947 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-systemd-units\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.275961 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-run-openvswitch\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.275976 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e6074d7a-f433-42bf-8c80-71963ba57484-os-release\") pod \"multus-additional-cni-plugins-hjvjs\" (UID: \"e6074d7a-f433-42bf-8c80-71963ba57484\") " pod="openshift-multus/multus-additional-cni-plugins-hjvjs" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.275990 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkj9s\" (UniqueName: \"kubernetes.io/projected/e6074d7a-f433-42bf-8c80-71963ba57484-kube-api-access-xkj9s\") pod \"multus-additional-cni-plugins-hjvjs\" (UID: \"e6074d7a-f433-42bf-8c80-71963ba57484\") " 
pod="openshift-multus/multus-additional-cni-plugins-hjvjs" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276018 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-host-var-lib-kubelet\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276032 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-host-run-multus-certs\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276046 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95gzq\" (UniqueName: \"kubernetes.io/projected/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-kube-api-access-95gzq\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276063 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e6074d7a-f433-42bf-8c80-71963ba57484-cnibin\") pod \"multus-additional-cni-plugins-hjvjs\" (UID: \"e6074d7a-f433-42bf-8c80-71963ba57484\") " pod="openshift-multus/multus-additional-cni-plugins-hjvjs" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276083 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-host-run-k8s-cni-cncf-io\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc 
kubenswrapper[4739]: I1008 21:48:48.276099 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-slash\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276121 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-cnibin\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276135 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-var-lib-openvswitch\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276179 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-node-log\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276200 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-host-var-lib-cni-bin\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276217 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"hostroot\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-hostroot\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276234 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-run-systemd\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276249 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4c6641d9-9ccf-42aa-8a83-c52d850aa766-ovn-node-metrics-cert\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276266 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-system-cni-dir\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276281 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-multus-socket-dir-parent\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276296 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-os-release\") pod \"multus-wwt88\" 
(UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276308 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-etc-kubernetes\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276311 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-host-run-netns\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276386 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-host-var-lib-cni-multus\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276393 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-multus-conf-dir\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276411 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-log-socket\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276418 4739 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9707b708-016c-4e06-86db-0332e2ca37db-rootfs\") pod \"machine-config-daemon-dwvs2\" (UID: \"9707b708-016c-4e06-86db-0332e2ca37db\") " pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276438 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-run-ovn-kubernetes\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276442 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9707b708-016c-4e06-86db-0332e2ca37db-mcd-auth-proxy-config\") pod \"machine-config-daemon-dwvs2\" (UID: \"9707b708-016c-4e06-86db-0332e2ca37db\") " pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276459 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-etc-openvswitch\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276465 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-cni-netd\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276489 4739 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4c6641d9-9ccf-42aa-8a83-c52d850aa766-ovnkube-script-lib\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276503 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-cni-binary-copy\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276511 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e6074d7a-f433-42bf-8c80-71963ba57484-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hjvjs\" (UID: \"e6074d7a-f433-42bf-8c80-71963ba57484\") " pod="openshift-multus/multus-additional-cni-plugins-hjvjs" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276538 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e6074d7a-f433-42bf-8c80-71963ba57484-system-cni-dir\") pod \"multus-additional-cni-plugins-hjvjs\" (UID: \"e6074d7a-f433-42bf-8c80-71963ba57484\") " pod="openshift-multus/multus-additional-cni-plugins-hjvjs" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276614 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e6074d7a-f433-42bf-8c80-71963ba57484-cni-binary-copy\") pod \"multus-additional-cni-plugins-hjvjs\" (UID: \"e6074d7a-f433-42bf-8c80-71963ba57484\") " pod="openshift-multus/multus-additional-cni-plugins-hjvjs" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276614 4739 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e6074d7a-f433-42bf-8c80-71963ba57484-system-cni-dir\") pod \"multus-additional-cni-plugins-hjvjs\" (UID: \"e6074d7a-f433-42bf-8c80-71963ba57484\") " pod="openshift-multus/multus-additional-cni-plugins-hjvjs" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276644 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-cni-netd\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276369 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-host-run-netns\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276672 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-multus-conf-dir\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276680 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9707b708-016c-4e06-86db-0332e2ca37db-rootfs\") pod \"machine-config-daemon-dwvs2\" (UID: \"9707b708-016c-4e06-86db-0332e2ca37db\") " pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276746 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-slash\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276855 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-multus-cni-dir\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276918 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-host-var-lib-kubelet\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276930 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-cnibin\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276963 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-run-openvswitch\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.276944 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-systemd-units\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.277003 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e6074d7a-f433-42bf-8c80-71963ba57484-os-release\") pod \"multus-additional-cni-plugins-hjvjs\" (UID: \"e6074d7a-f433-42bf-8c80-71963ba57484\") " pod="openshift-multus/multus-additional-cni-plugins-hjvjs" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.277028 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-var-lib-openvswitch\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.277085 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-node-log\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.277119 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e6074d7a-f433-42bf-8c80-71963ba57484-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hjvjs\" (UID: \"e6074d7a-f433-42bf-8c80-71963ba57484\") " pod="openshift-multus/multus-additional-cni-plugins-hjvjs" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.277133 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-host-var-lib-cni-bin\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc 
kubenswrapper[4739]: I1008 21:48:48.277193 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-host-run-multus-certs\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.277196 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-hostroot\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.277217 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-run-systemd\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.277254 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-multus-socket-dir-parent\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.277265 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9707b708-016c-4e06-86db-0332e2ca37db-mcd-auth-proxy-config\") pod \"machine-config-daemon-dwvs2\" (UID: \"9707b708-016c-4e06-86db-0332e2ca37db\") " pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.277289 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"os-release\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-os-release\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.277307 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-system-cni-dir\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.277327 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e6074d7a-f433-42bf-8c80-71963ba57484-cnibin\") pod \"multus-additional-cni-plugins-hjvjs\" (UID: \"e6074d7a-f433-42bf-8c80-71963ba57484\") " pod="openshift-multus/multus-additional-cni-plugins-hjvjs" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.277315 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-host-run-k8s-cni-cncf-io\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.277481 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-multus-daemon-config\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.279490 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9707b708-016c-4e06-86db-0332e2ca37db-proxy-tls\") pod \"machine-config-daemon-dwvs2\" (UID: 
\"9707b708-016c-4e06-86db-0332e2ca37db\") " pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.294896 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkj9s\" (UniqueName: \"kubernetes.io/projected/e6074d7a-f433-42bf-8c80-71963ba57484-kube-api-access-xkj9s\") pod \"multus-additional-cni-plugins-hjvjs\" (UID: \"e6074d7a-f433-42bf-8c80-71963ba57484\") " pod="openshift-multus/multus-additional-cni-plugins-hjvjs" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.295194 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.295244 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.295254 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.295267 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.295278 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:48Z","lastTransitionTime":"2025-10-08T21:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.300655 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45972\" (UniqueName: \"kubernetes.io/projected/9707b708-016c-4e06-86db-0332e2ca37db-kube-api-access-45972\") pod \"machine-config-daemon-dwvs2\" (UID: \"9707b708-016c-4e06-86db-0332e2ca37db\") " pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.301177 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\
"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o:
//44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b900
92272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:48Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.303699 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95gzq\" (UniqueName: \"kubernetes.io/projected/17ed1d5a-5f21-4dcf-bdb9-09e715f57027-kube-api-access-95gzq\") pod \"multus-wwt88\" (UID: \"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\") " pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.313761 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:48Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.330463 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:48Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.345831 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:48Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.358099 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:48Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.371491 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:48Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.383405 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T21:48:48Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.394938 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e6074d7a-f433-42bf-8c80-71963ba57484-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hjvjs\" (UID: \"e6074d7a-f433-42bf-8c80-71963ba57484\") " pod="openshift-multus/multus-additional-cni-plugins-hjvjs" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.395574 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:48Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.398261 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.398284 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.398292 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.398306 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.398315 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:48Z","lastTransitionTime":"2025-10-08T21:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.400462 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.407506 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" Oct 08 21:48:48 crc kubenswrapper[4739]: W1008 21:48:48.411961 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9707b708_016c_4e06_86db_0332e2ca37db.slice/crio-68bf2806075b0bc284ee764f749146e8029ad9e8206084f392a8d05b58e321b0 WatchSource:0}: Error finding container 68bf2806075b0bc284ee764f749146e8029ad9e8206084f392a8d05b58e321b0: Status 404 returned error can't find the container with id 68bf2806075b0bc284ee764f749146e8029ad9e8206084f392a8d05b58e321b0 Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.413362 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:48Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.418017 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-wwt88" Oct 08 21:48:48 crc kubenswrapper[4739]: W1008 21:48:48.423244 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6074d7a_f433_42bf_8c80_71963ba57484.slice/crio-2ac8dee77b98927e23bf4591364243fd33d6b1ffd6d6f9b5e1caac7203b95002 WatchSource:0}: Error finding container 2ac8dee77b98927e23bf4591364243fd33d6b1ffd6d6f9b5e1caac7203b95002: Status 404 returned error can't find the container with id 2ac8dee77b98927e23bf4591364243fd33d6b1ffd6d6f9b5e1caac7203b95002 Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.427572 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:48Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.440763 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:48Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.453690 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:48Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:48 crc kubenswrapper[4739]: W1008 21:48:48.469419 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17ed1d5a_5f21_4dcf_bdb9_09e715f57027.slice/crio-db89061fd381ed6997e448be675e35515ab673ed27d70bd3b3fd3b92df57d3ef WatchSource:0}: Error finding container db89061fd381ed6997e448be675e35515ab673ed27d70bd3b3fd3b92df57d3ef: Status 404 returned error can't find the container with id db89061fd381ed6997e448be675e35515ab673ed27d70bd3b3fd3b92df57d3ef Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 
21:48:48.478405 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:48:48 crc kubenswrapper[4739]: E1008 21:48:48.478570 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:48:56.478543404 +0000 UTC m=+36.303929154 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.478727 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.478865 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:48:48 crc kubenswrapper[4739]: E1008 21:48:48.478885 4739 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 21:48:48 crc kubenswrapper[4739]: E1008 21:48:48.479129 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 21:48:56.479109228 +0000 UTC m=+36.304494988 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 21:48:48 crc kubenswrapper[4739]: E1008 21:48:48.479018 4739 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 21:48:48 crc kubenswrapper[4739]: E1008 21:48:48.479661 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 21:48:56.479648801 +0000 UTC m=+36.305034641 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.504431 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.504497 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.504510 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.504527 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.504537 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:48Z","lastTransitionTime":"2025-10-08T21:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.579578 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.579673 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:48:48 crc kubenswrapper[4739]: E1008 21:48:48.579758 4739 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 21:48:48 crc kubenswrapper[4739]: E1008 21:48:48.579787 4739 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 21:48:48 crc kubenswrapper[4739]: E1008 21:48:48.579797 4739 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 21:48:48 crc kubenswrapper[4739]: E1008 21:48:48.579845 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 21:48:56.579828313 +0000 UTC m=+36.405214063 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 21:48:48 crc kubenswrapper[4739]: E1008 21:48:48.579893 4739 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 21:48:48 crc kubenswrapper[4739]: E1008 21:48:48.579920 4739 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 21:48:48 crc kubenswrapper[4739]: E1008 21:48:48.579938 4739 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 21:48:48 crc kubenswrapper[4739]: E1008 21:48:48.580001 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 21:48:56.579980136 +0000 UTC m=+36.405365926 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.611556 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.611593 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.611603 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.611618 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.611628 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:48Z","lastTransitionTime":"2025-10-08T21:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.714594 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.715000 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.715012 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.715030 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.715042 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:48Z","lastTransitionTime":"2025-10-08T21:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.817674 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.817711 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.817719 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.817735 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.817745 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:48Z","lastTransitionTime":"2025-10-08T21:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.820604 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.820685 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:48:48 crc kubenswrapper[4739]: E1008 21:48:48.820719 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.820604 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:48:48 crc kubenswrapper[4739]: E1008 21:48:48.820817 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:48:48 crc kubenswrapper[4739]: E1008 21:48:48.820897 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.920222 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.920270 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.920284 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.920301 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.920313 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:48Z","lastTransitionTime":"2025-10-08T21:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.981748 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jh2pw" event={"ID":"b697a648-053d-4e99-97a9-620dd8397aaf","Type":"ContainerStarted","Data":"55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325"} Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.981790 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jh2pw" event={"ID":"b697a648-053d-4e99-97a9-620dd8397aaf","Type":"ContainerStarted","Data":"e9d4c256f213b20f6d4532aa606b8177f757734bd7707b62b4a2aa03bb3f7830"} Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.983885 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wwt88" event={"ID":"17ed1d5a-5f21-4dcf-bdb9-09e715f57027","Type":"ContainerStarted","Data":"d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58"} Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.983919 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wwt88" event={"ID":"17ed1d5a-5f21-4dcf-bdb9-09e715f57027","Type":"ContainerStarted","Data":"db89061fd381ed6997e448be675e35515ab673ed27d70bd3b3fd3b92df57d3ef"} Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.985642 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" event={"ID":"e6074d7a-f433-42bf-8c80-71963ba57484","Type":"ContainerStarted","Data":"a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed"} Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.985707 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" event={"ID":"e6074d7a-f433-42bf-8c80-71963ba57484","Type":"ContainerStarted","Data":"2ac8dee77b98927e23bf4591364243fd33d6b1ffd6d6f9b5e1caac7203b95002"} Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.987320 4739 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.989476 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerStarted","Data":"b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d"} Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.989542 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerStarted","Data":"68bf2806075b0bc284ee764f749146e8029ad9e8206084f392a8d05b58e321b0"} Oct 08 21:48:48 crc kubenswrapper[4739]: I1008 21:48:48.999383 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.007378 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4c6641d9-9ccf-42aa-8a83-c52d850aa766-ovnkube-config\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.023448 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.023492 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.023508 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.023528 4739 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.023541 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:49Z","lastTransitionTime":"2025-10-08T21:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.026421 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:49Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.063182 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:49Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.093161 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"dat
a-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441
ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:49Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.117804 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:49Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.125784 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.125836 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.125845 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.125863 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.125878 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:49Z","lastTransitionTime":"2025-10-08T21:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.134417 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:49Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.145891 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:49Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.157080 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2b
b72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:49Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.166981 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:49Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.182531 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:49Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.190240 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.198943 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:49Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.211668 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:49Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.224839 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:49Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.228221 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.228290 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.228316 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 
21:48:49.228341 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.228355 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:49Z","lastTransitionTime":"2025-10-08T21:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.237512 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:49Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.249392 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:49Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.264132 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:49Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:49 crc kubenswrapper[4739]: E1008 21:48:49.277383 4739 configmap.go:193] Couldn't get configMap openshift-ovn-kubernetes/env-overrides: failed to sync configmap cache: timed out waiting for the condition Oct 08 21:48:49 crc kubenswrapper[4739]: E1008 21:48:49.277430 4739 configmap.go:193] Couldn't get configMap openshift-ovn-kubernetes/ovnkube-script-lib: failed to sync configmap cache: timed out waiting for the condition Oct 08 21:48:49 crc kubenswrapper[4739]: E1008 21:48:49.277510 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4c6641d9-9ccf-42aa-8a83-c52d850aa766-env-overrides podName:4c6641d9-9ccf-42aa-8a83-c52d850aa766 nodeName:}" failed. No retries permitted until 2025-10-08 21:48:49.777483897 +0000 UTC m=+29.602869657 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "env-overrides" (UniqueName: "kubernetes.io/configmap/4c6641d9-9ccf-42aa-8a83-c52d850aa766-env-overrides") pod "ovnkube-node-hfhrc" (UID: "4c6641d9-9ccf-42aa-8a83-c52d850aa766") : failed to sync configmap cache: timed out waiting for the condition Oct 08 21:48:49 crc kubenswrapper[4739]: E1008 21:48:49.277538 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4c6641d9-9ccf-42aa-8a83-c52d850aa766-ovnkube-script-lib podName:4c6641d9-9ccf-42aa-8a83-c52d850aa766 nodeName:}" failed. No retries permitted until 2025-10-08 21:48:49.777527598 +0000 UTC m=+29.602913368 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "ovnkube-script-lib" (UniqueName: "kubernetes.io/configmap/4c6641d9-9ccf-42aa-8a83-c52d850aa766-ovnkube-script-lib") pod "ovnkube-node-hfhrc" (UID: "4c6641d9-9ccf-42aa-8a83-c52d850aa766") : failed to sync configmap cache: timed out waiting for the condition Oct 08 21:48:49 crc kubenswrapper[4739]: E1008 21:48:49.277625 4739 secret.go:188] Couldn't get secret openshift-ovn-kubernetes/ovn-node-metrics-cert: failed to sync secret cache: timed out waiting for the condition Oct 08 21:48:49 crc kubenswrapper[4739]: E1008 21:48:49.277743 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c6641d9-9ccf-42aa-8a83-c52d850aa766-ovn-node-metrics-cert podName:4c6641d9-9ccf-42aa-8a83-c52d850aa766 nodeName:}" failed. No retries permitted until 2025-10-08 21:48:49.777710892 +0000 UTC m=+29.603096652 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "ovn-node-metrics-cert" (UniqueName: "kubernetes.io/secret/4c6641d9-9ccf-42aa-8a83-c52d850aa766-ovn-node-metrics-cert") pod "ovnkube-node-hfhrc" (UID: "4c6641d9-9ccf-42aa-8a83-c52d850aa766") : failed to sync secret cache: timed out waiting for the condition Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.281683 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:49Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.295759 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:49Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:49 crc kubenswrapper[4739]: E1008 21:48:49.301290 4739 projected.go:288] Couldn't get configMap openshift-ovn-kubernetes/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Oct 08 21:48:49 crc kubenswrapper[4739]: E1008 21:48:49.301390 4739 projected.go:194] Error preparing data for projected volume kube-api-access-rrwfj for pod openshift-ovn-kubernetes/ovnkube-node-hfhrc: failed to sync configmap cache: timed out waiting for the condition Oct 08 21:48:49 crc kubenswrapper[4739]: E1008 21:48:49.301503 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4c6641d9-9ccf-42aa-8a83-c52d850aa766-kube-api-access-rrwfj podName:4c6641d9-9ccf-42aa-8a83-c52d850aa766 nodeName:}" failed. No retries permitted until 2025-10-08 21:48:49.801460988 +0000 UTC m=+29.626846738 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-rrwfj" (UniqueName: "kubernetes.io/projected/4c6641d9-9ccf-42aa-8a83-c52d850aa766-kube-api-access-rrwfj") pod "ovnkube-node-hfhrc" (UID: "4c6641d9-9ccf-42aa-8a83-c52d850aa766") : failed to sync configmap cache: timed out waiting for the condition Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.313395 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:49Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.330480 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:49Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.331541 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.331580 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.331592 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.331612 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.331626 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:49Z","lastTransitionTime":"2025-10-08T21:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.344248 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:49Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.360107 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:49Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.365738 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.372204 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.373200 4739 
status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"
hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:49Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.391993 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:49Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.395778 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.413185 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-ce
rt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-open
vswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-
openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:49Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.434744 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.434810 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.434822 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.434845 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.434860 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:49Z","lastTransitionTime":"2025-10-08T21:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.437563 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:49Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.453772 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:49Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.471884 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:49Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.488700 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:49Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.537461 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:49 crc 
kubenswrapper[4739]: I1008 21:48:49.537537 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.537552 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.537576 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.537595 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:49Z","lastTransitionTime":"2025-10-08T21:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.638712 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.639706 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.639763 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.639775 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.639799 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.639812 4739 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:49Z","lastTransitionTime":"2025-10-08T21:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.742314 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.742376 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.742387 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.742410 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.742426 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:49Z","lastTransitionTime":"2025-10-08T21:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.797438 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4c6641d9-9ccf-42aa-8a83-c52d850aa766-ovn-node-metrics-cert\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.797487 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4c6641d9-9ccf-42aa-8a83-c52d850aa766-ovnkube-script-lib\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.797559 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4c6641d9-9ccf-42aa-8a83-c52d850aa766-env-overrides\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.798136 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4c6641d9-9ccf-42aa-8a83-c52d850aa766-env-overrides\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.798457 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4c6641d9-9ccf-42aa-8a83-c52d850aa766-ovnkube-script-lib\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:49 crc 
kubenswrapper[4739]: I1008 21:48:49.806708 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4c6641d9-9ccf-42aa-8a83-c52d850aa766-ovn-node-metrics-cert\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.844073 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.844614 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.844715 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.844808 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.844888 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:49Z","lastTransitionTime":"2025-10-08T21:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.898734 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrwfj\" (UniqueName: \"kubernetes.io/projected/4c6641d9-9ccf-42aa-8a83-c52d850aa766-kube-api-access-rrwfj\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.902511 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrwfj\" (UniqueName: \"kubernetes.io/projected/4c6641d9-9ccf-42aa-8a83-c52d850aa766-kube-api-access-rrwfj\") pod \"ovnkube-node-hfhrc\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.925040 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.946565 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.946607 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.946622 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.946642 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.946657 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:49Z","lastTransitionTime":"2025-10-08T21:48:49Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.994328 4739 generic.go:334] "Generic (PLEG): container finished" podID="e6074d7a-f433-42bf-8c80-71963ba57484" containerID="a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed" exitCode=0 Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.994416 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" event={"ID":"e6074d7a-f433-42bf-8c80-71963ba57484","Type":"ContainerDied","Data":"a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed"} Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.996270 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerStarted","Data":"aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4"} Oct 08 21:48:49 crc kubenswrapper[4739]: I1008 21:48:49.998222 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" event={"ID":"4c6641d9-9ccf-42aa-8a83-c52d850aa766","Type":"ContainerStarted","Data":"5ac3717ac6722e968798dbac2846896abeebcd93e652c4a6f093503b0c023137"} Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.012996 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 
21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:50Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.034002 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:50Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.049024 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.049076 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.049089 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.049109 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.049123 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:50Z","lastTransitionTime":"2025-10-08T21:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.060817 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\
",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244c
a887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:50Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.076136 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:50Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.093723 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:50Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.110771 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:50Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.122338 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:50Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.136223 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:50Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.148329 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T21:48:50Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.152255 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.152312 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.152323 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.152353 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.152368 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:50Z","lastTransitionTime":"2025-10-08T21:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.163887 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:50Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.177471 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:50Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.192403 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:50Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.206506 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:50Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.221262 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:50Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.238359 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:50Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.255982 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.256020 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.256030 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.256048 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.256062 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:50Z","lastTransitionTime":"2025-10-08T21:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.262560 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:50Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.283600 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33eb
e1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:50Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.302081 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:50Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.323607 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:50Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.339404 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:50Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.358791 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:50 crc 
kubenswrapper[4739]: I1008 21:48:50.358829 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.358840 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.358859 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.358869 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:50Z","lastTransitionTime":"2025-10-08T21:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.360317 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T21:48:50Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.374684 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:50Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.388686 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:50Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.403406 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:50Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.414822 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:50Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.429511 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:50Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.445188 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:50Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.461728 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.461791 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.462003 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.462031 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.462048 4739 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:50Z","lastTransitionTime":"2025-10-08T21:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.462832 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:50Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.565216 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.565284 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.565303 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.565330 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.565352 4739 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:50Z","lastTransitionTime":"2025-10-08T21:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.668776 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.668844 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.668857 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.668881 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.668951 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:50Z","lastTransitionTime":"2025-10-08T21:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.773072 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.773136 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.773169 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.773195 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.773212 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:50Z","lastTransitionTime":"2025-10-08T21:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.821005 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.821114 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:48:50 crc kubenswrapper[4739]: E1008 21:48:50.821193 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.821296 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:48:50 crc kubenswrapper[4739]: E1008 21:48:50.821325 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:48:50 crc kubenswrapper[4739]: E1008 21:48:50.821548 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.876125 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.876217 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.876234 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.876260 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.876277 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:50Z","lastTransitionTime":"2025-10-08T21:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.978987 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.979039 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.979053 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.979076 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:50 crc kubenswrapper[4739]: I1008 21:48:50.979093 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:50Z","lastTransitionTime":"2025-10-08T21:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.005683 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" event={"ID":"e6074d7a-f433-42bf-8c80-71963ba57484","Type":"ContainerStarted","Data":"ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f"} Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.009830 4739 generic.go:334] "Generic (PLEG): container finished" podID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerID="5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b" exitCode=0 Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.010465 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" event={"ID":"4c6641d9-9ccf-42aa-8a83-c52d850aa766","Type":"ContainerDied","Data":"5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b"} Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.037393 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.072121 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.081404 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.081493 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.081508 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.081531 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.081546 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:51Z","lastTransitionTime":"2025-10-08T21:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.087696 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z 
is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.113938 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10
-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.131586 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.153241 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.169861 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.184870 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.184935 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.184953 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.184980 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.184999 4739 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:51Z","lastTransitionTime":"2025-10-08T21:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.193616 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.214358 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.232919 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.246663 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.259093 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.272880 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.287471 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.287538 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.287549 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.287570 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.287587 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:51Z","lastTransitionTime":"2025-10-08T21:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.291455 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.309349 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.335820 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.358340 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.378780 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.389951 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.390020 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.390029 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.390045 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.390057 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:51Z","lastTransitionTime":"2025-10-08T21:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.394550 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.412802 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.430377 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.450761 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.465421 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.481098 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.493921 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.493962 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 
21:48:51.493971 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.493986 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.493996 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:51Z","lastTransitionTime":"2025-10-08T21:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.498112 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.518671 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.537111 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.560791 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.598971 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.599014 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.599023 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.599057 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.599067 4739 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:51Z","lastTransitionTime":"2025-10-08T21:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.701871 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.701915 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.701928 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.701949 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.701961 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:51Z","lastTransitionTime":"2025-10-08T21:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.806972 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.807009 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.807073 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.807096 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.807106 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:51Z","lastTransitionTime":"2025-10-08T21:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.828105 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-8p5bp"] Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.828586 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-8p5bp" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.830275 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.831275 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.831848 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.834787 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.845001 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.870457 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.891027 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.907525 4739 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-wwt88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.910992 4739 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.911029 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.911036 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.911050 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.911063 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:51Z","lastTransitionTime":"2025-10-08T21:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.922604 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da3d9049-8f29-4235-8d91-e565cb0d157c-host\") pod \"node-ca-8p5bp\" (UID: \"da3d9049-8f29-4235-8d91-e565cb0d157c\") " pod="openshift-image-registry/node-ca-8p5bp" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.922675 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/da3d9049-8f29-4235-8d91-e565cb0d157c-serviceca\") pod \"node-ca-8p5bp\" (UID: \"da3d9049-8f29-4235-8d91-e565cb0d157c\") " pod="openshift-image-registry/node-ca-8p5bp" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.922762 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nhxr\" (UniqueName: \"kubernetes.io/projected/da3d9049-8f29-4235-8d91-e565cb0d157c-kube-api-access-5nhxr\") pod \"node-ca-8p5bp\" (UID: \"da3d9049-8f29-4235-8d91-e565cb0d157c\") " pod="openshift-image-registry/node-ca-8p5bp" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.929993 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.950541 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.967921 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:51 crc kubenswrapper[4739]: I1008 21:48:51.996060 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.013985 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.014352 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.014561 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.014641 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.014706 4739 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:52Z","lastTransitionTime":"2025-10-08T21:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.021058 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" event={"ID":"4c6641d9-9ccf-42aa-8a83-c52d850aa766","Type":"ContainerStarted","Data":"a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2"} Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.021125 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" event={"ID":"4c6641d9-9ccf-42aa-8a83-c52d850aa766","Type":"ContainerStarted","Data":"43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753"} Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.023460 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/da3d9049-8f29-4235-8d91-e565cb0d157c-serviceca\") pod \"node-ca-8p5bp\" (UID: \"da3d9049-8f29-4235-8d91-e565cb0d157c\") " pod="openshift-image-registry/node-ca-8p5bp" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.023588 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nhxr\" (UniqueName: \"kubernetes.io/projected/da3d9049-8f29-4235-8d91-e565cb0d157c-kube-api-access-5nhxr\") pod \"node-ca-8p5bp\" (UID: \"da3d9049-8f29-4235-8d91-e565cb0d157c\") " pod="openshift-image-registry/node-ca-8p5bp" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.023613 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/da3d9049-8f29-4235-8d91-e565cb0d157c-host\") pod \"node-ca-8p5bp\" (UID: \"da3d9049-8f29-4235-8d91-e565cb0d157c\") " pod="openshift-image-registry/node-ca-8p5bp" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.023696 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da3d9049-8f29-4235-8d91-e565cb0d157c-host\") pod \"node-ca-8p5bp\" (UID: \"da3d9049-8f29-4235-8d91-e565cb0d157c\") " pod="openshift-image-registry/node-ca-8p5bp" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.025183 4739 generic.go:334] "Generic (PLEG): container finished" podID="e6074d7a-f433-42bf-8c80-71963ba57484" containerID="ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f" exitCode=0 Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.025249 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" event={"ID":"e6074d7a-f433-42bf-8c80-71963ba57484","Type":"ContainerDied","Data":"ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f"} Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.025635 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/da3d9049-8f29-4235-8d91-e565cb0d157c-serviceca\") pod \"node-ca-8p5bp\" (UID: \"da3d9049-8f29-4235-8d91-e565cb0d157c\") " pod="openshift-image-registry/node-ca-8p5bp" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.027750 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T21:48:52Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.065990 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:52Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.081337 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nhxr\" (UniqueName: 
\"kubernetes.io/projected/da3d9049-8f29-4235-8d91-e565cb0d157c-kube-api-access-5nhxr\") pod \"node-ca-8p5bp\" (UID: \"da3d9049-8f29-4235-8d91-e565cb0d157c\") " pod="openshift-image-registry/node-ca-8p5bp" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.099594 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:52Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.119023 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.119177 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.119271 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.119387 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.119472 4739 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:52Z","lastTransitionTime":"2025-10-08T21:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.120301 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:52Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.136130 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:52Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.154263 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:52Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.167538 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:52Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.181878 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:52Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.202647 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T21:48:52Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.218225 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:52Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.222339 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 
21:48:52.222375 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.222450 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.222473 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.222489 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:52Z","lastTransitionTime":"2025-10-08T21:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.233318 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:52Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.247284 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:52Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.264117 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:52Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.279843 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:52Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.293550 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:52Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.306632 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8p5bp" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.314987 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:52Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:52 crc kubenswrapper[4739]: W1008 21:48:52.322818 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda3d9049_8f29_4235_8d91_e565cb0d157c.slice/crio-6d087287f6aefa317952535f172bee976704fa636ac2cfdf35001753c9534a18 WatchSource:0}: Error finding container 6d087287f6aefa317952535f172bee976704fa636ac2cfdf35001753c9534a18: Status 404 returned error can't find the container with id 6d087287f6aefa317952535f172bee976704fa636ac2cfdf35001753c9534a18 Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.324749 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.324782 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.324792 
4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.324807 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.324816 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:52Z","lastTransitionTime":"2025-10-08T21:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.334783 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:52Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.349365 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:52Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.358896 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8p5bp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3d9049-8f29-4235-8d91-e565cb0d157c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nhxr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8p5bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:52Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.378002 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:52Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.413796 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:52Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.430978 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.431023 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.431040 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.431063 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.431078 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:52Z","lastTransitionTime":"2025-10-08T21:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.534602 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.535080 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.535091 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.535109 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.535159 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:52Z","lastTransitionTime":"2025-10-08T21:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.640390 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.640435 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.640447 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.640464 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.640476 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:52Z","lastTransitionTime":"2025-10-08T21:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.743375 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.743421 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.743432 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.743448 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.743460 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:52Z","lastTransitionTime":"2025-10-08T21:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.820972 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.821053 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:48:52 crc kubenswrapper[4739]: E1008 21:48:52.821222 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.821318 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:48:52 crc kubenswrapper[4739]: E1008 21:48:52.821407 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:48:52 crc kubenswrapper[4739]: E1008 21:48:52.821610 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.847058 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.847110 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.847126 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.847177 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.847196 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:52Z","lastTransitionTime":"2025-10-08T21:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.952599 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.952655 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.952672 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.952697 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:52 crc kubenswrapper[4739]: I1008 21:48:52.952714 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:52Z","lastTransitionTime":"2025-10-08T21:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.032035 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8p5bp" event={"ID":"da3d9049-8f29-4235-8d91-e565cb0d157c","Type":"ContainerStarted","Data":"6d087287f6aefa317952535f172bee976704fa636ac2cfdf35001753c9534a18"} Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.035845 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" event={"ID":"4c6641d9-9ccf-42aa-8a83-c52d850aa766","Type":"ContainerStarted","Data":"e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6"} Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.055800 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.055863 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.055885 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.055921 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.055947 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:53Z","lastTransitionTime":"2025-10-08T21:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.158427 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.158470 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.158480 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.158497 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.158509 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:53Z","lastTransitionTime":"2025-10-08T21:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.261441 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.261501 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.261512 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.261531 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.261542 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:53Z","lastTransitionTime":"2025-10-08T21:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.364310 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.364338 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.364346 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.364359 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.364367 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:53Z","lastTransitionTime":"2025-10-08T21:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.466225 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.466489 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.466566 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.466663 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.466718 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:53Z","lastTransitionTime":"2025-10-08T21:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.569848 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.569893 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.569904 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.569922 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.569934 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:53Z","lastTransitionTime":"2025-10-08T21:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.673493 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.673537 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.673548 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.673565 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.673578 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:53Z","lastTransitionTime":"2025-10-08T21:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.778875 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.779328 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.779349 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.779372 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.779388 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:53Z","lastTransitionTime":"2025-10-08T21:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.881613 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.881673 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.881692 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.881720 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.881748 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:53Z","lastTransitionTime":"2025-10-08T21:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.984463 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.984527 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.984543 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.984569 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:53 crc kubenswrapper[4739]: I1008 21:48:53.984585 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:53Z","lastTransitionTime":"2025-10-08T21:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.041883 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8p5bp" event={"ID":"da3d9049-8f29-4235-8d91-e565cb0d157c","Type":"ContainerStarted","Data":"392091423b110eec80dbbfa9c863d035369d2ae18786c7e1464220f884ef9af8"} Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.045904 4739 generic.go:334] "Generic (PLEG): container finished" podID="e6074d7a-f433-42bf-8c80-71963ba57484" containerID="b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4" exitCode=0 Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.046001 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" event={"ID":"e6074d7a-f433-42bf-8c80-71963ba57484","Type":"ContainerDied","Data":"b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4"} Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.049496 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" event={"ID":"4c6641d9-9ccf-42aa-8a83-c52d850aa766","Type":"ContainerStarted","Data":"d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf"} Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.049529 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" event={"ID":"4c6641d9-9ccf-42aa-8a83-c52d850aa766","Type":"ContainerStarted","Data":"a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42"} Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.067439 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:54Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.084531 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:54Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.087566 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.087619 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.087630 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.087644 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.087672 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:54Z","lastTransitionTime":"2025-10-08T21:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.101973 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:54Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.116211 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:54Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.127948 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8p5bp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3d9049-8f29-4235-8d91-e565cb0d157c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392091423b110eec80dbbfa9c863d035369d2ae18786c7e1464220f884ef9af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nhxr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8p5bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:54Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.142571 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:54Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.159952 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:54Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.175204 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T21:48:54Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.183992 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.188893 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:54Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 
21:48:54.191559 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.191598 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.191610 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.191628 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.191642 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:54Z","lastTransitionTime":"2025-10-08T21:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.204618 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e81896
09681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:54Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.224307 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:54Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.243843 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:54Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.257514 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:54Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.279088 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:54Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.294454 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.294523 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.294541 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.294567 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.294585 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:54Z","lastTransitionTime":"2025-10-08T21:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.311468 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:54Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.332828 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335
e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating 
requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:54Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.366958 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:54Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.390561 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:54Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.397185 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.397214 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.397223 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.397239 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.397247 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:54Z","lastTransitionTime":"2025-10-08T21:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.407988 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:54Z 
is after 2025-08-24T17:21:41Z" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.418784 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8p5bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3d9049-8f29-4235-8d91-e565cb0d157c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392091423b110eec80dbbfa9c863d035369d2ae18786c7e1464220f884ef9af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nhxr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8p5bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:54Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.449844 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26
702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d
9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:54Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.465515 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:54Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.480956 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:54Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.497606 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:54Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.499525 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.499579 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.499631 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.499657 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.499674 4739 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:54Z","lastTransitionTime":"2025-10-08T21:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.515652 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:54Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.533065 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:54Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.550843 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:54Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.563608 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:54Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.583832 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:54Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.602004 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.602068 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 
21:48:54.602086 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.602110 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.602127 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:54Z","lastTransitionTime":"2025-10-08T21:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.604562 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:54Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.705801 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.705859 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.705884 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 
21:48:54.705914 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.705934 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:54Z","lastTransitionTime":"2025-10-08T21:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.808519 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.808557 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.808568 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.808583 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.808594 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:54Z","lastTransitionTime":"2025-10-08T21:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.821264 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.821283 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:48:54 crc kubenswrapper[4739]: E1008 21:48:54.821377 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:48:54 crc kubenswrapper[4739]: E1008 21:48:54.821579 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.821675 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:48:54 crc kubenswrapper[4739]: E1008 21:48:54.821748 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.910528 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.910730 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.910813 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.910921 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:54 crc kubenswrapper[4739]: I1008 21:48:54.910998 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:54Z","lastTransitionTime":"2025-10-08T21:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.013867 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.014938 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.015055 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.015210 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.015325 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:55Z","lastTransitionTime":"2025-10-08T21:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.056809 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" event={"ID":"4c6641d9-9ccf-42aa-8a83-c52d850aa766","Type":"ContainerStarted","Data":"f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a"} Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.059186 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" event={"ID":"e6074d7a-f433-42bf-8c80-71963ba57484","Type":"ContainerStarted","Data":"33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a"} Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.078933 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:55Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.102920 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:55Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.119303 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.119346 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.119358 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.119379 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.119393 4739 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:55Z","lastTransitionTime":"2025-10-08T21:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.122385 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:55Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.140972 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:55Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.154547 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:55Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.170052 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:55Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.183687 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:55Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.199917 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:55Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.221760 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.221807 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 
21:48:55.221821 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.221840 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.221854 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:55Z","lastTransitionTime":"2025-10-08T21:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.224448 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:55Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.239452 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4e
fb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:55Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.255549 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:55Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.280668 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:55Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.302414 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"na
me\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:55Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.319322 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8p5bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3d9049-8f29-4235-8d91-e565cb0d157c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392091423b110eec80dbbfa9c863d035369d2ae18786c7e1464220f884ef9af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nhxr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8p5bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:55Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.324350 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.324383 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.324394 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.324409 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.324419 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:55Z","lastTransitionTime":"2025-10-08T21:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.352298 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
25-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:55Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.426760 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.426797 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.426809 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.426823 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.426836 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:55Z","lastTransitionTime":"2025-10-08T21:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.528990 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.529037 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.529048 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.529067 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.529079 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:55Z","lastTransitionTime":"2025-10-08T21:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.631590 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.631854 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.631943 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.632026 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.632109 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:55Z","lastTransitionTime":"2025-10-08T21:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.735078 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.735123 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.735138 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.735172 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.735185 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:55Z","lastTransitionTime":"2025-10-08T21:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.837755 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.837817 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.837835 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.837859 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.837877 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:55Z","lastTransitionTime":"2025-10-08T21:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.941259 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.941324 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.941341 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.941366 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:55 crc kubenswrapper[4739]: I1008 21:48:55.941385 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:55Z","lastTransitionTime":"2025-10-08T21:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.043840 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.043882 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.043893 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.043910 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.043921 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:56Z","lastTransitionTime":"2025-10-08T21:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.067489 4739 generic.go:334] "Generic (PLEG): container finished" podID="e6074d7a-f433-42bf-8c80-71963ba57484" containerID="33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a" exitCode=0 Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.067527 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" event={"ID":"e6074d7a-f433-42bf-8c80-71963ba57484","Type":"ContainerDied","Data":"33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a"} Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.088815 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:56Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.103304 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:56Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.117920 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:56Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.129657 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\
\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:56Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.144934 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8p5bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3d9049-8f29-4235-8d91-e565cb0d157c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392091423b110eec80dbbfa9c863d035369d2ae18786c7e1464220f884ef9af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nhxr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8p5bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:56Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.146879 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.146930 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.146945 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.146962 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.146974 4739 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:56Z","lastTransitionTime":"2025-10-08T21:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.162871 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:56Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.182263 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:56Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.198445 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T21:48:56Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.212060 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:56Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.226025 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:56Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.242867 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:56Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.250312 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.250370 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.250390 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 
21:48:56.250414 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.250432 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:56Z","lastTransitionTime":"2025-10-08T21:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.256408 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:56Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.269763 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:56Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.282795 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:56Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.310021 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:56Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.352717 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.352773 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.352786 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.352807 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.352825 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:56Z","lastTransitionTime":"2025-10-08T21:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.455374 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.455414 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.455425 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.455442 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.455451 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:56Z","lastTransitionTime":"2025-10-08T21:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.557786 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.557819 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.557828 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.557840 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.557849 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:56Z","lastTransitionTime":"2025-10-08T21:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.569469 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.569564 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:48:56 crc kubenswrapper[4739]: E1008 21:48:56.569618 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:49:12.569599839 +0000 UTC m=+52.394985599 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:48:56 crc kubenswrapper[4739]: E1008 21:48:56.569648 4739 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.569688 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:48:56 crc kubenswrapper[4739]: E1008 21:48:56.569703 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 21:49:12.569686291 +0000 UTC m=+52.395072081 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 21:48:56 crc kubenswrapper[4739]: E1008 21:48:56.569766 4739 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 21:48:56 crc kubenswrapper[4739]: E1008 21:48:56.569801 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 21:49:12.569789823 +0000 UTC m=+52.395175573 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.659811 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.659954 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.659975 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.660029 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 
08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.660050 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:56Z","lastTransitionTime":"2025-10-08T21:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.670245 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.670290 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:48:56 crc kubenswrapper[4739]: E1008 21:48:56.670399 4739 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 21:48:56 crc kubenswrapper[4739]: E1008 21:48:56.670422 4739 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 21:48:56 crc kubenswrapper[4739]: E1008 21:48:56.670437 4739 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 21:48:56 crc kubenswrapper[4739]: E1008 21:48:56.670485 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 21:49:12.670469737 +0000 UTC m=+52.495855487 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 21:48:56 crc kubenswrapper[4739]: E1008 21:48:56.670500 4739 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 21:48:56 crc kubenswrapper[4739]: E1008 21:48:56.670535 4739 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 21:48:56 crc kubenswrapper[4739]: E1008 21:48:56.670554 4739 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 21:48:56 crc kubenswrapper[4739]: E1008 21:48:56.670606 4739 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 21:49:12.67058868 +0000 UTC m=+52.495974460 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.762685 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.762737 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.762755 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.762779 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.762795 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:56Z","lastTransitionTime":"2025-10-08T21:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.821000 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.821045 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:48:56 crc kubenswrapper[4739]: E1008 21:48:56.821169 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.821336 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:48:56 crc kubenswrapper[4739]: E1008 21:48:56.821423 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:48:56 crc kubenswrapper[4739]: E1008 21:48:56.822238 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.866272 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.866321 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.866333 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.866350 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.866359 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:56Z","lastTransitionTime":"2025-10-08T21:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.969435 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.969492 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.969511 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.969531 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:56 crc kubenswrapper[4739]: I1008 21:48:56.969545 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:56Z","lastTransitionTime":"2025-10-08T21:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.075390 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" event={"ID":"e6074d7a-f433-42bf-8c80-71963ba57484","Type":"ContainerStarted","Data":"1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd"} Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.081959 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.082018 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.082035 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.082061 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.082080 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:57Z","lastTransitionTime":"2025-10-08T21:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.114925 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:57Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.131685 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:57Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.160752 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj
9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:57Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.184894 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126
.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:57Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.185375 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.185446 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.185471 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.185500 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.185527 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:57Z","lastTransitionTime":"2025-10-08T21:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.203558 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8p5bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3d9049-8f29-4235-8d91-e565cb0d157c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392091423b110eec80dbbfa9c863d035369d2ae18786c7e1464220f884ef9af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nhxr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8p5bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:57Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.225246 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:57Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.241790 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:57Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.256915 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T21:48:57Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.271882 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:57Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.288888 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 
21:48:57.288935 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.288946 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.288966 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.288978 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:57Z","lastTransitionTime":"2025-10-08T21:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.290060 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:57Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.302720 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:57Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.317661 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:57Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.329039 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:57Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.344065 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:57Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.350672 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.350708 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.350719 4739 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.350732 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.350741 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:57Z","lastTransitionTime":"2025-10-08T21:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.373756 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:57Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:57 crc kubenswrapper[4739]: E1008 21:48:57.382595 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06df6f1f-7503-4cee-a52c-383dcfb4609d\\\",\\\"systemUUID\\\":\\\"e102a271-0593-45af-a6b3-5b473c7eebd3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:57Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.394131 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.394188 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.394200 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.394215 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.394225 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:57Z","lastTransitionTime":"2025-10-08T21:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:57 crc kubenswrapper[4739]: E1008 21:48:57.433052 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06df6f1f-7503-4cee-a52c-383dcfb4609d\\\",\\\"systemUUID\\\":\\\"e102a271-0593-45af-a6b3-5b473c7eebd3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:57Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.438237 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.438267 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.438274 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.438287 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.438297 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:57Z","lastTransitionTime":"2025-10-08T21:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:57 crc kubenswrapper[4739]: E1008 21:48:57.450568 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [...] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:57Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.454006 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.454046 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.454055 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.454070 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.454080 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:57Z","lastTransitionTime":"2025-10-08T21:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:57 crc kubenswrapper[4739]: E1008 21:48:57.465762 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [...] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:57Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.468936 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.468970 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.468981 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.468997 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.469007 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:57Z","lastTransitionTime":"2025-10-08T21:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:57 crc kubenswrapper[4739]: E1008 21:48:57.480742 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06df6f1f-7503-4cee-a52c-383dcfb4609d\\\",\\\"systemUUID\\\":\\\"e102a271-0593-45af-a6b3-5b473c7eebd3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:57Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:57 crc kubenswrapper[4739]: E1008 21:48:57.480855 4739 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.482496 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.482539 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.482549 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.482563 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.482572 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:57Z","lastTransitionTime":"2025-10-08T21:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.584551 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.584585 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.584597 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.584613 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.584622 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:57Z","lastTransitionTime":"2025-10-08T21:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.686486 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.686523 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.686532 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.686545 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.686555 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:57Z","lastTransitionTime":"2025-10-08T21:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.789046 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.789109 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.789117 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.789130 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.789160 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:57Z","lastTransitionTime":"2025-10-08T21:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.891642 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.891687 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.891700 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.891718 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.891730 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:57Z","lastTransitionTime":"2025-10-08T21:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.994064 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.994102 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.994117 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.994135 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:57 crc kubenswrapper[4739]: I1008 21:48:57.994166 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:57Z","lastTransitionTime":"2025-10-08T21:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.081287 4739 generic.go:334] "Generic (PLEG): container finished" podID="e6074d7a-f433-42bf-8c80-71963ba57484" containerID="1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd" exitCode=0 Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.081378 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" event={"ID":"e6074d7a-f433-42bf-8c80-71963ba57484","Type":"ContainerDied","Data":"1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd"} Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.086540 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" event={"ID":"4c6641d9-9ccf-42aa-8a83-c52d850aa766","Type":"ContainerStarted","Data":"6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e"} Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.097890 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.097937 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.097949 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.097964 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.097984 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:58Z","lastTransitionTime":"2025-10-08T21:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.101722 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:58Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.114944 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:58Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.129797 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:58Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.142988 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d18be3
be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountP
ath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:58Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.152053 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8p5bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3d9049-8f29-4235-8d91-e565cb0d157c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3920
91423b110eec80dbbfa9c863d035369d2ae18786c7e1464220f884ef9af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nhxr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8p5bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:58Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.164205 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:58Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.175977 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:58Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.185429 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T21:48:58Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.194788 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:58Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.200446 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 
21:48:58.200478 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.200487 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.200500 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.200512 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:58Z","lastTransitionTime":"2025-10-08T21:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.209416 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:58Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.224509 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:58Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.237163 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:58Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.246593 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:58Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.260357 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:58Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.275000 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:58Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.303040 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.303082 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.303091 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.303107 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.303117 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:58Z","lastTransitionTime":"2025-10-08T21:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.405178 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.405218 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.405227 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.405243 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.405252 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:58Z","lastTransitionTime":"2025-10-08T21:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.507519 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.507581 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.507603 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.507631 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.507656 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:58Z","lastTransitionTime":"2025-10-08T21:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.610975 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.611010 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.611018 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.611034 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.611043 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:58Z","lastTransitionTime":"2025-10-08T21:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.713583 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.713630 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.713640 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.713658 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.713669 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:58Z","lastTransitionTime":"2025-10-08T21:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.816657 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.816716 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.816725 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.816741 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.816752 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:58Z","lastTransitionTime":"2025-10-08T21:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.820741 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.820794 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.820803 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:48:58 crc kubenswrapper[4739]: E1008 21:48:58.820832 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:48:58 crc kubenswrapper[4739]: E1008 21:48:58.820886 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:48:58 crc kubenswrapper[4739]: E1008 21:48:58.821062 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.919685 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.919733 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.919749 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.919771 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:58 crc kubenswrapper[4739]: I1008 21:48:58.919789 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:58Z","lastTransitionTime":"2025-10-08T21:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.023175 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.023227 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.023262 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.023290 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.023310 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:59Z","lastTransitionTime":"2025-10-08T21:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.090439 4739 generic.go:334] "Generic (PLEG): container finished" podID="e6074d7a-f433-42bf-8c80-71963ba57484" containerID="767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f" exitCode=0 Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.090476 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" event={"ID":"e6074d7a-f433-42bf-8c80-71963ba57484","Type":"ContainerDied","Data":"767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f"} Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.111821 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:59Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.125677 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.125698 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.125706 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 
21:48:59.125719 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.125728 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:59Z","lastTransitionTime":"2025-10-08T21:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.130570 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:59Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.145225 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436a
ddef6975d93ddf8a4f31ecdafe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
0-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:59Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.157187 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:59Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.167390 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8p5bp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3d9049-8f29-4235-8d91-e565cb0d157c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392091423b110eec80dbbfa9c863d035369d2ae18786c7e1464220f884ef9af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nhxr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8p5bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:59Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.179826 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:59Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.191050 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:59Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.203595 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T21:48:59Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.216114 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:59Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.228645 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 
21:48:59.228687 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.228703 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.228728 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.228746 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:59Z","lastTransitionTime":"2025-10-08T21:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.233265 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:59Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.248806 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:59Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.263736 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:59Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.274657 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:59Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.287525 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:59Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.305609 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:48:59Z is after 2025-08-24T17:21:41Z" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.330707 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.330757 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.330769 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.330786 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.330800 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:59Z","lastTransitionTime":"2025-10-08T21:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.433024 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.433224 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.433328 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.433416 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.433505 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:59Z","lastTransitionTime":"2025-10-08T21:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.535897 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.535937 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.535951 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.535969 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.535982 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:59Z","lastTransitionTime":"2025-10-08T21:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.638388 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.638432 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.638444 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.638465 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.638479 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:59Z","lastTransitionTime":"2025-10-08T21:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.741090 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.741136 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.741170 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.741194 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.741209 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:59Z","lastTransitionTime":"2025-10-08T21:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.843331 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.843377 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.843390 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.843409 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.843425 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:59Z","lastTransitionTime":"2025-10-08T21:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.945588 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.945618 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.945626 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.945639 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:48:59 crc kubenswrapper[4739]: I1008 21:48:59.945647 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:48:59Z","lastTransitionTime":"2025-10-08T21:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.046965 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk"] Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.047430 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.048163 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.048202 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.048213 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.048229 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.048242 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:00Z","lastTransitionTime":"2025-10-08T21:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.049968 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.050129 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.074463 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\
\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a78
13a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a673147
31ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.096823 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.097242 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" event={"ID":"e6074d7a-f433-42bf-8c80-71963ba57484","Type":"ContainerStarted","Data":"1f7e4c3293c3a262fc8a264ac8325f498cd4b53b50a53040e2ff57d6e6b8140b"} Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.102466 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" event={"ID":"4c6641d9-9ccf-42aa-8a83-c52d850aa766","Type":"ContainerStarted","Data":"f74d2f2292383c8f6b54b544a3b08f848745933b0a00da170c79dc3512c458e5"} Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.102836 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.112197 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436a
ddef6975d93ddf8a4f31ecdafe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
0-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.126579 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.133486 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 
21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.139601 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8p5bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3d9049-8f29-4235-8d91-e565cb0d157c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392091423b110eec80dbbfa9c863d035369d2ae18786c7e1464220f884ef9af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-5nhxr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8p5bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.150787 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.150824 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.150837 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.150856 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.150869 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:00Z","lastTransitionTime":"2025-10-08T21:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.154480 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c9869b0-41e2-4d5b-9492-3067503ae6bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7dswk\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.167450 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.181278 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.194430 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.206775 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.208390 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8xzl\" (UniqueName: 
\"kubernetes.io/projected/4c9869b0-41e2-4d5b-9492-3067503ae6bb-kube-api-access-x8xzl\") pod \"ovnkube-control-plane-749d76644c-7dswk\" (UID: \"4c9869b0-41e2-4d5b-9492-3067503ae6bb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.208475 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4c9869b0-41e2-4d5b-9492-3067503ae6bb-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7dswk\" (UID: \"4c9869b0-41e2-4d5b-9492-3067503ae6bb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.208499 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4c9869b0-41e2-4d5b-9492-3067503ae6bb-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7dswk\" (UID: \"4c9869b0-41e2-4d5b-9492-3067503ae6bb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.208877 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4c9869b0-41e2-4d5b-9492-3067503ae6bb-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7dswk\" (UID: \"4c9869b0-41e2-4d5b-9492-3067503ae6bb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.219781 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.232743 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.245726 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.253460 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.253519 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.253533 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.253555 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.253570 4739 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:00Z","lastTransitionTime":"2025-10-08T21:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.257800 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.271174 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4e
fb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.288243 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.300306 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.310059 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4c9869b0-41e2-4d5b-9492-3067503ae6bb-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7dswk\" (UID: \"4c9869b0-41e2-4d5b-9492-3067503ae6bb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.310142 4739 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-x8xzl\" (UniqueName: \"kubernetes.io/projected/4c9869b0-41e2-4d5b-9492-3067503ae6bb-kube-api-access-x8xzl\") pod \"ovnkube-control-plane-749d76644c-7dswk\" (UID: \"4c9869b0-41e2-4d5b-9492-3067503ae6bb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.310188 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4c9869b0-41e2-4d5b-9492-3067503ae6bb-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7dswk\" (UID: \"4c9869b0-41e2-4d5b-9492-3067503ae6bb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.310207 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4c9869b0-41e2-4d5b-9492-3067503ae6bb-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7dswk\" (UID: \"4c9869b0-41e2-4d5b-9492-3067503ae6bb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.311373 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4c9869b0-41e2-4d5b-9492-3067503ae6bb-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7dswk\" (UID: \"4c9869b0-41e2-4d5b-9492-3067503ae6bb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.312362 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4c9869b0-41e2-4d5b-9492-3067503ae6bb-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7dswk\" (UID: \"4c9869b0-41e2-4d5b-9492-3067503ae6bb\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.313743 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d8
2f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.316816 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4c9869b0-41e2-4d5b-9492-3067503ae6bb-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7dswk\" (UID: \"4c9869b0-41e2-4d5b-9492-3067503ae6bb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.327904 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.329679 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8xzl\" (UniqueName: \"kubernetes.io/projected/4c9869b0-41e2-4d5b-9492-3067503ae6bb-kube-api-access-x8xzl\") pod \"ovnkube-control-plane-749d76644c-7dswk\" (UID: \"4c9869b0-41e2-4d5b-9492-3067503ae6bb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.340506 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.353272 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-opera
tor@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.355923 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.355946 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.355953 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.355966 4739 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.355975 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:00Z","lastTransitionTime":"2025-10-08T21:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.361506 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.370174 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: W1008 21:49:00.377458 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c9869b0_41e2_4d5b_9492_3067503ae6bb.slice/crio-ef7384bb5fbf41e6efc656aaa3929301e7f6cd4da6da6aaf82b60459231b4da6 WatchSource:0}: Error finding container ef7384bb5fbf41e6efc656aaa3929301e7f6cd4da6da6aaf82b60459231b4da6: Status 404 returned error can't find the container with id ef7384bb5fbf41e6efc656aaa3929301e7f6cd4da6da6aaf82b60459231b4da6 Oct 08 21:49:00 crc 
kubenswrapper[4739]: I1008 21:49:00.384510 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.395256 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.411691 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.433387 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f74d2f2292383c8f6b54b544a3b08f848745933b0a00da170c79dc3512c458e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.455303 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.459678 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.459712 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.459723 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.459743 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.459759 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:00Z","lastTransitionTime":"2025-10-08T21:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.469335 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T
21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.484760 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f7e4c3293c3a262fc8a264ac8325f498cd4b53b50a53040e2ff57d6e6b8140b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33da1
14ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.499395 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.509356 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8p5bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3d9049-8f29-4235-8d91-e565cb0d157c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392091423b110eec80dbbfa9c863d035369d2ae18786c7e1464220f884ef9af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nhxr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8p5bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.521742 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c9869b0-41e2-4d5b-9492-3067503ae6bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7dswk\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:00Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.561876 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.561919 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.561931 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.561950 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.561962 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:00Z","lastTransitionTime":"2025-10-08T21:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.664440 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.664477 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.664487 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.664502 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.664512 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:00Z","lastTransitionTime":"2025-10-08T21:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.766460 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.766503 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.766516 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.766533 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.766545 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:00Z","lastTransitionTime":"2025-10-08T21:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.821184 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.821184 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.821307 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:49:00 crc kubenswrapper[4739]: E1008 21:49:00.821445 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:49:00 crc kubenswrapper[4739]: E1008 21:49:00.821535 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:49:00 crc kubenswrapper[4739]: E1008 21:49:00.821772 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.869244 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.869466 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.869589 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.869668 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.869725 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:00Z","lastTransitionTime":"2025-10-08T21:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.972071 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.972104 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.972112 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.972126 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:00 crc kubenswrapper[4739]: I1008 21:49:00.972136 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:00Z","lastTransitionTime":"2025-10-08T21:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.074181 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.074209 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.074217 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.074230 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.074238 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:01Z","lastTransitionTime":"2025-10-08T21:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.106366 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" event={"ID":"4c9869b0-41e2-4d5b-9492-3067503ae6bb","Type":"ContainerStarted","Data":"ef7384bb5fbf41e6efc656aaa3929301e7f6cd4da6da6aaf82b60459231b4da6"} Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.107086 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.106494 4739 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.140466 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.160286 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4e
fb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.177320 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.177358 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.177369 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.177384 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.177461 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:01Z","lastTransitionTime":"2025-10-08T21:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.181959 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f74d2f2292383c8f6b54b544a3b08f848745933b0a00da170c79dc3512c458e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.209636 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.224647 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.247839 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f7e4c3293c3a262fc8a264ac8325f498cd4b53b50a53040e2ff57d6e6b8140b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.262434 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168
.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.275976 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8p5bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3d9049-8f29-4235-8d91-e565cb0d157c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392091423b110eec80dbbfa9c863d035369d2ae18786c7e1464220f884ef9af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63
f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nhxr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8p5bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.279662 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.279762 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.279781 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.279824 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.279841 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:01Z","lastTransitionTime":"2025-10-08T21:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.288116 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c9869b0-41e2-4d5b-9492-3067503ae6bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7dswk\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.303007 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.315938 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.326524 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.337747 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.349116 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.359908 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.371033 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.380344 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.381762 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.381796 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.381806 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.381823 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.381831 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:01Z","lastTransitionTime":"2025-10-08T21:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.484103 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.484161 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.484170 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.484184 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.484195 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:01Z","lastTransitionTime":"2025-10-08T21:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.542327 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-kdt6j"] Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.543040 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:01 crc kubenswrapper[4739]: E1008 21:49:01.543139 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.555323 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.567939 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.582156 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.585894 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.585922 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.585930 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.585946 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.585956 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:01Z","lastTransitionTime":"2025-10-08T21:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.591451 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.601801 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.612390 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.623045 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.632418 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.646479 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.663613 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f74d2f2292383c8f6b54b544a3b08f848745933b0a00da170c79dc3512c458e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.675140 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kdt6j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8629e121-2c64-4b46-adbd-ec1433ec0835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kdt6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc 
kubenswrapper[4739]: I1008 21:49:01.688572 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.688610 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.688620 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.688637 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.688648 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:01Z","lastTransitionTime":"2025-10-08T21:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.696536 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.711016 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.722515 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8629e121-2c64-4b46-adbd-ec1433ec0835-metrics-certs\") pod \"network-metrics-daemon-kdt6j\" (UID: \"8629e121-2c64-4b46-adbd-ec1433ec0835\") " pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.722578 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgc9s\" (UniqueName: \"kubernetes.io/projected/8629e121-2c64-4b46-adbd-ec1433ec0835-kube-api-access-zgc9s\") pod \"network-metrics-daemon-kdt6j\" (UID: \"8629e121-2c64-4b46-adbd-ec1433ec0835\") " pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.723903 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f7e4c3293c3a262fc8a264ac8325f498cd4b53b50a53040e2ff57d6e6b8140b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33da1
14ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.736984 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.749000 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8p5bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3d9049-8f29-4235-8d91-e565cb0d157c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392091423b110eec80dbbfa9c863d035369d2ae18786c7e1464220f884ef9af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nhxr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8p5bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.762262 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c9869b0-41e2-4d5b-9492-3067503ae6bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7dswk\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.792098 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.792170 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.792185 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.792205 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.792221 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:01Z","lastTransitionTime":"2025-10-08T21:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.823274 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgc9s\" (UniqueName: \"kubernetes.io/projected/8629e121-2c64-4b46-adbd-ec1433ec0835-kube-api-access-zgc9s\") pod \"network-metrics-daemon-kdt6j\" (UID: \"8629e121-2c64-4b46-adbd-ec1433ec0835\") " pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.824176 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8629e121-2c64-4b46-adbd-ec1433ec0835-metrics-certs\") pod \"network-metrics-daemon-kdt6j\" (UID: \"8629e121-2c64-4b46-adbd-ec1433ec0835\") " pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:01 crc kubenswrapper[4739]: E1008 21:49:01.824317 4739 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 21:49:01 crc kubenswrapper[4739]: E1008 21:49:01.824385 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8629e121-2c64-4b46-adbd-ec1433ec0835-metrics-certs podName:8629e121-2c64-4b46-adbd-ec1433ec0835 nodeName:}" failed. No retries permitted until 2025-10-08 21:49:02.324366982 +0000 UTC m=+42.149752732 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8629e121-2c64-4b46-adbd-ec1433ec0835-metrics-certs") pod "network-metrics-daemon-kdt6j" (UID: "8629e121-2c64-4b46-adbd-ec1433ec0835") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.832745 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay
.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.842479 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgc9s\" (UniqueName: \"kubernetes.io/projected/8629e121-2c64-4b46-adbd-ec1433ec0835-kube-api-access-zgc9s\") pod \"network-metrics-daemon-kdt6j\" (UID: \"8629e121-2c64-4b46-adbd-ec1433ec0835\") " pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.851835 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f74d2f2292383c8f6b54b544a3b08f848745933b0a00da170c79dc3512c458e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.862338 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kdt6j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8629e121-2c64-4b46-adbd-ec1433ec0835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kdt6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc 
kubenswrapper[4739]: I1008 21:49:01.871561 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8p5bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3d9049-8f29-4235-8d91-e565cb0d157c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392091423b110eec80dbbfa9c863d035369d2ae18786c7e1464220f884ef9af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5
nhxr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8p5bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.880837 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c9869b0-41e2-4d5b-9492-3067503ae6bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7dswk\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.894496 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.894535 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.894548 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.894565 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.894581 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:01Z","lastTransitionTime":"2025-10-08T21:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.897384 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.909444 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.923121 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f7e4c3293c3a262fc8a264ac8325f498cd4b53b50a53040e2ff57d6e6b8140b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.934819 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168
.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.945571 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.955836 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa
745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.968265 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.983489 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.994310 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:01Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.997486 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.997514 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.997532 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.997548 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:01 crc kubenswrapper[4739]: I1008 21:49:01.997560 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:01Z","lastTransitionTime":"2025-10-08T21:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.008258 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:02Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.020563 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:02Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.033293 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:02Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.099700 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.099827 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.099889 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.099954 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.100087 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:02Z","lastTransitionTime":"2025-10-08T21:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.110074 4739 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.110928 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" event={"ID":"4c9869b0-41e2-4d5b-9492-3067503ae6bb","Type":"ContainerStarted","Data":"e33c64c86690073c971b824a50e11615dc8efbe2426bfa418579302f876f03f7"} Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.202255 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.202304 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.202317 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.202333 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.202346 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:02Z","lastTransitionTime":"2025-10-08T21:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.304859 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.304911 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.304925 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.304947 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.304960 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:02Z","lastTransitionTime":"2025-10-08T21:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.332565 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8629e121-2c64-4b46-adbd-ec1433ec0835-metrics-certs\") pod \"network-metrics-daemon-kdt6j\" (UID: \"8629e121-2c64-4b46-adbd-ec1433ec0835\") " pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:02 crc kubenswrapper[4739]: E1008 21:49:02.332734 4739 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 21:49:02 crc kubenswrapper[4739]: E1008 21:49:02.332798 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8629e121-2c64-4b46-adbd-ec1433ec0835-metrics-certs podName:8629e121-2c64-4b46-adbd-ec1433ec0835 nodeName:}" failed. No retries permitted until 2025-10-08 21:49:03.332781056 +0000 UTC m=+43.158166816 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8629e121-2c64-4b46-adbd-ec1433ec0835-metrics-certs") pod "network-metrics-daemon-kdt6j" (UID: "8629e121-2c64-4b46-adbd-ec1433ec0835") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.407215 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.407262 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.407273 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.407293 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.407305 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:02Z","lastTransitionTime":"2025-10-08T21:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.510544 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.510632 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.510642 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.510656 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.510665 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:02Z","lastTransitionTime":"2025-10-08T21:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.612884 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.612934 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.612945 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.612958 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.612967 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:02Z","lastTransitionTime":"2025-10-08T21:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.715247 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.715294 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.715303 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.715318 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.715327 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:02Z","lastTransitionTime":"2025-10-08T21:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.818773 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.818815 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.818827 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.818841 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.818853 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:02Z","lastTransitionTime":"2025-10-08T21:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.821321 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.821355 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.821418 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:49:02 crc kubenswrapper[4739]: E1008 21:49:02.821584 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:49:02 crc kubenswrapper[4739]: E1008 21:49:02.821675 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:49:02 crc kubenswrapper[4739]: E1008 21:49:02.821778 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.920917 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.920959 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.920968 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.920984 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:02 crc kubenswrapper[4739]: I1008 21:49:02.920995 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:02Z","lastTransitionTime":"2025-10-08T21:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.023184 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.023233 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.023246 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.023264 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.023276 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:03Z","lastTransitionTime":"2025-10-08T21:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.051032 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.115112 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" event={"ID":"4c9869b0-41e2-4d5b-9492-3067503ae6bb","Type":"ContainerStarted","Data":"70bc84dd29fc85f8f711e45387600b4be0c89514fa5f2bdf14e0773a55db3330"} Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.125207 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.125253 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.125266 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.125282 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.125294 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:03Z","lastTransitionTime":"2025-10-08T21:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.141338 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f74d2f2292383c8f6b54b544a3b08f848745933b0a00da170c79dc3512c458e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:03Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.155595 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kdt6j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8629e121-2c64-4b46-adbd-ec1433ec0835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kdt6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:03Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:03 crc 
kubenswrapper[4739]: I1008 21:49:03.170569 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fec
aa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:03Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.182420 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:03Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.199738 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f7e4c3293c3a262fc8a264ac8325f498cd4b53b50a53040e2ff57d6e6b8140b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:03Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.210943 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168
.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:03Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.221608 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8p5bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3d9049-8f29-4235-8d91-e565cb0d157c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392091423b110eec80dbbfa9c863d035369d2ae18786c7e1464220f884ef9af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63
f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nhxr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8p5bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:03Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.227842 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.227897 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.227914 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.227937 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.227954 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:03Z","lastTransitionTime":"2025-10-08T21:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.235631 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c9869b0-41e2-4d5b-9492-3067503ae6bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c64c86690073c971b824a50e11615dc8efbe2426bfa418579302f876f03f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bc84dd29fc85f8f711e45387600b4be0c89514fa5f2bdf14e0773a55db3330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7dswk\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:03Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.253995 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:03Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.265438 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:03Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.276736 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:03Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.287681 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T21:49:03Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.298119 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:03Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.308977 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:03Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.319741 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:03Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.330037 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:03Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.330863 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.330939 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.330962 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.330992 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.331019 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:03Z","lastTransitionTime":"2025-10-08T21:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.342383 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8629e121-2c64-4b46-adbd-ec1433ec0835-metrics-certs\") pod \"network-metrics-daemon-kdt6j\" (UID: \"8629e121-2c64-4b46-adbd-ec1433ec0835\") " pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:03 crc kubenswrapper[4739]: E1008 21:49:03.342487 4739 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 21:49:03 crc kubenswrapper[4739]: E1008 21:49:03.342539 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8629e121-2c64-4b46-adbd-ec1433ec0835-metrics-certs podName:8629e121-2c64-4b46-adbd-ec1433ec0835 nodeName:}" failed. No retries permitted until 2025-10-08 21:49:05.34252585 +0000 UTC m=+45.167911600 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8629e121-2c64-4b46-adbd-ec1433ec0835-metrics-certs") pod "network-metrics-daemon-kdt6j" (UID: "8629e121-2c64-4b46-adbd-ec1433ec0835") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.343524 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"im
age\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:03Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.433444 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.433473 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.433481 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 
21:49:03.433494 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.433502 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:03Z","lastTransitionTime":"2025-10-08T21:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.536480 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.536562 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.536587 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.536619 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.536641 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:03Z","lastTransitionTime":"2025-10-08T21:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.639474 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.639531 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.639548 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.639571 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.639590 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:03Z","lastTransitionTime":"2025-10-08T21:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.742192 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.742223 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.742231 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.742244 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.742253 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:03Z","lastTransitionTime":"2025-10-08T21:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.821213 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:03 crc kubenswrapper[4739]: E1008 21:49:03.821360 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.845819 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.845857 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.845868 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.845885 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.845898 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:03Z","lastTransitionTime":"2025-10-08T21:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.948131 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.948194 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.948205 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.948222 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:03 crc kubenswrapper[4739]: I1008 21:49:03.948235 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:03Z","lastTransitionTime":"2025-10-08T21:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.050404 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.050460 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.050478 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.050497 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.050515 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:04Z","lastTransitionTime":"2025-10-08T21:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.138331 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="ovnkube-controller" probeResult="failure" output="" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.153114 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.153377 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.153469 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.153556 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.153659 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:04Z","lastTransitionTime":"2025-10-08T21:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.255891 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.255927 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.255935 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.255950 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.255959 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:04Z","lastTransitionTime":"2025-10-08T21:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.358547 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.358860 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.358871 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.358889 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.358900 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:04Z","lastTransitionTime":"2025-10-08T21:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.460717 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.460762 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.460777 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.460798 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.460856 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:04Z","lastTransitionTime":"2025-10-08T21:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.562836 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.562879 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.562892 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.562910 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.562924 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:04Z","lastTransitionTime":"2025-10-08T21:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.669283 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.669337 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.669351 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.669370 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.669383 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:04Z","lastTransitionTime":"2025-10-08T21:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.771947 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.771980 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.771990 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.772004 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.772013 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:04Z","lastTransitionTime":"2025-10-08T21:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.821601 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.821692 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.821712 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:04 crc kubenswrapper[4739]: E1008 21:49:04.821836 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:49:04 crc kubenswrapper[4739]: E1008 21:49:04.821932 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:49:04 crc kubenswrapper[4739]: E1008 21:49:04.822104 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.874962 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.875009 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.875021 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.875036 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.875047 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:04Z","lastTransitionTime":"2025-10-08T21:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.977708 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.977785 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.977802 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.977826 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:04 crc kubenswrapper[4739]: I1008 21:49:04.977844 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:04Z","lastTransitionTime":"2025-10-08T21:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.080313 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.080368 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.080384 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.080408 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.080427 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:05Z","lastTransitionTime":"2025-10-08T21:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.123942 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hfhrc_4c6641d9-9ccf-42aa-8a83-c52d850aa766/ovnkube-controller/0.log" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.127287 4739 generic.go:334] "Generic (PLEG): container finished" podID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerID="f74d2f2292383c8f6b54b544a3b08f848745933b0a00da170c79dc3512c458e5" exitCode=1 Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.127330 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" event={"ID":"4c6641d9-9ccf-42aa-8a83-c52d850aa766","Type":"ContainerDied","Data":"f74d2f2292383c8f6b54b544a3b08f848745933b0a00da170c79dc3512c458e5"} Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.128170 4739 scope.go:117] "RemoveContainer" containerID="f74d2f2292383c8f6b54b544a3b08f848745933b0a00da170c79dc3512c458e5" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.143563 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:05Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.162184 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:05Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.177356 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T21:49:05Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.182664 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.182727 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.182746 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.182772 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.182789 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:05Z","lastTransitionTime":"2025-10-08T21:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.190508 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:05Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.207278 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:05Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.221741 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:05Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.235215 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:05Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.254013 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:05Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.284291 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f74d2f2292383c8f6b54b544a3b08f848745933b0a00da170c79dc3512c458e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f74d2f2292383c8f6b54b544a3b08f848745933b0a00da170c79dc3512c458e5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T21:49:04Z\\\",\\\"message\\\":\\\" 6063 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1008 21:49:04.462128 6063 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 21:49:04.462203 6063 handler.go:190] Sending *v1.Node event 
handler 2 for removal\\\\nI1008 21:49:04.462214 6063 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1008 21:49:04.462229 6063 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 21:49:04.462244 6063 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1008 21:49:04.462282 6063 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1008 21:49:04.462293 6063 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1008 21:49:04.462309 6063 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 21:49:04.462326 6063 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1008 21:49:04.462330 6063 factory.go:656] Stopping watch factory\\\\nI1008 21:49:04.462334 6063 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1008 21:49:04.462350 6063 ovnkube.go:599] Stopped ovnkube\\\\nI1008 21:49:04.462348 6063 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1008 21:49:04.462378 6063 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1008 21:49:04.462398 6063 metrics.go:553] Stopping metrics server at address 
\\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c
5e42b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:05Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.285089 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.285110 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.285119 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.285134 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.285159 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:05Z","lastTransitionTime":"2025-10-08T21:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.300473 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kdt6j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8629e121-2c64-4b46-adbd-ec1433ec0835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kdt6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:05Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:05 crc 
kubenswrapper[4739]: I1008 21:49:05.316888 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fec
aa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:05Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.331871 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:05Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.347431 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f7e4c3293c3a262fc8a264ac8325f498cd4b53b50a53040e2ff57d6e6b8140b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:05Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.358350 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168
.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:05Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.362559 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8629e121-2c64-4b46-adbd-ec1433ec0835-metrics-certs\") pod \"network-metrics-daemon-kdt6j\" (UID: \"8629e121-2c64-4b46-adbd-ec1433ec0835\") " pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:05 crc kubenswrapper[4739]: E1008 21:49:05.362786 4739 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 21:49:05 crc kubenswrapper[4739]: E1008 21:49:05.362885 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8629e121-2c64-4b46-adbd-ec1433ec0835-metrics-certs podName:8629e121-2c64-4b46-adbd-ec1433ec0835 nodeName:}" failed. No retries permitted until 2025-10-08 21:49:09.362855289 +0000 UTC m=+49.188241099 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8629e121-2c64-4b46-adbd-ec1433ec0835-metrics-certs") pod "network-metrics-daemon-kdt6j" (UID: "8629e121-2c64-4b46-adbd-ec1433ec0835") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.367413 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8p5bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3d9049-8f29-4235-8d91-e565cb0d157c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392091423b110eec80dbbfa9c863d035369d2ae18786c7e1464220f884ef9af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nhxr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8p5bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:05Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.379381 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c9869b0-41e2-4d5b-9492-3067503ae6bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c64c86690073c971b824a50e11615dc8efbe2426bfa418579302f876f03f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bc84dd29fc85f8f711e45387600b4be0c89
514fa5f2bdf14e0773a55db3330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7dswk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:05Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.387793 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.387829 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.387837 4739 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.387852 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.387861 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:05Z","lastTransitionTime":"2025-10-08T21:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.398175 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e
33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866
be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stati
c-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b6
8244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:05Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.490491 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.490534 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.490542 4739 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.490557 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.490567 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:05Z","lastTransitionTime":"2025-10-08T21:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.592401 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.592468 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.592480 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.592498 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.592509 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:05Z","lastTransitionTime":"2025-10-08T21:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.695412 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.695483 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.695496 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.695513 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.695525 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:05Z","lastTransitionTime":"2025-10-08T21:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.798543 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.798583 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.798594 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.798612 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.798623 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:05Z","lastTransitionTime":"2025-10-08T21:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.821289 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:05 crc kubenswrapper[4739]: E1008 21:49:05.821477 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.901017 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.901283 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.901294 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.901307 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:05 crc kubenswrapper[4739]: I1008 21:49:05.901317 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:05Z","lastTransitionTime":"2025-10-08T21:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.003281 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.003492 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.003594 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.003674 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.003754 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:06Z","lastTransitionTime":"2025-10-08T21:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.105890 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.106253 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.106376 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.106466 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.106544 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:06Z","lastTransitionTime":"2025-10-08T21:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.130927 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hfhrc_4c6641d9-9ccf-42aa-8a83-c52d850aa766/ovnkube-controller/1.log" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.131409 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hfhrc_4c6641d9-9ccf-42aa-8a83-c52d850aa766/ovnkube-controller/0.log" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.133499 4739 generic.go:334] "Generic (PLEG): container finished" podID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerID="3851f913e0995f723e741b3231cdc6c9093878fb3d47a03924b20a383bb3149d" exitCode=1 Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.133575 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" event={"ID":"4c6641d9-9ccf-42aa-8a83-c52d850aa766","Type":"ContainerDied","Data":"3851f913e0995f723e741b3231cdc6c9093878fb3d47a03924b20a383bb3149d"} Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.133618 4739 scope.go:117] "RemoveContainer" containerID="f74d2f2292383c8f6b54b544a3b08f848745933b0a00da170c79dc3512c458e5" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.134545 4739 scope.go:117] "RemoveContainer" containerID="3851f913e0995f723e741b3231cdc6c9093878fb3d47a03924b20a383bb3149d" Oct 08 21:49:06 crc kubenswrapper[4739]: E1008 21:49:06.134674 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-hfhrc_openshift-ovn-kubernetes(4c6641d9-9ccf-42aa-8a83-c52d850aa766)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.148107 4739 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e
8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"
state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:06Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.173538 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3851f913e0995f723e741b3231cdc6c9093878fb3d47a03924b20a383bb3149d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f74d2f2292383c8f6b54b544a3b08f848745933b0a00da170c79dc3512c458e5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T21:49:04Z\\\",\\\"message\\\":\\\" 6063 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1008 21:49:04.462128 6063 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 21:49:04.462203 6063 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1008 21:49:04.462214 6063 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1008 21:49:04.462229 6063 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 21:49:04.462244 6063 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1008 21:49:04.462282 6063 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1008 21:49:04.462293 6063 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1008 21:49:04.462309 6063 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 21:49:04.462326 6063 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1008 21:49:04.462330 6063 factory.go:656] Stopping watch factory\\\\nI1008 21:49:04.462334 6063 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1008 21:49:04.462350 6063 ovnkube.go:599] Stopped ovnkube\\\\nI1008 21:49:04.462348 6063 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1008 21:49:04.462378 6063 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1008 21:49:04.462398 6063 metrics.go:553] Stopping metrics server at address \\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3851f913e0995f723e741b3231cdc6c9093878fb3d47a03924b20a383bb3149d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T21:49:06Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 21:49:06.001469 6249 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 21:49:06.001731 6249 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1008 21:49:06.001759 6249 
handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1008 21:49:06.001764 6249 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1008 21:49:06.001794 6249 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1008 21:49:06.001802 6249 handler.go:208] Removed *v1.Node event handler 2\\\\nI1008 21:49:06.001811 6249 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1008 21:49:06.001822 6249 handler.go:208] Removed *v1.Node event handler 7\\\\nI1008 21:49:06.001922 6249 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 21:49:06.002226 6249 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1008 21:49:06.002247 6249 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1008 21:49:06.002306 6249 factory.go:656] Stopping watch factory\\\\nI1008 21:49:06.002329 6249 ovnkube.go:599] Stopped ovnkube\\\\nI1008 21:49:06.002379 6249 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 21:49:06.002397 6249 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1008 
21\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:06Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.184895 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kdt6j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8629e121-2c64-4b46-adbd-ec1433ec0835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kdt6j\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:06Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.209335 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.209366 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.209374 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.209391 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.209400 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:06Z","lastTransitionTime":"2025-10-08T21:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.214421 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:06Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.226708 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:06Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.242316 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f7e4c3293c3a262fc8a264ac8325f498cd4b53b50a53040e2ff57d6e6b8140b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:06Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.255296 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168
.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:06Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.265514 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8p5bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3d9049-8f29-4235-8d91-e565cb0d157c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392091423b110eec80dbbfa9c863d035369d2ae18786c7e1464220f884ef9af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63
f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nhxr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8p5bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:06Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.276757 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c9869b0-41e2-4d5b-9492-3067503ae6bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c64c86690073c971b824a50e11615dc8efbe2426bfa418579302f876f03f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bc84dd29fc85f8f711e45387600b4be0c89
514fa5f2bdf14e0773a55db3330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7dswk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:06Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.289454 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:06Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.303088 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:06Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.312163 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.312188 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.312198 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.312212 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.312223 4739 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:06Z","lastTransitionTime":"2025-10-08T21:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.315869 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:06Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.328435 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:06Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.342959 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:06Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.356906 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:06Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.371821 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:06Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.382656 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:06Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.414503 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.414539 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.414551 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.414566 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.414577 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:06Z","lastTransitionTime":"2025-10-08T21:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.517134 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.517382 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.517508 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.517594 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.517670 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:06Z","lastTransitionTime":"2025-10-08T21:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.619755 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.619839 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.619855 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.619880 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.619897 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:06Z","lastTransitionTime":"2025-10-08T21:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.722733 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.722789 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.722804 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.722827 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.722843 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:06Z","lastTransitionTime":"2025-10-08T21:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.821257 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.821292 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.821272 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:49:06 crc kubenswrapper[4739]: E1008 21:49:06.821399 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:49:06 crc kubenswrapper[4739]: E1008 21:49:06.821509 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:49:06 crc kubenswrapper[4739]: E1008 21:49:06.821651 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.825631 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.825672 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.825685 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.825702 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.825715 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:06Z","lastTransitionTime":"2025-10-08T21:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.928451 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.928485 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.928495 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.928508 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:06 crc kubenswrapper[4739]: I1008 21:49:06.928517 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:06Z","lastTransitionTime":"2025-10-08T21:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.030505 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.030542 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.030552 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.030569 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.030580 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:07Z","lastTransitionTime":"2025-10-08T21:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.133750 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.134263 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.134277 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.134311 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.134324 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:07Z","lastTransitionTime":"2025-10-08T21:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.138561 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hfhrc_4c6641d9-9ccf-42aa-8a83-c52d850aa766/ovnkube-controller/1.log" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.237801 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.238762 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.238948 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.239130 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.239411 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:07Z","lastTransitionTime":"2025-10-08T21:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.342240 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.342654 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.343025 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.343421 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.347101 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:07Z","lastTransitionTime":"2025-10-08T21:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.450212 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.450265 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.450281 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.450301 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.450317 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:07Z","lastTransitionTime":"2025-10-08T21:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.535213 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.535261 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.535269 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.535282 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.535290 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:07Z","lastTransitionTime":"2025-10-08T21:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:07 crc kubenswrapper[4739]: E1008 21:49:07.549037 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06df6f1f-7503-4cee-a52c-383dcfb4609d\\\",\\\"systemUUID\\\":\\\"e102a271-0593-45af-a6b3-5b473c7eebd3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:07Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.552128 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.552166 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.552176 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.552188 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.552198 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:07Z","lastTransitionTime":"2025-10-08T21:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:07 crc kubenswrapper[4739]: E1008 21:49:07.565023 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06df6f1f-7503-4cee-a52c-383dcfb4609d\\\",\\\"systemUUID\\\":\\\"e102a271-0593-45af-a6b3-5b473c7eebd3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:07Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.569424 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.569535 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.569617 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.569713 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.569806 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:07Z","lastTransitionTime":"2025-10-08T21:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:07 crc kubenswrapper[4739]: E1008 21:49:07.583558 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06df6f1f-7503-4cee-a52c-383dcfb4609d\\\",\\\"systemUUID\\\":\\\"e102a271-0593-45af-a6b3-5b473c7eebd3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:07Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.587436 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.587486 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.587513 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.587535 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.587551 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:07Z","lastTransitionTime":"2025-10-08T21:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:07 crc kubenswrapper[4739]: E1008 21:49:07.602411 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06df6f1f-7503-4cee-a52c-383dcfb4609d\\\",\\\"systemUUID\\\":\\\"e102a271-0593-45af-a6b3-5b473c7eebd3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:07Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.606173 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.606201 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.606212 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.606224 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.606234 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:07Z","lastTransitionTime":"2025-10-08T21:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:07 crc kubenswrapper[4739]: E1008 21:49:07.619846 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06df6f1f-7503-4cee-a52c-383dcfb4609d\\\",\\\"systemUUID\\\":\\\"e102a271-0593-45af-a6b3-5b473c7eebd3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:07Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:07 crc kubenswrapper[4739]: E1008 21:49:07.619956 4739 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.621254 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.621374 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.621496 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.621626 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.621723 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:07Z","lastTransitionTime":"2025-10-08T21:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.725082 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.725190 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.725209 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.725234 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.725250 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:07Z","lastTransitionTime":"2025-10-08T21:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.821057 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:07 crc kubenswrapper[4739]: E1008 21:49:07.821840 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.828887 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.828919 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.828929 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.828942 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.828955 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:07Z","lastTransitionTime":"2025-10-08T21:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.931778 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.931871 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.931890 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.931920 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:07 crc kubenswrapper[4739]: I1008 21:49:07.931939 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:07Z","lastTransitionTime":"2025-10-08T21:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.034938 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.035015 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.035040 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.035070 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.035094 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:08Z","lastTransitionTime":"2025-10-08T21:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.137894 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.137947 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.137960 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.137979 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.137995 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:08Z","lastTransitionTime":"2025-10-08T21:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.240492 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.240534 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.240546 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.240567 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.240581 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:08Z","lastTransitionTime":"2025-10-08T21:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.343282 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.343335 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.343347 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.343365 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.343378 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:08Z","lastTransitionTime":"2025-10-08T21:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.445902 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.445949 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.445971 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.445991 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.446005 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:08Z","lastTransitionTime":"2025-10-08T21:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.548377 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.548449 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.548475 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.548520 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.548543 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:08Z","lastTransitionTime":"2025-10-08T21:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.650701 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.650779 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.650789 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.650804 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.650813 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:08Z","lastTransitionTime":"2025-10-08T21:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.753341 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.753406 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.753424 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.753451 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.753469 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:08Z","lastTransitionTime":"2025-10-08T21:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.821212 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.821241 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.821291 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:49:08 crc kubenswrapper[4739]: E1008 21:49:08.821379 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:49:08 crc kubenswrapper[4739]: E1008 21:49:08.821567 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:49:08 crc kubenswrapper[4739]: E1008 21:49:08.821784 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.855136 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.855197 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.855209 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.855223 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.855236 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:08Z","lastTransitionTime":"2025-10-08T21:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.956909 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.956972 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.956988 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.957012 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:08 crc kubenswrapper[4739]: I1008 21:49:08.957030 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:08Z","lastTransitionTime":"2025-10-08T21:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.060108 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.060193 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.060210 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.060233 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.060251 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:09Z","lastTransitionTime":"2025-10-08T21:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.162683 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.162753 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.162779 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.162809 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.162834 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:09Z","lastTransitionTime":"2025-10-08T21:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.266567 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.266625 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.266644 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.266673 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.266695 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:09Z","lastTransitionTime":"2025-10-08T21:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.369759 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.369831 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.369854 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.369884 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.369903 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:09Z","lastTransitionTime":"2025-10-08T21:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.404833 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8629e121-2c64-4b46-adbd-ec1433ec0835-metrics-certs\") pod \"network-metrics-daemon-kdt6j\" (UID: \"8629e121-2c64-4b46-adbd-ec1433ec0835\") " pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:09 crc kubenswrapper[4739]: E1008 21:49:09.405001 4739 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 21:49:09 crc kubenswrapper[4739]: E1008 21:49:09.405101 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8629e121-2c64-4b46-adbd-ec1433ec0835-metrics-certs podName:8629e121-2c64-4b46-adbd-ec1433ec0835 nodeName:}" failed. No retries permitted until 2025-10-08 21:49:17.405068964 +0000 UTC m=+57.230454754 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8629e121-2c64-4b46-adbd-ec1433ec0835-metrics-certs") pod "network-metrics-daemon-kdt6j" (UID: "8629e121-2c64-4b46-adbd-ec1433ec0835") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.473007 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.473072 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.473110 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.473191 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.473222 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:09Z","lastTransitionTime":"2025-10-08T21:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.575841 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.575905 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.575923 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.575948 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.575965 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:09Z","lastTransitionTime":"2025-10-08T21:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.679088 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.679129 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.679137 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.679183 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.679200 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:09Z","lastTransitionTime":"2025-10-08T21:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.785948 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.786013 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.786033 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.786057 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.786076 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:09Z","lastTransitionTime":"2025-10-08T21:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.821076 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:09 crc kubenswrapper[4739]: E1008 21:49:09.821246 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.892557 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.892614 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.892628 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.892647 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.892660 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:09Z","lastTransitionTime":"2025-10-08T21:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.995507 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.995555 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.995567 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.995583 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:09 crc kubenswrapper[4739]: I1008 21:49:09.995596 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:09Z","lastTransitionTime":"2025-10-08T21:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.097566 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.097624 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.097643 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.097666 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.097686 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:10Z","lastTransitionTime":"2025-10-08T21:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.200337 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.200420 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.200451 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.200480 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.200515 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:10Z","lastTransitionTime":"2025-10-08T21:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.302918 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.302972 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.302986 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.303006 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.303021 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:10Z","lastTransitionTime":"2025-10-08T21:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.405375 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.405410 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.405423 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.405438 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.405449 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:10Z","lastTransitionTime":"2025-10-08T21:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.508956 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.509010 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.509026 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.509048 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.509066 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:10Z","lastTransitionTime":"2025-10-08T21:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.611366 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.611417 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.611428 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.611447 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.611462 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:10Z","lastTransitionTime":"2025-10-08T21:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.714555 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.714606 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.714616 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.714631 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.714643 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:10Z","lastTransitionTime":"2025-10-08T21:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.817593 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.817650 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.817676 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.817699 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.817716 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:10Z","lastTransitionTime":"2025-10-08T21:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.821089 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.821124 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.821093 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:49:10 crc kubenswrapper[4739]: E1008 21:49:10.821338 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:49:10 crc kubenswrapper[4739]: E1008 21:49:10.821395 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:49:10 crc kubenswrapper[4739]: E1008 21:49:10.821496 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.920140 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.920290 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.920308 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.920333 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:10 crc kubenswrapper[4739]: I1008 21:49:10.920356 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:10Z","lastTransitionTime":"2025-10-08T21:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.022621 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.022654 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.022662 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.022675 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.022683 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:11Z","lastTransitionTime":"2025-10-08T21:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.125604 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.125661 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.125677 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.125697 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.125713 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:11Z","lastTransitionTime":"2025-10-08T21:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.228110 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.228209 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.228232 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.228265 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.228290 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:11Z","lastTransitionTime":"2025-10-08T21:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.330838 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.330909 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.330931 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.330958 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.330978 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:11Z","lastTransitionTime":"2025-10-08T21:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.433654 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.433754 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.433770 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.433793 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.433811 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:11Z","lastTransitionTime":"2025-10-08T21:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.536681 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.536717 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.536727 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.536741 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.536750 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:11Z","lastTransitionTime":"2025-10-08T21:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.639371 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.639435 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.639453 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.639475 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.639492 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:11Z","lastTransitionTime":"2025-10-08T21:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.742176 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.742252 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.742268 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.742291 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.742311 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:11Z","lastTransitionTime":"2025-10-08T21:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.821720 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:11 crc kubenswrapper[4739]: E1008 21:49:11.821914 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.837138 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:11Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.844646 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.844744 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.844763 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.844786 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.844805 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:11Z","lastTransitionTime":"2025-10-08T21:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.855844 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:11Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.870087 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T21:49:11Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.885739 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:11Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.903008 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:11Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.923263 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:11Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.944674 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:11Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.948002 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.948058 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.948083 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.948115 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.948138 4739 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:11Z","lastTransitionTime":"2025-10-08T21:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.961867 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:11Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:11 crc kubenswrapper[4739]: I1008 21:49:11.986632 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4e
fb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:11Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.011828 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3851f913e0995f723e741b3231cdc6c9093878fb3d47a03924b20a383bb3149d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f74d2f2292383c8f6b54b544a3b08f848745933b0a00da170c79dc3512c458e5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T21:49:04Z\\\",\\\"message\\\":\\\" 6063 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1008 21:49:04.462128 6063 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 21:49:04.462203 6063 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1008 21:49:04.462214 6063 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1008 21:49:04.462229 6063 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 21:49:04.462244 6063 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1008 21:49:04.462282 6063 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1008 21:49:04.462293 6063 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1008 21:49:04.462309 6063 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 21:49:04.462326 6063 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1008 21:49:04.462330 6063 factory.go:656] Stopping watch factory\\\\nI1008 21:49:04.462334 6063 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1008 21:49:04.462350 6063 ovnkube.go:599] Stopped ovnkube\\\\nI1008 21:49:04.462348 6063 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1008 21:49:04.462378 6063 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1008 21:49:04.462398 6063 metrics.go:553] Stopping metrics server at address \\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3851f913e0995f723e741b3231cdc6c9093878fb3d47a03924b20a383bb3149d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T21:49:06Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 21:49:06.001469 6249 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 21:49:06.001731 6249 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1008 21:49:06.001759 6249 
handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1008 21:49:06.001764 6249 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1008 21:49:06.001794 6249 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1008 21:49:06.001802 6249 handler.go:208] Removed *v1.Node event handler 2\\\\nI1008 21:49:06.001811 6249 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1008 21:49:06.001822 6249 handler.go:208] Removed *v1.Node event handler 7\\\\nI1008 21:49:06.001922 6249 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 21:49:06.002226 6249 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1008 21:49:06.002247 6249 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1008 21:49:06.002306 6249 factory.go:656] Stopping watch factory\\\\nI1008 21:49:06.002329 6249 ovnkube.go:599] Stopped ovnkube\\\\nI1008 21:49:06.002379 6249 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 21:49:06.002397 6249 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1008 
21\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:12Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.030540 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kdt6j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8629e121-2c64-4b46-adbd-ec1433ec0835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kdt6j\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:12Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.050047 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.050088 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.050101 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.050117 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.050130 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:12Z","lastTransitionTime":"2025-10-08T21:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.063137 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:12Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.081719 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:12Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.100134 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f7e4c3293c3a262fc8a264ac8325f498cd4b53b50a53040e2ff57d6e6b8140b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:12Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.115522 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168
.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:12Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.129474 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8p5bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3d9049-8f29-4235-8d91-e565cb0d157c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392091423b110eec80dbbfa9c863d035369d2ae18786c7e1464220f884ef9af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63
f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nhxr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8p5bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:12Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.144563 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c9869b0-41e2-4d5b-9492-3067503ae6bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c64c86690073c971b824a50e11615dc8efbe2426bfa418579302f876f03f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bc84dd29fc85f8f711e45387600b4be0c89
514fa5f2bdf14e0773a55db3330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7dswk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:12Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.152034 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.152063 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.152071 4739 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.152084 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.152093 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:12Z","lastTransitionTime":"2025-10-08T21:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.254485 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.254518 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.254526 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.254541 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.254554 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:12Z","lastTransitionTime":"2025-10-08T21:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.357116 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.357214 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.357232 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.357256 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.357273 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:12Z","lastTransitionTime":"2025-10-08T21:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.459466 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.459521 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.459538 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.459561 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.459578 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:12Z","lastTransitionTime":"2025-10-08T21:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.562025 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.562068 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.562080 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.562095 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.562106 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:12Z","lastTransitionTime":"2025-10-08T21:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.646555 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.646670 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.646715 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:12 crc kubenswrapper[4739]: E1008 21:49:12.646793 4739 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 21:49:12 crc kubenswrapper[4739]: E1008 21:49:12.646846 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 21:49:44.646830639 +0000 UTC m=+84.472216389 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 21:49:12 crc kubenswrapper[4739]: E1008 21:49:12.646894 4739 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 21:49:12 crc kubenswrapper[4739]: E1008 21:49:12.646935 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 21:49:44.646924792 +0000 UTC m=+84.472310542 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 21:49:12 crc kubenswrapper[4739]: E1008 21:49:12.647016 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:49:44.647003904 +0000 UTC m=+84.472389654 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.666983 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.667022 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.667033 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.667048 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.667060 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:12Z","lastTransitionTime":"2025-10-08T21:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.747947 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.747994 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 08 21:49:12 crc kubenswrapper[4739]: E1008 21:49:12.748095 4739 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 08 21:49:12 crc kubenswrapper[4739]: E1008 21:49:12.748112 4739 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 08 21:49:12 crc kubenswrapper[4739]: E1008 21:49:12.748123 4739 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 08 21:49:12 crc kubenswrapper[4739]: E1008 21:49:12.748193 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 21:49:44.748180671 +0000 UTC m=+84.573566421 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 08 21:49:12 crc kubenswrapper[4739]: E1008 21:49:12.748216 4739 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 08 21:49:12 crc kubenswrapper[4739]: E1008 21:49:12.748268 4739 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 08 21:49:12 crc kubenswrapper[4739]: E1008 21:49:12.748291 4739 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 08 21:49:12 crc kubenswrapper[4739]: E1008 21:49:12.748368 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 21:49:44.748341205 +0000 UTC m=+84.573726995 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.769370 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.769415 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.769424 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.769438 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.769447 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:12Z","lastTransitionTime":"2025-10-08T21:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.821632 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.821689 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.821693 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 08 21:49:12 crc kubenswrapper[4739]: E1008 21:49:12.821763 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 08 21:49:12 crc kubenswrapper[4739]: E1008 21:49:12.821952 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 08 21:49:12 crc kubenswrapper[4739]: E1008 21:49:12.822138 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.871734 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.871780 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.871791 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.871808 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.871819 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:12Z","lastTransitionTime":"2025-10-08T21:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.974772 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.974878 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.974899 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.974922 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 21:49:12 crc kubenswrapper[4739]: I1008 21:49:12.974939 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:12Z","lastTransitionTime":"2025-10-08T21:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.077735 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.077910 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.077940 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.077973 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.077996 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:13Z","lastTransitionTime":"2025-10-08T21:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.180715 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.180824 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.180843 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.180921 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.180946 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:13Z","lastTransitionTime":"2025-10-08T21:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.283032 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.283160 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.283182 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.283207 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.283225 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:13Z","lastTransitionTime":"2025-10-08T21:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.386632 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.386694 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.386711 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.386734 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.386753 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:13Z","lastTransitionTime":"2025-10-08T21:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.489314 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.489361 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.489374 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.489393 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.489405 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:13Z","lastTransitionTime":"2025-10-08T21:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.592461 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.592496 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.592506 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.592519 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.592527 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:13Z","lastTransitionTime":"2025-10-08T21:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.695667 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.695766 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.695790 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.695821 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.695844 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:13Z","lastTransitionTime":"2025-10-08T21:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.799197 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.799235 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.799245 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.799258 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.799268 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:13Z","lastTransitionTime":"2025-10-08T21:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.822032 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j"
Oct 08 21:49:13 crc kubenswrapper[4739]: E1008 21:49:13.822195 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.902562 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.902608 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.902618 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.902633 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 21:49:13 crc kubenswrapper[4739]: I1008 21:49:13.902680 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:13Z","lastTransitionTime":"2025-10-08T21:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.004660 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.004718 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.004736 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.004762 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.004781 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:14Z","lastTransitionTime":"2025-10-08T21:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.107309 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.107399 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.107424 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.107453 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.107471 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:14Z","lastTransitionTime":"2025-10-08T21:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.210192 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.210250 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.210267 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.210293 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.210311 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:14Z","lastTransitionTime":"2025-10-08T21:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.313875 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.313937 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.313953 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.314013 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.314031 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:14Z","lastTransitionTime":"2025-10-08T21:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.417035 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.417096 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.417116 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.417175 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.417194 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:14Z","lastTransitionTime":"2025-10-08T21:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.519934 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.520013 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.520035 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.520070 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.520093 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:14Z","lastTransitionTime":"2025-10-08T21:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.623529 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.623577 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.623589 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.623606 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.623619 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:14Z","lastTransitionTime":"2025-10-08T21:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.726510 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.726801 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.726863 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.726937 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.727036 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:14Z","lastTransitionTime":"2025-10-08T21:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.821718 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.821735 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 08 21:49:14 crc kubenswrapper[4739]: E1008 21:49:14.822014 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 08 21:49:14 crc kubenswrapper[4739]: E1008 21:49:14.821871 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.822269 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 08 21:49:14 crc kubenswrapper[4739]: E1008 21:49:14.822524 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.829640 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.829786 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.829872 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.829959 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.830053 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:14Z","lastTransitionTime":"2025-10-08T21:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.932658 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.932696 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.932707 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.932724 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 21:49:14 crc kubenswrapper[4739]: I1008 21:49:14.932737 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:14Z","lastTransitionTime":"2025-10-08T21:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.035068 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.035354 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.035484 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.035578 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.035700 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:15Z","lastTransitionTime":"2025-10-08T21:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.143055 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.143099 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.143111 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.143127 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.143205 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:15Z","lastTransitionTime":"2025-10-08T21:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.245563 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.245600 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.245608 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.245621 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.245629 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:15Z","lastTransitionTime":"2025-10-08T21:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.347711 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.347778 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.347791 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.347807 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.347820 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:15Z","lastTransitionTime":"2025-10-08T21:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.449680 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.449715 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.449725 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.449740 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.449752 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:15Z","lastTransitionTime":"2025-10-08T21:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.552264 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.552305 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.552317 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.552331 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.552341 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:15Z","lastTransitionTime":"2025-10-08T21:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.655002 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.655047 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.655059 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.655076 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.655090 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:15Z","lastTransitionTime":"2025-10-08T21:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.758233 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.758269 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.758278 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.758292 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.758301 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:15Z","lastTransitionTime":"2025-10-08T21:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.821553 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:15 crc kubenswrapper[4739]: E1008 21:49:15.821687 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.860846 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.860890 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.860902 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.860919 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.860932 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:15Z","lastTransitionTime":"2025-10-08T21:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.963363 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.963471 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.963490 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.963513 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:15 crc kubenswrapper[4739]: I1008 21:49:15.963531 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:15Z","lastTransitionTime":"2025-10-08T21:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.065700 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.065985 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.066079 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.066190 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.066276 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:16Z","lastTransitionTime":"2025-10-08T21:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.168898 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.168934 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.168946 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.168961 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.168973 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:16Z","lastTransitionTime":"2025-10-08T21:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.272565 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.272983 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.273275 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.273476 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.273675 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:16Z","lastTransitionTime":"2025-10-08T21:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.377912 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.377965 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.377974 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.377988 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.377997 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:16Z","lastTransitionTime":"2025-10-08T21:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.480993 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.481039 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.481054 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.481071 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.481082 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:16Z","lastTransitionTime":"2025-10-08T21:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.584233 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.584605 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.584744 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.584898 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.585026 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:16Z","lastTransitionTime":"2025-10-08T21:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.628080 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.641727 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.644317 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 21:48:40.392971 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"in
itContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:16Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.670351 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3851f913e0995f723e741b3231cdc6c9093878fb3d47a03924b20a383bb3149d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f74d2f2292383c8f6b54b544a3b08f848745933b0a00da170c79dc3512c458e5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T21:49:04Z\\\",\\\"message\\\":\\\" 6063 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1008 21:49:04.462128 6063 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 21:49:04.462203 6063 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1008 21:49:04.462214 6063 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1008 
21:49:04.462229 6063 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 21:49:04.462244 6063 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1008 21:49:04.462282 6063 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1008 21:49:04.462293 6063 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1008 21:49:04.462309 6063 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 21:49:04.462326 6063 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1008 21:49:04.462330 6063 factory.go:656] Stopping watch factory\\\\nI1008 21:49:04.462334 6063 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1008 21:49:04.462350 6063 ovnkube.go:599] Stopped ovnkube\\\\nI1008 21:49:04.462348 6063 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1008 21:49:04.462378 6063 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1008 21:49:04.462398 6063 metrics.go:553] Stopping metrics server at address \\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3851f913e0995f723e741b3231cdc6c9093878fb3d47a03924b20a383bb3149d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T21:49:06Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 21:49:06.001469 6249 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 21:49:06.001731 6249 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1008 21:49:06.001759 6249 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1008 21:49:06.001764 6249 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1008 21:49:06.001794 6249 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1008 
21:49:06.001802 6249 handler.go:208] Removed *v1.Node event handler 2\\\\nI1008 21:49:06.001811 6249 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1008 21:49:06.001822 6249 handler.go:208] Removed *v1.Node event handler 7\\\\nI1008 21:49:06.001922 6249 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 21:49:06.002226 6249 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1008 21:49:06.002247 6249 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1008 21:49:06.002306 6249 factory.go:656] Stopping watch factory\\\\nI1008 21:49:06.002329 6249 ovnkube.go:599] Stopped ovnkube\\\\nI1008 21:49:06.002379 6249 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 21:49:06.002397 6249 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1008 21\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"moun
tPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:16Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.684108 4739 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-kdt6j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8629e121-2c64-4b46-adbd-ec1433ec0835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kdt6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:16Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:16 crc 
kubenswrapper[4739]: I1008 21:49:16.687905 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.687941 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.687950 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.687965 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.687977 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:16Z","lastTransitionTime":"2025-10-08T21:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.694520 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8p5bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3d9049-8f29-4235-8d91-e565cb0d157c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392091423b110eec80dbbfa9c863d035369d2ae18786c7e1464220f884ef9af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nhxr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8p5bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:16Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.705114 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c9869b0-41e2-4d5b-9492-3067503ae6bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c64c86
690073c971b824a50e11615dc8efbe2426bfa418579302f876f03f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bc84dd29fc85f8f711e45387600b4be0c89514fa5f2bdf14e0773a55db3330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7dswk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:16Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.733310 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-0
8T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\
\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:16Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.746219 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:16Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.765210 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f7e4c3293c3a262fc8a264ac8325f498cd4b53b50a53040e2ff57d6e6b8140b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:16Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.777999 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168
.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:16Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.788362 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:16Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.790124 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.790288 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.790365 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.790436 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.790599 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:16Z","lastTransitionTime":"2025-10-08T21:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.798514 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-pro
xy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:16Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.813105 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:16Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.821576 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.821581 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.821640 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:16 crc kubenswrapper[4739]: E1008 21:49:16.822508 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:49:16 crc kubenswrapper[4739]: E1008 21:49:16.822587 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:49:16 crc kubenswrapper[4739]: E1008 21:49:16.822653 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.826776 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:16Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.841187 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:16Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.853334 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:16Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.864705 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:16Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.875225 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:16Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.893246 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.893280 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.893289 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.893304 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.893313 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:16Z","lastTransitionTime":"2025-10-08T21:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.996253 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.996500 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.996567 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.996639 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:16 crc kubenswrapper[4739]: I1008 21:49:16.996708 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:16Z","lastTransitionTime":"2025-10-08T21:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.099138 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.099198 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.099208 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.099220 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.099232 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:17Z","lastTransitionTime":"2025-10-08T21:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.203663 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.203963 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.204166 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.204299 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.204363 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:17Z","lastTransitionTime":"2025-10-08T21:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.307044 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.307281 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.307389 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.307460 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.307528 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:17Z","lastTransitionTime":"2025-10-08T21:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.410419 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.410449 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.410458 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.410470 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.410478 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:17Z","lastTransitionTime":"2025-10-08T21:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.495362 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8629e121-2c64-4b46-adbd-ec1433ec0835-metrics-certs\") pod \"network-metrics-daemon-kdt6j\" (UID: \"8629e121-2c64-4b46-adbd-ec1433ec0835\") " pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:17 crc kubenswrapper[4739]: E1008 21:49:17.495601 4739 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 21:49:17 crc kubenswrapper[4739]: E1008 21:49:17.495721 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8629e121-2c64-4b46-adbd-ec1433ec0835-metrics-certs podName:8629e121-2c64-4b46-adbd-ec1433ec0835 nodeName:}" failed. No retries permitted until 2025-10-08 21:49:33.495699588 +0000 UTC m=+73.321085358 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8629e121-2c64-4b46-adbd-ec1433ec0835-metrics-certs") pod "network-metrics-daemon-kdt6j" (UID: "8629e121-2c64-4b46-adbd-ec1433ec0835") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.512634 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.512898 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.512983 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.513079 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.513235 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:17Z","lastTransitionTime":"2025-10-08T21:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.616063 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.616108 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.616119 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.616136 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.616162 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:17Z","lastTransitionTime":"2025-10-08T21:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.718139 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.718196 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.718204 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.718218 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.718227 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:17Z","lastTransitionTime":"2025-10-08T21:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.810793 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.810835 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.810844 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.810858 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.810867 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:17Z","lastTransitionTime":"2025-10-08T21:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.821281 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:17 crc kubenswrapper[4739]: E1008 21:49:17.821392 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:49:17 crc kubenswrapper[4739]: E1008 21:49:17.829022 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06df6f1f-7503-4cee-a52c-383dcfb4609d\\\",\\\"systemUUID\\\":\\\"e102a271-0593-45af-a6b3-5b473c7eebd3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:17Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.832839 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.832880 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.832898 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.832920 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.832950 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:17Z","lastTransitionTime":"2025-10-08T21:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:17 crc kubenswrapper[4739]: E1008 21:49:17.848001 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06df6f1f-7503-4cee-a52c-383dcfb4609d\\\",\\\"systemUUID\\\":\\\"e102a271-0593-45af-a6b3-5b473c7eebd3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:17Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.850921 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.851017 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.851029 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.851043 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.851054 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:17Z","lastTransitionTime":"2025-10-08T21:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:17 crc kubenswrapper[4739]: E1008 21:49:17.865727 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06df6f1f-7503-4cee-a52c-383dcfb4609d\\\",\\\"systemUUID\\\":\\\"e102a271-0593-45af-a6b3-5b473c7eebd3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:17Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.868904 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.868932 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.868942 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.868958 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.868969 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:17Z","lastTransitionTime":"2025-10-08T21:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:17 crc kubenswrapper[4739]: E1008 21:49:17.882538 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06df6f1f-7503-4cee-a52c-383dcfb4609d\\\",\\\"systemUUID\\\":\\\"e102a271-0593-45af-a6b3-5b473c7eebd3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:17Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.885915 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.885951 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.885963 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.885979 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.885994 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:17Z","lastTransitionTime":"2025-10-08T21:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:17 crc kubenswrapper[4739]: E1008 21:49:17.907819 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06df6f1f-7503-4cee-a52c-383dcfb4609d\\\",\\\"systemUUID\\\":\\\"e102a271-0593-45af-a6b3-5b473c7eebd3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:17Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:17 crc kubenswrapper[4739]: E1008 21:49:17.908006 4739 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.909502 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.909539 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.909550 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.909566 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:17 crc kubenswrapper[4739]: I1008 21:49:17.909577 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:17Z","lastTransitionTime":"2025-10-08T21:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.011860 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.011902 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.011930 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.011946 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.011959 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:18Z","lastTransitionTime":"2025-10-08T21:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.114418 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.114459 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.114473 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.114491 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.114503 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:18Z","lastTransitionTime":"2025-10-08T21:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.217811 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.217872 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.217932 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.217958 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.217977 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:18Z","lastTransitionTime":"2025-10-08T21:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.321231 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.321268 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.321278 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.321292 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.321303 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:18Z","lastTransitionTime":"2025-10-08T21:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.423061 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.423102 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.423114 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.423129 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.423166 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:18Z","lastTransitionTime":"2025-10-08T21:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.525666 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.525732 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.525743 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.525760 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.525772 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:18Z","lastTransitionTime":"2025-10-08T21:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.627889 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.627934 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.627950 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.627970 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.627985 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:18Z","lastTransitionTime":"2025-10-08T21:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.730689 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.730718 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.730726 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.730738 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.730747 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:18Z","lastTransitionTime":"2025-10-08T21:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.820602 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.820616 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.820767 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:49:18 crc kubenswrapper[4739]: E1008 21:49:18.820722 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:49:18 crc kubenswrapper[4739]: E1008 21:49:18.821221 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:49:18 crc kubenswrapper[4739]: E1008 21:49:18.821359 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.821444 4739 scope.go:117] "RemoveContainer" containerID="3851f913e0995f723e741b3231cdc6c9093878fb3d47a03924b20a383bb3149d" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.832847 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.832891 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.832904 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.832925 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.832936 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:18Z","lastTransitionTime":"2025-10-08T21:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.836785 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8p5bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3d9049-8f29-4235-8d91-e565cb0d157c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392091423b110eec80dbbfa9c863d035369d2ae18786c7e1464220f884ef9af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nhxr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8p5bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:18Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.855869 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c9869b0-41e2-4d5b-9492-3067503ae6bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c64c86
690073c971b824a50e11615dc8efbe2426bfa418579302f876f03f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bc84dd29fc85f8f711e45387600b4be0c89514fa5f2bdf14e0773a55db3330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7dswk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:18Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.880187 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-0
8T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\
\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:18Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.892007 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:18Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.909093 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f7e4c3293c3a262fc8a264ac8325f498cd4b53b50a53040e2ff57d6e6b8140b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:18Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.929359 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168
.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:18Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.936297 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.936346 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.936359 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.936383 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.936421 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:18Z","lastTransitionTime":"2025-10-08T21:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.941254 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:18Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.952085 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-08T21:49:18Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.969891 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4310c6bc-4bce-4766-a147-e5a96daa3ae6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53812375b387a71344e95f90d4f961e9e9f97f4999db1b49ff9dd111c813a69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a7577ad29caeb
9249292640d53bb0a8206e4cc6859c49b973f64aca1ae98ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89bfa18678c8c402652687bafbeda4ee94a4c581be3aac3286dee2fa1555c3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e0d8d10d62c8724a2940487ae2e6ec8e2260c313b9f2286730b563051af4ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e0d8d10d62c8724a2940487ae2e6ec8e2260c313b9f2286730b563051af4ba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:18Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:18 crc kubenswrapper[4739]: I1008 21:49:18.984392 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:18Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.000024 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:18Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.014747 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:19Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.032037 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:19Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.038727 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.038773 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.038785 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 
21:49:19.038803 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.038814 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:19Z","lastTransitionTime":"2025-10-08T21:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.043970 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:19Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.054273 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:19Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.069341 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:19Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.087193 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3851f913e0995f723e741b3231cdc6c9093878fb3d47a03924b20a383bb3149d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3851f913e0995f723e741b3231cdc6c9093878fb3d47a03924b20a383bb3149d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T21:49:06Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 21:49:06.001469 6249 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 21:49:06.001731 6249 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1008 21:49:06.001759 6249 handler.go:190] Sending *v1.Node event handler 2 for 
removal\\\\nI1008 21:49:06.001764 6249 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1008 21:49:06.001794 6249 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1008 21:49:06.001802 6249 handler.go:208] Removed *v1.Node event handler 2\\\\nI1008 21:49:06.001811 6249 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1008 21:49:06.001822 6249 handler.go:208] Removed *v1.Node event handler 7\\\\nI1008 21:49:06.001922 6249 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 21:49:06.002226 6249 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1008 21:49:06.002247 6249 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1008 21:49:06.002306 6249 factory.go:656] Stopping watch factory\\\\nI1008 21:49:06.002329 6249 ovnkube.go:599] Stopped ovnkube\\\\nI1008 21:49:06.002379 6249 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 21:49:06.002397 6249 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1008 21\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:49:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hfhrc_openshift-ovn-kubernetes(4c6641d9-9ccf-42aa-8a83-c52d850aa766)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b
5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:19Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.097480 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kdt6j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8629e121-2c64-4b46-adbd-ec1433ec0835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kdt6j\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:19Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.141367 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.141408 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.141422 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.141438 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.141449 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:19Z","lastTransitionTime":"2025-10-08T21:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.180823 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hfhrc_4c6641d9-9ccf-42aa-8a83-c52d850aa766/ovnkube-controller/1.log" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.183225 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" event={"ID":"4c6641d9-9ccf-42aa-8a83-c52d850aa766","Type":"ContainerStarted","Data":"2d8bdaa74dcda5aa06db68419e215a652d096276e2fda19194ca7463283ba038"} Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.184094 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.205630 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 
21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:19Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.225271 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d8bdaa74dcda5aa06db68419e215a652d096276e2fda19194ca7463283ba038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3851f913e0995f723e741b3231cdc6c9093878fb3d47a03924b20a383bb3149d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T21:49:06Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 21:49:06.001469 6249 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 21:49:06.001731 6249 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1008 21:49:06.001759 6249 handler.go:190] Sending *v1.Node event handler 2 for 
removal\\\\nI1008 21:49:06.001764 6249 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1008 21:49:06.001794 6249 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1008 21:49:06.001802 6249 handler.go:208] Removed *v1.Node event handler 2\\\\nI1008 21:49:06.001811 6249 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1008 21:49:06.001822 6249 handler.go:208] Removed *v1.Node event handler 7\\\\nI1008 21:49:06.001922 6249 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 21:49:06.002226 6249 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1008 21:49:06.002247 6249 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1008 21:49:06.002306 6249 factory.go:656] Stopping watch factory\\\\nI1008 21:49:06.002329 6249 ovnkube.go:599] Stopped ovnkube\\\\nI1008 21:49:06.002379 6249 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 21:49:06.002397 6249 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1008 
21\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:49:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:19Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.240091 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kdt6j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8629e121-2c64-4b46-adbd-ec1433ec0835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kdt6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:19Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:19 crc 
kubenswrapper[4739]: I1008 21:49:19.243625 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.243669 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.243681 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.243696 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.243708 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:19Z","lastTransitionTime":"2025-10-08T21:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.255919 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:19Z 
is after 2025-08-24T17:21:41Z" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.264972 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8p5bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3d9049-8f29-4235-8d91-e565cb0d157c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392091423b110eec80dbbfa9c863d035369d2ae18786c7e1464220f884ef9af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nhxr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8p5bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:19Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.274494 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c9869b0-41e2-4d5b-9492-3067503ae6bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c64c86690073
c971b824a50e11615dc8efbe2426bfa418579302f876f03f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bc84dd29fc85f8f711e45387600b4be0c89514fa5f2bdf14e0773a55db3330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7dswk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:19Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.293502 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:4
8:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.
168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:19Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.304415 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:19Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.317294 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f7e4c3293c3a262fc8a264ac8325f498cd4b53b50a53040e2ff57d6e6b8140b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:19Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.328943 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\
\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:19Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.340571 4739 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:19Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.346101 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.346193 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.346208 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.346233 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.346248 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:19Z","lastTransitionTime":"2025-10-08T21:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.351439 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:19Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.363107 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4310c6bc-4bce-4766-a147-e5a96daa3ae6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53812375b387a71344e95f90d4f961e9e9f97f4999db1b49ff9dd111c813a69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a7577ad29caeb9249292640d53bb0a8206e4cc6859c49b973f64aca1ae98ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89bfa18678c8c402652687bafbeda4ee94a4c581be3aac3286dee2fa1555c3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e0d8d10d62c8724a2940487ae2e6ec8e2260c313b9f2286730b563051af4ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3e0d8d10d62c8724a2940487ae2e6ec8e2260c313b9f2286730b563051af4ba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:19Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.375744 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:19Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.386635 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:19Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.398986 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:19Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.409482 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:19Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.419691 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:19Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.448342 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.448392 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.448404 4739 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.448422 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.448434 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:19Z","lastTransitionTime":"2025-10-08T21:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.551399 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.552031 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.552256 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.552406 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.552479 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:19Z","lastTransitionTime":"2025-10-08T21:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.654813 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.655032 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.655117 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.655213 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.655285 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:19Z","lastTransitionTime":"2025-10-08T21:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.757706 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.757752 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.757763 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.757782 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.757795 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:19Z","lastTransitionTime":"2025-10-08T21:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.820983 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:19 crc kubenswrapper[4739]: E1008 21:49:19.821223 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.859564 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.859759 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.859842 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.859931 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.860001 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:19Z","lastTransitionTime":"2025-10-08T21:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.962541 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.962576 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.962584 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.962596 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:19 crc kubenswrapper[4739]: I1008 21:49:19.962606 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:19Z","lastTransitionTime":"2025-10-08T21:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.066642 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.066904 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.066925 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.066947 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.066964 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:20Z","lastTransitionTime":"2025-10-08T21:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.169753 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.169805 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.169819 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.169838 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.169850 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:20Z","lastTransitionTime":"2025-10-08T21:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.188048 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hfhrc_4c6641d9-9ccf-42aa-8a83-c52d850aa766/ovnkube-controller/2.log" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.188801 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hfhrc_4c6641d9-9ccf-42aa-8a83-c52d850aa766/ovnkube-controller/1.log" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.191901 4739 generic.go:334] "Generic (PLEG): container finished" podID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerID="2d8bdaa74dcda5aa06db68419e215a652d096276e2fda19194ca7463283ba038" exitCode=1 Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.191946 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" event={"ID":"4c6641d9-9ccf-42aa-8a83-c52d850aa766","Type":"ContainerDied","Data":"2d8bdaa74dcda5aa06db68419e215a652d096276e2fda19194ca7463283ba038"} Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.191978 4739 scope.go:117] "RemoveContainer" containerID="3851f913e0995f723e741b3231cdc6c9093878fb3d47a03924b20a383bb3149d" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.192631 4739 scope.go:117] "RemoveContainer" containerID="2d8bdaa74dcda5aa06db68419e215a652d096276e2fda19194ca7463283ba038" Oct 08 21:49:20 crc kubenswrapper[4739]: E1008 21:49:20.192803 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-hfhrc_openshift-ovn-kubernetes(4c6641d9-9ccf-42aa-8a83-c52d850aa766)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.208044 4739 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:20Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.225487 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b
491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:20Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.238787 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:20Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.252369 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:20Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.269609 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4e
fb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:20Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.272453 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.272494 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.272508 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.272527 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.272543 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:20Z","lastTransitionTime":"2025-10-08T21:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.292326 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d8bdaa74dcda5aa06db68419e215a652d096276e2fda19194ca7463283ba038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3851f913e0995f723e741b3231cdc6c9093878fb3d47a03924b20a383bb3149d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T21:49:06Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 21:49:06.001469 6249 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 21:49:06.001731 6249 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1008 21:49:06.001759 6249 handler.go:190] Sending *v1.Node event handler 2 for 
removal\\\\nI1008 21:49:06.001764 6249 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1008 21:49:06.001794 6249 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1008 21:49:06.001802 6249 handler.go:208] Removed *v1.Node event handler 2\\\\nI1008 21:49:06.001811 6249 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1008 21:49:06.001822 6249 handler.go:208] Removed *v1.Node event handler 7\\\\nI1008 21:49:06.001922 6249 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 21:49:06.002226 6249 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1008 21:49:06.002247 6249 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1008 21:49:06.002306 6249 factory.go:656] Stopping watch factory\\\\nI1008 21:49:06.002329 6249 ovnkube.go:599] Stopped ovnkube\\\\nI1008 21:49:06.002379 6249 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 21:49:06.002397 6249 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1008 21\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:49:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8bdaa74dcda5aa06db68419e215a652d096276e2fda19194ca7463283ba038\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T21:49:19Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 21:49:19.695004 6389 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1008 21:49:19.695051 6389 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1008 21:49:19.695057 6389 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1008 21:49:19.695079 6389 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 21:49:19.695108 6389 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI1008 21:49:19.695123 6389 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1008 21:49:19.695127 6389 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1008 21:49:19.695186 6389 factory.go:656] Stopping watch factory\\\\nI1008 21:49:19.695214 6389 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1008 21:49:19.695222 6389 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1008 21:49:19.695229 6389 handler.go:208] Removed *v1.Node event handler 2\\\\nI1008 21:49:19.695237 6389 handler.go:208] Removed *v1.Node event handler 7\\\\nI1008 21:49:19.695244 6389 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1008 21:49:19.695253 6389 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 21:49:19.695259 6389 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"m
ountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-ap
i-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:20Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.304991 4739 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-kdt6j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8629e121-2c64-4b46-adbd-ec1433ec0835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kdt6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:20Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:20 crc 
kubenswrapper[4739]: I1008 21:49:20.318494 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:20Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:20 crc 
kubenswrapper[4739]: I1008 21:49:20.330372 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8p5bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3d9049-8f29-4235-8d91-e565cb0d157c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392091423b110eec80dbbfa9c863d035369d2ae18786c7e1464220f884ef9af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5
nhxr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8p5bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:20Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.343214 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c9869b0-41e2-4d5b-9492-3067503ae6bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c64c86690073c971b824a50e11615dc8efbe2426bfa418579302f876f03f7\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bc84dd29fc85f8f711e45387600b4be0c89514fa5f2bdf14e0773a55db3330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7dswk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:20Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.364080 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\
\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877
441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:20Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.374408 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.374437 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.374448 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.374462 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:20 crc 
kubenswrapper[4739]: I1008 21:49:20.374471 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:20Z","lastTransitionTime":"2025-10-08T21:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.380812 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:20Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.401453 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f7e4c3293c3a262fc8a264ac8325f498cd4b53b50a53040e2ff57d6e6b8140b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33da1
14ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:20Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.414251 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:20Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.433549 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T21:49:20Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.449686 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:20Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.464752 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4310c6bc-4bce-4766-a147-e5a96daa3ae6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53812375b387a71344e95f90d4f961e9e9f97f4999db1b49ff9dd111c813a69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a7577ad29caeb9249292640d53bb0a8206e4cc6859c49b973f64aca1ae98ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89bfa18678c8c402652687bafbeda4ee94a4c581be3aac3286dee2fa1555c3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e0d8d10d62c8724a2940487ae2e6ec8e2260c313b9f2286730b563051af4ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3e0d8d10d62c8724a2940487ae2e6ec8e2260c313b9f2286730b563051af4ba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:20Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.477822 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.477867 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.477885 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.477905 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.477921 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:20Z","lastTransitionTime":"2025-10-08T21:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.477976 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:20Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.580194 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.580261 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.580285 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.580313 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.580334 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:20Z","lastTransitionTime":"2025-10-08T21:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.682792 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.682827 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.682838 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.682855 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.682868 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:20Z","lastTransitionTime":"2025-10-08T21:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.785492 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.785530 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.785539 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.785552 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.785559 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:20Z","lastTransitionTime":"2025-10-08T21:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.821288 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.821431 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:20 crc kubenswrapper[4739]: E1008 21:49:20.821472 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.821290 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:49:20 crc kubenswrapper[4739]: E1008 21:49:20.821601 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:49:20 crc kubenswrapper[4739]: E1008 21:49:20.821768 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.888462 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.888503 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.888514 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.888532 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.888544 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:20Z","lastTransitionTime":"2025-10-08T21:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.991496 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.991555 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.991576 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.991605 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:20 crc kubenswrapper[4739]: I1008 21:49:20.991629 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:20Z","lastTransitionTime":"2025-10-08T21:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.094012 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.094059 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.094069 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.094083 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.094130 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:21Z","lastTransitionTime":"2025-10-08T21:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.196055 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.196104 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.196122 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.196204 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.196233 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:21Z","lastTransitionTime":"2025-10-08T21:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.197794 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hfhrc_4c6641d9-9ccf-42aa-8a83-c52d850aa766/ovnkube-controller/2.log" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.202806 4739 scope.go:117] "RemoveContainer" containerID="2d8bdaa74dcda5aa06db68419e215a652d096276e2fda19194ca7463283ba038" Oct 08 21:49:21 crc kubenswrapper[4739]: E1008 21:49:21.203927 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-hfhrc_openshift-ovn-kubernetes(4c6641d9-9ccf-42aa-8a83-c52d850aa766)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.220758 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4e
fb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:21Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.253950 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d8bdaa74dcda5aa06db68419e215a652d096276e2fda19194ca7463283ba038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8bdaa74dcda5aa06db68419e215a652d096276e2fda19194ca7463283ba038\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T21:49:19Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 21:49:19.695004 6389 
handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1008 21:49:19.695051 6389 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1008 21:49:19.695057 6389 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1008 21:49:19.695079 6389 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 21:49:19.695108 6389 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 21:49:19.695123 6389 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1008 21:49:19.695127 6389 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1008 21:49:19.695186 6389 factory.go:656] Stopping watch factory\\\\nI1008 21:49:19.695214 6389 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1008 21:49:19.695222 6389 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1008 21:49:19.695229 6389 handler.go:208] Removed *v1.Node event handler 2\\\\nI1008 21:49:19.695237 6389 handler.go:208] Removed *v1.Node event handler 7\\\\nI1008 21:49:19.695244 6389 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1008 21:49:19.695253 6389 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 21:49:19.695259 6389 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:49:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hfhrc_openshift-ovn-kubernetes(4c6641d9-9ccf-42aa-8a83-c52d850aa766)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b
5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:21Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.267550 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kdt6j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8629e121-2c64-4b46-adbd-ec1433ec0835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kdt6j\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:21Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.294084 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:21Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.299025 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.299067 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.299079 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.299097 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.299112 4739 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:21Z","lastTransitionTime":"2025-10-08T21:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.312801 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:21Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.331850 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f7e4c3293c3a262fc8a264ac8325f498cd4b53b50a53040e2ff57d6e6b8140b\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,
\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:21Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.351012 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:21Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.363292 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8p5bp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3d9049-8f29-4235-8d91-e565cb0d157c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392091423b110eec80dbbfa9c863d035369d2ae18786c7e1464220f884ef9af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nhxr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8p5bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:21Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.378404 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c9869b0-41e2-4d5b-9492-3067503ae6bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c64c86690073c971b824a50e11615dc8efbe2426bfa418579302f876f03f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bc84dd29fc85f8f711e45387600b4be0c89514fa5f2bdf14e0773a55db3330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7dswk\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:21Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.397083 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4310c6bc-4bce-4766-a147-e5a96daa3ae6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53812375b387a71344e95f90d4f961e9e9f97f4999db1b49ff9dd111c813a69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a7577ad29caeb9249292640d53bb0a8206e4cc6859c49b973f64aca1ae98ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89bfa18678c8c402652687bafbeda4ee94a4c581be3aac3286dee2fa1555c3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"c
ri-o://3e0d8d10d62c8724a2940487ae2e6ec8e2260c313b9f2286730b563051af4ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e0d8d10d62c8724a2940487ae2e6ec8e2260c313b9f2286730b563051af4ba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:21Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.401312 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.401337 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.401348 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.401363 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.401375 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:21Z","lastTransitionTime":"2025-10-08T21:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.415385 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:21Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.430019 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:21Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.447637 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T21:49:21Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.459922 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:21Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.476127 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:21Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.500611 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:21Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.504117 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.504220 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.504234 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 
21:49:21.504253 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.504265 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:21Z","lastTransitionTime":"2025-10-08T21:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.523753 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:21Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.543063 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:21Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.606510 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.606568 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.606585 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.606608 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.606627 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:21Z","lastTransitionTime":"2025-10-08T21:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.709188 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.709248 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.709269 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.709298 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.709322 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:21Z","lastTransitionTime":"2025-10-08T21:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.812529 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.812566 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.812575 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.812586 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.812596 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:21Z","lastTransitionTime":"2025-10-08T21:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.820692 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:21 crc kubenswrapper[4739]: E1008 21:49:21.820868 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.837085 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:21Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.859009 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d8bdaa74dcda5aa06db68419e215a652d096276e2fda19194ca7463283ba038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8bdaa74dcda5aa06db68419e215a652d096276e2fda19194ca7463283ba038\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T21:49:19Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 21:49:19.695004 6389 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1008 21:49:19.695051 6389 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1008 21:49:19.695057 6389 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1008 
21:49:19.695079 6389 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 21:49:19.695108 6389 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 21:49:19.695123 6389 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1008 21:49:19.695127 6389 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1008 21:49:19.695186 6389 factory.go:656] Stopping watch factory\\\\nI1008 21:49:19.695214 6389 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1008 21:49:19.695222 6389 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1008 21:49:19.695229 6389 handler.go:208] Removed *v1.Node event handler 2\\\\nI1008 21:49:19.695237 6389 handler.go:208] Removed *v1.Node event handler 7\\\\nI1008 21:49:19.695244 6389 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1008 21:49:19.695253 6389 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 21:49:19.695259 6389 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:49:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hfhrc_openshift-ovn-kubernetes(4c6641d9-9ccf-42aa-8a83-c52d850aa766)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b
5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:21Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.871763 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kdt6j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8629e121-2c64-4b46-adbd-ec1433ec0835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kdt6j\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:21Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.891066 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:21Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.904222 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:21Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.915227 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.915284 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.915303 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.915328 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.915346 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:21Z","lastTransitionTime":"2025-10-08T21:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.921584 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f7e4c3293c3a262fc8a264ac8325f498cd4b53b50a53040e2ff57d6e6b8140b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:21Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.939525 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:21Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.950745 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8p5bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3d9049-8f29-4235-8d91-e565cb0d157c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392091423b110eec80dbbfa9c863d035369d2ae18786c7e1464220f884ef9af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nhxr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8p5bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:21Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.962310 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c9869b0-41e2-4d5b-9492-3067503ae6bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c64c86690073c971b824a50e11615dc8efbe2426bfa418579302f876f03f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bc84dd29fc85f8f711e45387600b4be0c89
514fa5f2bdf14e0773a55db3330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7dswk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:21Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.975521 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4310c6bc-4bce-4766-a147-e5a96daa3ae6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53812375b387a71344e95f90d4f961e9e9f97f4999db1b49ff9dd111c813a69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a7577ad29caeb9249292640d53bb0a8206e4cc6859c49b973f64aca1ae98ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89bfa18678c8c402652687bafbeda4ee94a4c581be3aac3286dee2fa1555c3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e0d8d10d62c8724a2940487ae2e6ec8e2260c313b9f2286730b563051af4ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3e0d8d10d62c8724a2940487ae2e6ec8e2260c313b9f2286730b563051af4ba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:21Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:21 crc kubenswrapper[4739]: I1008 21:49:21.988346 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:21Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.000098 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:21Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.016457 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T21:49:22Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.017603 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.017639 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.017650 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.017668 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.017681 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:22Z","lastTransitionTime":"2025-10-08T21:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.035106 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:22Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.052568 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:22Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.068739 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:22Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.082960 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:22Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.094001 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:22Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.120113 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.120138 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.120169 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.120184 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.120197 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:22Z","lastTransitionTime":"2025-10-08T21:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.222701 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.222729 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.222738 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.222750 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.222758 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:22Z","lastTransitionTime":"2025-10-08T21:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.326262 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.326682 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.326696 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.326734 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.326762 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:22Z","lastTransitionTime":"2025-10-08T21:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.428757 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.428822 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.428841 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.428865 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.428884 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:22Z","lastTransitionTime":"2025-10-08T21:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.532195 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.532240 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.532255 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.532274 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.532288 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:22Z","lastTransitionTime":"2025-10-08T21:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.636464 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.636518 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.636536 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.636562 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.636583 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:22Z","lastTransitionTime":"2025-10-08T21:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.739742 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.739798 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.739814 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.739837 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.739854 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:22Z","lastTransitionTime":"2025-10-08T21:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.821105 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.821182 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:49:22 crc kubenswrapper[4739]: E1008 21:49:22.821232 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:49:22 crc kubenswrapper[4739]: E1008 21:49:22.821321 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.821175 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:22 crc kubenswrapper[4739]: E1008 21:49:22.821458 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.841765 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.841822 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.841839 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.841865 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.841883 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:22Z","lastTransitionTime":"2025-10-08T21:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.944856 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.944922 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.944940 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.944965 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:22 crc kubenswrapper[4739]: I1008 21:49:22.944983 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:22Z","lastTransitionTime":"2025-10-08T21:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.048350 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.048409 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.048426 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.048449 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.048466 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:23Z","lastTransitionTime":"2025-10-08T21:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.151898 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.151940 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.151948 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.151961 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.151976 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:23Z","lastTransitionTime":"2025-10-08T21:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.255399 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.255460 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.255477 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.255504 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.255524 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:23Z","lastTransitionTime":"2025-10-08T21:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.358893 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.358949 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.358962 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.358984 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.359000 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:23Z","lastTransitionTime":"2025-10-08T21:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.461694 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.461777 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.461824 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.461849 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.461861 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:23Z","lastTransitionTime":"2025-10-08T21:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.565787 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.565850 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.565867 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.565890 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.565907 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:23Z","lastTransitionTime":"2025-10-08T21:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.669851 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.670770 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.671116 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.671467 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.671865 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:23Z","lastTransitionTime":"2025-10-08T21:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.775783 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.775837 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.775850 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.775872 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.775886 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:23Z","lastTransitionTime":"2025-10-08T21:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.821693 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:23 crc kubenswrapper[4739]: E1008 21:49:23.821904 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.878538 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.878594 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.878613 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.878634 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.878650 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:23Z","lastTransitionTime":"2025-10-08T21:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.981475 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.981516 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.981545 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.981558 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:23 crc kubenswrapper[4739]: I1008 21:49:23.981567 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:23Z","lastTransitionTime":"2025-10-08T21:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.088452 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.088516 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.088535 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.088559 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.088577 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:24Z","lastTransitionTime":"2025-10-08T21:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.191683 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.191730 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.191747 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.191768 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.191786 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:24Z","lastTransitionTime":"2025-10-08T21:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.293808 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.293855 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.293872 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.293894 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.293911 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:24Z","lastTransitionTime":"2025-10-08T21:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.396060 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.396086 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.396094 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.396109 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.396136 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:24Z","lastTransitionTime":"2025-10-08T21:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.497785 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.497812 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.497819 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.497831 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.497839 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:24Z","lastTransitionTime":"2025-10-08T21:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.601651 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.601691 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.601703 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.601718 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.601731 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:24Z","lastTransitionTime":"2025-10-08T21:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.704614 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.704666 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.704686 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.704710 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.704730 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:24Z","lastTransitionTime":"2025-10-08T21:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.808052 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.808117 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.808134 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.808233 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.808251 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:24Z","lastTransitionTime":"2025-10-08T21:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.821642 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.821667 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.821764 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:24 crc kubenswrapper[4739]: E1008 21:49:24.821924 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:49:24 crc kubenswrapper[4739]: E1008 21:49:24.822080 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:49:24 crc kubenswrapper[4739]: E1008 21:49:24.822270 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.910780 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.910836 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.910852 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.910877 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:24 crc kubenswrapper[4739]: I1008 21:49:24.910895 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:24Z","lastTransitionTime":"2025-10-08T21:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.014069 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.014141 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.014193 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.014219 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.014237 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:25Z","lastTransitionTime":"2025-10-08T21:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.116771 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.116843 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.116861 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.116883 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.116900 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:25Z","lastTransitionTime":"2025-10-08T21:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.219470 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.219524 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.219540 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.219562 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.219581 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:25Z","lastTransitionTime":"2025-10-08T21:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.323025 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.323106 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.323126 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.323212 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.323238 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:25Z","lastTransitionTime":"2025-10-08T21:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.426503 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.426656 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.426689 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.426730 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.426752 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:25Z","lastTransitionTime":"2025-10-08T21:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.529254 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.529327 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.529352 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.529379 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.529397 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:25Z","lastTransitionTime":"2025-10-08T21:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.633129 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.633236 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.633255 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.633280 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.633298 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:25Z","lastTransitionTime":"2025-10-08T21:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.735921 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.735981 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.735993 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.736010 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.736021 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:25Z","lastTransitionTime":"2025-10-08T21:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.821091 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:25 crc kubenswrapper[4739]: E1008 21:49:25.821298 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.837861 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.837894 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.837901 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.837912 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.837921 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:25Z","lastTransitionTime":"2025-10-08T21:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.940643 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.940693 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.940705 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.940722 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:25 crc kubenswrapper[4739]: I1008 21:49:25.940735 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:25Z","lastTransitionTime":"2025-10-08T21:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.043216 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.043264 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.043281 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.043299 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.043311 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:26Z","lastTransitionTime":"2025-10-08T21:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.145676 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.145727 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.145749 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.145777 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.145800 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:26Z","lastTransitionTime":"2025-10-08T21:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.248381 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.248416 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.248426 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.248438 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.248448 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:26Z","lastTransitionTime":"2025-10-08T21:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.351515 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.351555 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.351569 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.351586 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.351599 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:26Z","lastTransitionTime":"2025-10-08T21:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.454693 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.454743 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.454759 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.454783 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.454803 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:26Z","lastTransitionTime":"2025-10-08T21:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.557527 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.557560 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.557568 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.557581 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.557591 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:26Z","lastTransitionTime":"2025-10-08T21:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.659697 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.659732 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.659741 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.659753 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.659763 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:26Z","lastTransitionTime":"2025-10-08T21:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.767232 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.767303 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.767321 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.767346 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.767363 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:26Z","lastTransitionTime":"2025-10-08T21:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.820974 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.821017 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.821059 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:49:26 crc kubenswrapper[4739]: E1008 21:49:26.821219 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:49:26 crc kubenswrapper[4739]: E1008 21:49:26.821310 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:49:26 crc kubenswrapper[4739]: E1008 21:49:26.821426 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.869645 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.869718 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.869734 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.869752 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.869764 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:26Z","lastTransitionTime":"2025-10-08T21:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.971981 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.972024 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.972035 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.972050 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:26 crc kubenswrapper[4739]: I1008 21:49:26.972060 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:26Z","lastTransitionTime":"2025-10-08T21:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.074296 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.074337 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.074347 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.074362 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.074375 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:27Z","lastTransitionTime":"2025-10-08T21:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.177367 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.177435 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.177458 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.177487 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.177509 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:27Z","lastTransitionTime":"2025-10-08T21:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.280206 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.280246 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.280272 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.280289 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.280298 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:27Z","lastTransitionTime":"2025-10-08T21:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.382923 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.382986 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.383008 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.383038 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.383064 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:27Z","lastTransitionTime":"2025-10-08T21:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.486109 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.486169 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.486181 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.486197 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.486209 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:27Z","lastTransitionTime":"2025-10-08T21:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.589202 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.589243 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.589255 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.589271 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.589282 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:27Z","lastTransitionTime":"2025-10-08T21:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.718721 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.718760 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.718769 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.718787 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.718797 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:27Z","lastTransitionTime":"2025-10-08T21:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.820957 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.821337 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.821390 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.821405 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.821423 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.821437 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:27Z","lastTransitionTime":"2025-10-08T21:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:27 crc kubenswrapper[4739]: E1008 21:49:27.822272 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.924531 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.924605 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.924629 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.924660 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:27 crc kubenswrapper[4739]: I1008 21:49:27.924680 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:27Z","lastTransitionTime":"2025-10-08T21:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.026910 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.026967 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.026983 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.027007 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.027025 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:28Z","lastTransitionTime":"2025-10-08T21:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.127566 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.127698 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.127719 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.127748 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.127768 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:28Z","lastTransitionTime":"2025-10-08T21:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:28 crc kubenswrapper[4739]: E1008 21:49:28.145575 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06df6f1f-7503-4cee-a52c-383dcfb4609d\\\",\\\"systemUUID\\\":\\\"e102a271-0593-45af-a6b3-5b473c7eebd3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:28Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.149597 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.149645 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.149657 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.149676 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.149687 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:28Z","lastTransitionTime":"2025-10-08T21:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:28 crc kubenswrapper[4739]: E1008 21:49:28.165488 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06df6f1f-7503-4cee-a52c-383dcfb4609d\\\",\\\"systemUUID\\\":\\\"e102a271-0593-45af-a6b3-5b473c7eebd3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:28Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.169673 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.169700 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.169711 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.169722 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.169747 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:28Z","lastTransitionTime":"2025-10-08T21:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:28 crc kubenswrapper[4739]: E1008 21:49:28.185630 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06df6f1f-7503-4cee-a52c-383dcfb4609d\\\",\\\"systemUUID\\\":\\\"e102a271-0593-45af-a6b3-5b473c7eebd3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:28Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.189664 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.189714 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.189736 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.189764 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.189789 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:28Z","lastTransitionTime":"2025-10-08T21:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:28 crc kubenswrapper[4739]: E1008 21:49:28.207458 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06df6f1f-7503-4cee-a52c-383dcfb4609d\\\",\\\"systemUUID\\\":\\\"e102a271-0593-45af-a6b3-5b473c7eebd3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:28Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.212638 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.212695 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.212716 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.212737 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.212753 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:28Z","lastTransitionTime":"2025-10-08T21:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:28 crc kubenswrapper[4739]: E1008 21:49:28.231630 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06df6f1f-7503-4cee-a52c-383dcfb4609d\\\",\\\"systemUUID\\\":\\\"e102a271-0593-45af-a6b3-5b473c7eebd3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:28Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:28 crc kubenswrapper[4739]: E1008 21:49:28.231916 4739 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.233500 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.233532 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.233541 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.233555 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.233584 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:28Z","lastTransitionTime":"2025-10-08T21:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.336469 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.336512 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.336520 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.336535 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.336544 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:28Z","lastTransitionTime":"2025-10-08T21:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.438799 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.438852 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.438866 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.438889 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.438907 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:28Z","lastTransitionTime":"2025-10-08T21:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.541069 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.541109 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.541117 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.541134 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.541169 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:28Z","lastTransitionTime":"2025-10-08T21:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.643982 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.644028 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.644098 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.644125 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.644137 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:28Z","lastTransitionTime":"2025-10-08T21:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.745808 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.745835 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.745843 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.745855 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.745863 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:28Z","lastTransitionTime":"2025-10-08T21:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.821518 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.821533 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.821526 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:49:28 crc kubenswrapper[4739]: E1008 21:49:28.821647 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:49:28 crc kubenswrapper[4739]: E1008 21:49:28.821736 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:49:28 crc kubenswrapper[4739]: E1008 21:49:28.821794 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.848216 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.848250 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.848260 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.848272 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.848283 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:28Z","lastTransitionTime":"2025-10-08T21:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.950412 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.950472 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.950489 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.950510 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:28 crc kubenswrapper[4739]: I1008 21:49:28.950526 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:28Z","lastTransitionTime":"2025-10-08T21:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.053168 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.053201 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.053210 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.053223 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.053232 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:29Z","lastTransitionTime":"2025-10-08T21:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.155587 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.155616 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.155624 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.155635 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.155645 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:29Z","lastTransitionTime":"2025-10-08T21:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.257803 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.257834 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.257842 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.257857 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.257868 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:29Z","lastTransitionTime":"2025-10-08T21:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.360471 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.360517 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.360527 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.360543 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.360554 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:29Z","lastTransitionTime":"2025-10-08T21:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.462927 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.462961 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.462972 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.463008 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.463019 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:29Z","lastTransitionTime":"2025-10-08T21:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.565030 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.565102 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.565117 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.565135 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.565201 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:29Z","lastTransitionTime":"2025-10-08T21:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.667403 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.667444 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.667457 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.667471 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.667480 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:29Z","lastTransitionTime":"2025-10-08T21:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.770491 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.770570 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.770591 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.770620 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.770642 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:29Z","lastTransitionTime":"2025-10-08T21:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.821125 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:29 crc kubenswrapper[4739]: E1008 21:49:29.821403 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.873389 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.873468 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.873492 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.873522 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.873543 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:29Z","lastTransitionTime":"2025-10-08T21:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.975556 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.975600 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.975609 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.975625 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:29 crc kubenswrapper[4739]: I1008 21:49:29.975635 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:29Z","lastTransitionTime":"2025-10-08T21:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.077729 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.077778 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.077786 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.077799 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.077811 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:30Z","lastTransitionTime":"2025-10-08T21:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.180548 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.180625 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.180651 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.180682 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.180710 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:30Z","lastTransitionTime":"2025-10-08T21:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.283109 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.283146 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.283182 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.283196 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.283204 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:30Z","lastTransitionTime":"2025-10-08T21:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.385560 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.385593 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.385605 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.385620 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.385630 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:30Z","lastTransitionTime":"2025-10-08T21:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.487533 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.487571 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.487581 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.487593 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.487604 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:30Z","lastTransitionTime":"2025-10-08T21:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.589942 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.589999 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.590019 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.590044 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.590061 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:30Z","lastTransitionTime":"2025-10-08T21:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.691732 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.691805 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.691828 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.691858 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.691880 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:30Z","lastTransitionTime":"2025-10-08T21:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.794555 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.794597 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.794610 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.794627 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.794638 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:30Z","lastTransitionTime":"2025-10-08T21:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.821023 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.821095 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.821105 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:49:30 crc kubenswrapper[4739]: E1008 21:49:30.821213 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:49:30 crc kubenswrapper[4739]: E1008 21:49:30.821291 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:49:30 crc kubenswrapper[4739]: E1008 21:49:30.821394 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.897139 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.897192 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.897200 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.897210 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:30 crc kubenswrapper[4739]: I1008 21:49:30.897220 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:30Z","lastTransitionTime":"2025-10-08T21:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.000513 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.000553 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.000560 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.000573 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.000582 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:31Z","lastTransitionTime":"2025-10-08T21:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.102360 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.102423 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.102435 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.102452 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.102463 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:31Z","lastTransitionTime":"2025-10-08T21:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.204949 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.205015 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.205039 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.205067 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.205087 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:31Z","lastTransitionTime":"2025-10-08T21:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.307556 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.307622 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.307633 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.307647 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.307657 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:31Z","lastTransitionTime":"2025-10-08T21:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.410213 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.410470 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.410535 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.410615 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.410682 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:31Z","lastTransitionTime":"2025-10-08T21:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.513014 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.513060 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.513070 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.513085 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.513097 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:31Z","lastTransitionTime":"2025-10-08T21:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.615246 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.615284 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.615293 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.615306 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.615314 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:31Z","lastTransitionTime":"2025-10-08T21:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.717925 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.717955 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.717963 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.717975 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.717983 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:31Z","lastTransitionTime":"2025-10-08T21:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.820756 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.821211 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.821280 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.821309 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.821334 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:31 crc kubenswrapper[4739]: E1008 21:49:31.821295 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.821351 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:31Z","lastTransitionTime":"2025-10-08T21:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.821999 4739 scope.go:117] "RemoveContainer" containerID="2d8bdaa74dcda5aa06db68419e215a652d096276e2fda19194ca7463283ba038" Oct 08 21:49:31 crc kubenswrapper[4739]: E1008 21:49:31.822198 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-hfhrc_openshift-ovn-kubernetes(4c6641d9-9ccf-42aa-8a83-c52d850aa766)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.835067 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:31Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.847547 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa
745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:31Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.858540 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4310c6bc-4bce-4766-a147-e5a96daa3ae6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53812375b387a71344e95f90d4f961e9e9f97f4999db1b49ff9dd111c813a69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a7577ad29caeb9249292640d53bb0a8206e4cc6859c49b973f64aca1ae98ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89bfa18678c8c402652687bafbeda4ee94a4c581be3aac3286dee2fa1555c3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e0d8d10d62c8724a2940487ae2e6ec8e2260c313b9f2286730b563051af4ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3e0d8d10d62c8724a2940487ae2e6ec8e2260c313b9f2286730b563051af4ba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:31Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.870751 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:31Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.884027 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:31Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.897680 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:31Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.909642 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:31Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.923192 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:31Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.923596 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.923614 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.923623 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.923635 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.923659 4739 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:31Z","lastTransitionTime":"2025-10-08T21:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.935316 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:31Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.949969 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4e
fb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:31Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.976650 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d8bdaa74dcda5aa06db68419e215a652d096276e2fda19194ca7463283ba038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8bdaa74dcda5aa06db68419e215a652d096276e2fda19194ca7463283ba038\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T21:49:19Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 21:49:19.695004 6389 
handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1008 21:49:19.695051 6389 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1008 21:49:19.695057 6389 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1008 21:49:19.695079 6389 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 21:49:19.695108 6389 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 21:49:19.695123 6389 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1008 21:49:19.695127 6389 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1008 21:49:19.695186 6389 factory.go:656] Stopping watch factory\\\\nI1008 21:49:19.695214 6389 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1008 21:49:19.695222 6389 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1008 21:49:19.695229 6389 handler.go:208] Removed *v1.Node event handler 2\\\\nI1008 21:49:19.695237 6389 handler.go:208] Removed *v1.Node event handler 7\\\\nI1008 21:49:19.695244 6389 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1008 21:49:19.695253 6389 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 21:49:19.695259 6389 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:49:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hfhrc_openshift-ovn-kubernetes(4c6641d9-9ccf-42aa-8a83-c52d850aa766)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b
5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:31Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.989187 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kdt6j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8629e121-2c64-4b46-adbd-ec1433ec0835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kdt6j\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:31Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:31 crc kubenswrapper[4739]: I1008 21:49:31.997932 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8p5bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3d9049-8f29-4235-8d91-e565cb0d157c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392091423b110eec80dbbfa9c863d035369d2ae18786c7e1464220f884ef9af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nhxr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8p5bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:31Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.007915 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c9869b0-41e2-4d5b-9492-3067503ae6bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c64c86690073c971b824a50e11615dc8efbe2426bfa418579302f876f03f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bc84dd29fc85f8f711e45387600b4be0c89
514fa5f2bdf14e0773a55db3330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7dswk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:32Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.026892 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.027000 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.027081 4739 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.027189 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.027256 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:32Z","lastTransitionTime":"2025-10-08T21:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.036799 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e
33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866
be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stati
c-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b6
8244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:32Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.052310 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:32Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.066765 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f7e4c3293c3a262fc8a264ac8325f498cd4b53b50a53040e2ff57d6e6b8140b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:32Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.078115 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168
.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:32Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.130237 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.130534 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.130622 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.130723 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.130802 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:32Z","lastTransitionTime":"2025-10-08T21:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.233278 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.233347 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.233361 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.233376 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.233387 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:32Z","lastTransitionTime":"2025-10-08T21:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.336199 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.336230 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.336239 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.336253 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.336262 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:32Z","lastTransitionTime":"2025-10-08T21:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.438126 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.438185 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.438194 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.438208 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.438217 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:32Z","lastTransitionTime":"2025-10-08T21:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.540726 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.540767 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.540776 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.540791 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.540800 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:32Z","lastTransitionTime":"2025-10-08T21:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.643629 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.643658 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.643666 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.643679 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.643687 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:32Z","lastTransitionTime":"2025-10-08T21:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.746284 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.746347 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.746358 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.746370 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.746378 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:32Z","lastTransitionTime":"2025-10-08T21:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.821164 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.821183 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:49:32 crc kubenswrapper[4739]: E1008 21:49:32.821299 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.821183 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:49:32 crc kubenswrapper[4739]: E1008 21:49:32.821420 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:49:32 crc kubenswrapper[4739]: E1008 21:49:32.821472 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.848535 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.848583 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.848593 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.848610 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.848621 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:32Z","lastTransitionTime":"2025-10-08T21:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.950388 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.950429 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.950441 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.950457 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:32 crc kubenswrapper[4739]: I1008 21:49:32.950470 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:32Z","lastTransitionTime":"2025-10-08T21:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.053022 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.053056 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.053064 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.053077 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.053086 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:33Z","lastTransitionTime":"2025-10-08T21:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.155178 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.155210 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.155218 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.155230 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.155239 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:33Z","lastTransitionTime":"2025-10-08T21:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.256690 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.256734 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.256744 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.256758 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.256768 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:33Z","lastTransitionTime":"2025-10-08T21:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.359569 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.359607 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.359616 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.359630 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.359638 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:33Z","lastTransitionTime":"2025-10-08T21:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.461807 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.461849 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.461859 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.461874 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.461883 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:33Z","lastTransitionTime":"2025-10-08T21:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.564017 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.564076 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.564094 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.564120 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.564138 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:33Z","lastTransitionTime":"2025-10-08T21:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.569848 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8629e121-2c64-4b46-adbd-ec1433ec0835-metrics-certs\") pod \"network-metrics-daemon-kdt6j\" (UID: \"8629e121-2c64-4b46-adbd-ec1433ec0835\") " pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:33 crc kubenswrapper[4739]: E1008 21:49:33.569950 4739 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 21:49:33 crc kubenswrapper[4739]: E1008 21:49:33.569996 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8629e121-2c64-4b46-adbd-ec1433ec0835-metrics-certs podName:8629e121-2c64-4b46-adbd-ec1433ec0835 nodeName:}" failed. No retries permitted until 2025-10-08 21:50:05.569983697 +0000 UTC m=+105.395369447 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8629e121-2c64-4b46-adbd-ec1433ec0835-metrics-certs") pod "network-metrics-daemon-kdt6j" (UID: "8629e121-2c64-4b46-adbd-ec1433ec0835") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.666052 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.666102 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.666113 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.666131 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.666177 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:33Z","lastTransitionTime":"2025-10-08T21:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.767756 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.767799 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.767811 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.767826 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.767838 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:33Z","lastTransitionTime":"2025-10-08T21:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.821386 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:33 crc kubenswrapper[4739]: E1008 21:49:33.821576 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.832469 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.869885 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.869914 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.869923 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.869936 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.869946 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:33Z","lastTransitionTime":"2025-10-08T21:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.971772 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.971809 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.971821 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.971836 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:33 crc kubenswrapper[4739]: I1008 21:49:33.971849 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:33Z","lastTransitionTime":"2025-10-08T21:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.074875 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.074910 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.074921 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.074936 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.074946 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:34Z","lastTransitionTime":"2025-10-08T21:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.177818 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.177847 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.177857 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.177875 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.177885 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:34Z","lastTransitionTime":"2025-10-08T21:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.280230 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.280269 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.280278 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.280292 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.280301 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:34Z","lastTransitionTime":"2025-10-08T21:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.382268 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.382304 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.382312 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.382326 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.382336 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:34Z","lastTransitionTime":"2025-10-08T21:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.484550 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.484590 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.484599 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.484612 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.484621 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:34Z","lastTransitionTime":"2025-10-08T21:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.586873 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.586925 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.586938 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.586954 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.586969 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:34Z","lastTransitionTime":"2025-10-08T21:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.688851 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.688898 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.688959 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.688981 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.688992 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:34Z","lastTransitionTime":"2025-10-08T21:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.791513 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.791551 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.791561 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.791577 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.791587 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:34Z","lastTransitionTime":"2025-10-08T21:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.821431 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.821491 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.821500 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:49:34 crc kubenswrapper[4739]: E1008 21:49:34.821632 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:49:34 crc kubenswrapper[4739]: E1008 21:49:34.821909 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:49:34 crc kubenswrapper[4739]: E1008 21:49:34.822243 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.894121 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.894179 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.894190 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.894205 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:34 crc kubenswrapper[4739]: I1008 21:49:34.894215 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:34Z","lastTransitionTime":"2025-10-08T21:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:34.997239 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:34.997455 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:34.997501 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:34.997579 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:34.997617 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:34Z","lastTransitionTime":"2025-10-08T21:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.100737 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.100791 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.100808 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.100829 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.100845 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:35Z","lastTransitionTime":"2025-10-08T21:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.203419 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.203458 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.203469 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.203483 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.203491 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:35Z","lastTransitionTime":"2025-10-08T21:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.306305 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.306337 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.306345 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.306388 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.306398 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:35Z","lastTransitionTime":"2025-10-08T21:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.409449 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.409516 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.409539 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.409569 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.409591 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:35Z","lastTransitionTime":"2025-10-08T21:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.512570 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.512633 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.512655 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.512682 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.512703 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:35Z","lastTransitionTime":"2025-10-08T21:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.615363 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.615414 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.615423 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.615438 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.615453 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:35Z","lastTransitionTime":"2025-10-08T21:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.717890 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.717955 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.717966 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.717982 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.717994 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:35Z","lastTransitionTime":"2025-10-08T21:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.819898 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.819927 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.819935 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.819946 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.819979 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:35Z","lastTransitionTime":"2025-10-08T21:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.821056 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:35 crc kubenswrapper[4739]: E1008 21:49:35.821174 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.922257 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.922287 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.922296 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.922308 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:35 crc kubenswrapper[4739]: I1008 21:49:35.922316 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:35Z","lastTransitionTime":"2025-10-08T21:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.024205 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.024262 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.024282 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.024305 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.024322 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:36Z","lastTransitionTime":"2025-10-08T21:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.127104 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.127167 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.127180 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.127195 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.127206 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:36Z","lastTransitionTime":"2025-10-08T21:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.230188 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.230223 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.230235 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.230251 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.230263 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:36Z","lastTransitionTime":"2025-10-08T21:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.333203 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.333248 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.333259 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.333276 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.333289 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:36Z","lastTransitionTime":"2025-10-08T21:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.436751 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.436831 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.436851 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.436882 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.436905 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:36Z","lastTransitionTime":"2025-10-08T21:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.539604 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.540008 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.540268 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.540469 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.540648 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:36Z","lastTransitionTime":"2025-10-08T21:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.644203 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.644276 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.644298 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.644328 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.644349 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:36Z","lastTransitionTime":"2025-10-08T21:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.747831 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.747871 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.747883 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.747897 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.747908 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:36Z","lastTransitionTime":"2025-10-08T21:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.821476 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.821546 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:36 crc kubenswrapper[4739]: E1008 21:49:36.821620 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:49:36 crc kubenswrapper[4739]: E1008 21:49:36.821745 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.821497 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:49:36 crc kubenswrapper[4739]: E1008 21:49:36.822485 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.850949 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.851000 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.851019 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.851041 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.851057 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:36Z","lastTransitionTime":"2025-10-08T21:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.954310 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.954358 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.954370 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.954387 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:36 crc kubenswrapper[4739]: I1008 21:49:36.954396 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:36Z","lastTransitionTime":"2025-10-08T21:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.056778 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.056825 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.056835 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.056848 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.056858 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:37Z","lastTransitionTime":"2025-10-08T21:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.159568 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.159627 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.159644 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.159668 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.159686 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:37Z","lastTransitionTime":"2025-10-08T21:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.263066 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.263128 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.263174 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.263199 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.263216 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:37Z","lastTransitionTime":"2025-10-08T21:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.365344 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.365408 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.365425 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.365448 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.365466 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:37Z","lastTransitionTime":"2025-10-08T21:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.469215 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.469262 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.469273 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.469288 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.469298 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:37Z","lastTransitionTime":"2025-10-08T21:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.572453 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.572521 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.572544 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.572578 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.572601 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:37Z","lastTransitionTime":"2025-10-08T21:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.681396 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.681809 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.682065 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.682296 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.682463 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:37Z","lastTransitionTime":"2025-10-08T21:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.785312 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.785366 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.785387 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.785439 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.785460 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:37Z","lastTransitionTime":"2025-10-08T21:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.821826 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:37 crc kubenswrapper[4739]: E1008 21:49:37.822011 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.888437 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.888471 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.888481 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.888496 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.888506 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:37Z","lastTransitionTime":"2025-10-08T21:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.991667 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.992053 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.992309 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.992558 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:37 crc kubenswrapper[4739]: I1008 21:49:37.992985 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:37Z","lastTransitionTime":"2025-10-08T21:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.096597 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.097323 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.097353 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.097375 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.097390 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:38Z","lastTransitionTime":"2025-10-08T21:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.200387 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.200448 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.200464 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.200490 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.200525 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:38Z","lastTransitionTime":"2025-10-08T21:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.302807 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.302869 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.302888 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.302911 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.302927 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:38Z","lastTransitionTime":"2025-10-08T21:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.353709 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.353801 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.353826 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.353863 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.353888 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:38Z","lastTransitionTime":"2025-10-08T21:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:38 crc kubenswrapper[4739]: E1008 21:49:38.375422 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06df6f1f-7503-4cee-a52c-383dcfb4609d\\\",\\\"systemUUID\\\":\\\"e102a271-0593-45af-a6b3-5b473c7eebd3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:38Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.381263 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.381317 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.381335 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.381360 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.381379 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:38Z","lastTransitionTime":"2025-10-08T21:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:38 crc kubenswrapper[4739]: E1008 21:49:38.401859 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06df6f1f-7503-4cee-a52c-383dcfb4609d\\\",\\\"systemUUID\\\":\\\"e102a271-0593-45af-a6b3-5b473c7eebd3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:38Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.407379 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.407434 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.407453 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.407477 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.407495 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:38Z","lastTransitionTime":"2025-10-08T21:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:38 crc kubenswrapper[4739]: E1008 21:49:38.429035 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06df6f1f-7503-4cee-a52c-383dcfb4609d\\\",\\\"systemUUID\\\":\\\"e102a271-0593-45af-a6b3-5b473c7eebd3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:38Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.434087 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.434169 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.434186 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.434210 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.434228 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:38Z","lastTransitionTime":"2025-10-08T21:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:38 crc kubenswrapper[4739]: E1008 21:49:38.455275 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06df6f1f-7503-4cee-a52c-383dcfb4609d\\\",\\\"systemUUID\\\":\\\"e102a271-0593-45af-a6b3-5b473c7eebd3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:38Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.460257 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.460307 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.460325 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.460351 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.460368 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:38Z","lastTransitionTime":"2025-10-08T21:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:38 crc kubenswrapper[4739]: E1008 21:49:38.484283 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06df6f1f-7503-4cee-a52c-383dcfb4609d\\\",\\\"systemUUID\\\":\\\"e102a271-0593-45af-a6b3-5b473c7eebd3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:38Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:38 crc kubenswrapper[4739]: E1008 21:49:38.484617 4739 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.487016 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.487081 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.487105 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.487136 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.487199 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:38Z","lastTransitionTime":"2025-10-08T21:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.600071 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.600122 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.600140 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.600201 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.600218 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:38Z","lastTransitionTime":"2025-10-08T21:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.702278 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.702388 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.702413 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.702445 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.702477 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:38Z","lastTransitionTime":"2025-10-08T21:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.805111 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.805174 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.805190 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.805210 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.805225 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:38Z","lastTransitionTime":"2025-10-08T21:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.821388 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.821458 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.821523 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:49:38 crc kubenswrapper[4739]: E1008 21:49:38.821641 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:49:38 crc kubenswrapper[4739]: E1008 21:49:38.821825 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:49:38 crc kubenswrapper[4739]: E1008 21:49:38.821973 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.908493 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.908536 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.908546 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.908565 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:38 crc kubenswrapper[4739]: I1008 21:49:38.908577 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:38Z","lastTransitionTime":"2025-10-08T21:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.011103 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.011392 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.011460 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.011528 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.011600 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:39Z","lastTransitionTime":"2025-10-08T21:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.113982 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.114053 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.114077 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.114106 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.114126 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:39Z","lastTransitionTime":"2025-10-08T21:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.216243 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.216526 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.216625 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.216704 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.216771 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:39Z","lastTransitionTime":"2025-10-08T21:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.319553 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.319585 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.319594 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.319607 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.319619 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:39Z","lastTransitionTime":"2025-10-08T21:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.422915 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.423006 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.423025 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.423051 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.423077 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:39Z","lastTransitionTime":"2025-10-08T21:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.526365 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.526801 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.526994 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.527216 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.527542 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:39Z","lastTransitionTime":"2025-10-08T21:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.630836 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.631271 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.631455 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.631723 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.631925 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:39Z","lastTransitionTime":"2025-10-08T21:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.735781 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.735818 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.735829 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.735843 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.735854 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:39Z","lastTransitionTime":"2025-10-08T21:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.821835 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:39 crc kubenswrapper[4739]: E1008 21:49:39.822506 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.838721 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.838756 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.838764 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.838775 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.838786 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:39Z","lastTransitionTime":"2025-10-08T21:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.941358 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.941431 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.941454 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.941482 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:39 crc kubenswrapper[4739]: I1008 21:49:39.941503 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:39Z","lastTransitionTime":"2025-10-08T21:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.044634 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.044866 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.044942 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.045028 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.045109 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:40Z","lastTransitionTime":"2025-10-08T21:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.148279 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.149257 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.149476 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.149685 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.149885 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:40Z","lastTransitionTime":"2025-10-08T21:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.252776 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.253243 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.253268 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.253297 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.253320 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:40Z","lastTransitionTime":"2025-10-08T21:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.256894 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wwt88_17ed1d5a-5f21-4dcf-bdb9-09e715f57027/kube-multus/0.log" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.256955 4739 generic.go:334] "Generic (PLEG): container finished" podID="17ed1d5a-5f21-4dcf-bdb9-09e715f57027" containerID="d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58" exitCode=1 Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.256985 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wwt88" event={"ID":"17ed1d5a-5f21-4dcf-bdb9-09e715f57027","Type":"ContainerDied","Data":"d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58"} Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.257425 4739 scope.go:117] "RemoveContainer" containerID="d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.276918 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4310c6bc-4bce-4766-a147-e5a96daa3ae6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53812375b387a71344e95f90d4f961e9e9f97f4999db1b49ff9dd111c813a69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a7577ad29caeb9249292640d53bb0a8206e4cc6859c49b973f64aca1ae98ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89bfa18678c8c402652687bafbeda4ee94a4c581be3aac3286dee2fa1555c3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e0d8d10d62c8724a2940487ae2e6ec8e2260c313b9f2286730b563051af4ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3e0d8d10d62c8724a2940487ae2e6ec8e2260c313b9f2286730b563051af4ba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:40Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.290739 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7389dea-44c5-4d76-8bc7-4d5d64df24b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a278298e9f47abde7822f1b805c6a3158710983862b6d47258e943c5dc02b6d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31424d88f9ebc081daff8ca0f9cbca6a1619d0db761a634eeb973b36cf7abd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31424d88f9ebc081daff8ca0f9cbca6a1619d0db761a634eeb973b36cf7abd67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:40Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.309935 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:40Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.325491 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:40Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.344201 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T21:49:40Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.356573 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.356605 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.356615 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.356629 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.356639 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:40Z","lastTransitionTime":"2025-10-08T21:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.362215 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:40Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.378460 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:40Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.398309 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:40Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.413342 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:40Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.429164 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:40Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.449866 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:40Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.459447 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.459577 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.459663 4739 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.459745 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.459775 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:40Z","lastTransitionTime":"2025-10-08T21:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.487070 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d8bdaa74dcda5aa06db68419e215a652d096276e2fda19194ca7463283ba038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8bdaa74dcda5aa06db68419e215a652d096276e2fda19194ca7463283ba038\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T21:49:19Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 21:49:19.695004 6389 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1008 21:49:19.695051 6389 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1008 21:49:19.695057 6389 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1008 
21:49:19.695079 6389 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 21:49:19.695108 6389 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 21:49:19.695123 6389 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1008 21:49:19.695127 6389 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1008 21:49:19.695186 6389 factory.go:656] Stopping watch factory\\\\nI1008 21:49:19.695214 6389 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1008 21:49:19.695222 6389 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1008 21:49:19.695229 6389 handler.go:208] Removed *v1.Node event handler 2\\\\nI1008 21:49:19.695237 6389 handler.go:208] Removed *v1.Node event handler 7\\\\nI1008 21:49:19.695244 6389 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1008 21:49:19.695253 6389 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 21:49:19.695259 6389 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:49:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hfhrc_openshift-ovn-kubernetes(4c6641d9-9ccf-42aa-8a83-c52d850aa766)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b
5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:40Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.503869 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kdt6j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8629e121-2c64-4b46-adbd-ec1433ec0835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kdt6j\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:40Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.535426 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:40Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.555093 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:40Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.562417 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.562462 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.562478 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.562501 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.562519 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:40Z","lastTransitionTime":"2025-10-08T21:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.576665 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f7e4c3293c3a262fc8a264ac8325f498cd4b53b50a53040e2ff57d6e6b8140b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:40Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.595805 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T21:49:40Z\\\",\\\"message\\\":\\\"2025-10-08T21:48:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4600ae73-8bc0-4215-8ea3-86827f7556af\\\\n2025-10-08T21:48:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4600ae73-8bc0-4215-8ea3-86827f7556af to /host/opt/cni/bin/\\\\n2025-10-08T21:48:54Z [verbose] multus-daemon started\\\\n2025-10-08T21:48:54Z [verbose] Readiness Indicator file check\\\\n2025-10-08T21:49:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:40Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.611091 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8p5bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3d9049-8f29-4235-8d91-e565cb0d157c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392091423b110eec80dbbfa9c863d035369d2ae18786c7e1464220f884ef9af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nhxr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8p5bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:40Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.624042 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c9869b0-41e2-4d5b-9492-3067503ae6bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c64c86690073c971b824a50e11615dc8efbe2426bfa418579302f876f03f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bc84dd29fc85f8f711e45387600b4be0c89
514fa5f2bdf14e0773a55db3330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7dswk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:40Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.664520 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.664564 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.664586 4739 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.664615 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.664636 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:40Z","lastTransitionTime":"2025-10-08T21:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.766692 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.766726 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.766738 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.766754 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.766766 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:40Z","lastTransitionTime":"2025-10-08T21:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.821557 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.821579 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.821638 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:49:40 crc kubenswrapper[4739]: E1008 21:49:40.821678 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:49:40 crc kubenswrapper[4739]: E1008 21:49:40.821766 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:49:40 crc kubenswrapper[4739]: E1008 21:49:40.821855 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.868766 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.868798 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.868807 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.868819 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.868827 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:40Z","lastTransitionTime":"2025-10-08T21:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.971386 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.971482 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.971507 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.971541 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:40 crc kubenswrapper[4739]: I1008 21:49:40.971566 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:40Z","lastTransitionTime":"2025-10-08T21:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.074835 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.074877 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.074886 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.074901 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.074913 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:41Z","lastTransitionTime":"2025-10-08T21:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.177122 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.177179 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.177191 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.177205 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.177217 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:41Z","lastTransitionTime":"2025-10-08T21:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.262720 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wwt88_17ed1d5a-5f21-4dcf-bdb9-09e715f57027/kube-multus/0.log" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.262795 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wwt88" event={"ID":"17ed1d5a-5f21-4dcf-bdb9-09e715f57027","Type":"ContainerStarted","Data":"9a3ec9cc2ce1e0c0c740753d759e9f091e402d73cb1d4f896fe843f9bfb805ea"} Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.279688 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.279959 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.280121 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.280277 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.280413 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:41Z","lastTransitionTime":"2025-10-08T21:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.286137 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:41Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.308071 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d8bdaa74dcda5aa06db68419e215a652d096276e2fda19194ca7463283ba038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8bdaa74dcda5aa06db68419e215a652d096276e2fda19194ca7463283ba038\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T21:49:19Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 21:49:19.695004 6389 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1008 21:49:19.695051 6389 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1008 21:49:19.695057 6389 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1008 
21:49:19.695079 6389 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 21:49:19.695108 6389 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 21:49:19.695123 6389 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1008 21:49:19.695127 6389 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1008 21:49:19.695186 6389 factory.go:656] Stopping watch factory\\\\nI1008 21:49:19.695214 6389 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1008 21:49:19.695222 6389 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1008 21:49:19.695229 6389 handler.go:208] Removed *v1.Node event handler 2\\\\nI1008 21:49:19.695237 6389 handler.go:208] Removed *v1.Node event handler 7\\\\nI1008 21:49:19.695244 6389 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1008 21:49:19.695253 6389 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 21:49:19.695259 6389 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:49:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hfhrc_openshift-ovn-kubernetes(4c6641d9-9ccf-42aa-8a83-c52d850aa766)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b
5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:41Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.318422 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kdt6j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8629e121-2c64-4b46-adbd-ec1433ec0835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kdt6j\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:41Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.343043 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:41Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.358422 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:41Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.375236 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f7e4c3293c3a262fc8a264ac8325f498cd4b53b50a53040e2ff57d6e6b8140b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:41Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.382952 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.383149 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.383294 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.383399 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.383486 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:41Z","lastTransitionTime":"2025-10-08T21:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.394399 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3ec9cc2ce1e0c0c740753d759e9f091e402d73cb1d4f896fe843f9bfb805ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T21:49:40Z\\\",\\\"message\\\":\\\"2025-10-08T21:48:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4600ae73-8bc0-4215-8ea3-86827f7556af\\\\n2025-10-08T21:48:52+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4600ae73-8bc0-4215-8ea3-86827f7556af to /host/opt/cni/bin/\\\\n2025-10-08T21:48:54Z [verbose] multus-daemon started\\\\n2025-10-08T21:48:54Z [verbose] Readiness Indicator file check\\\\n2025-10-08T21:49:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:41Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.403583 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8p5bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3d9049-8f29-4235-8d91-e565cb0d157c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392091423b110eec80dbbfa9c863d035369d2ae18786c7e1464220f884ef9af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nhxr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8p5bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:41Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.419841 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c9869b0-41e2-4d5b-9492-3067503ae6bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c64c86690073c971b824a50e11615dc8efbe2426bfa418579302f876f03f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bc84dd29fc85f8f711e45387600b4be0c89514fa5f2bdf14e0773a55db3330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7dswk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:41Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.432466 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4310c6bc-4bce-4766-a147-e5a96daa3ae6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53812375b387a71344e95f90d4f961e9e9f97f4999db1b49ff9dd111c813a69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a7577ad29caeb9249292640d53bb0a8206e4cc6859c49b973f64aca1ae98ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89bfa18678c8c402652687bafbeda4ee94a4c581be3aac3286dee2fa1555c3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e
0d8d10d62c8724a2940487ae2e6ec8e2260c313b9f2286730b563051af4ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e0d8d10d62c8724a2940487ae2e6ec8e2260c313b9f2286730b563051af4ba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:41Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.442770 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7389dea-44c5-4d76-8bc7-4d5d64df24b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a278298e9f47abde7822f1b805c6a3158710983862b6d47258e943c5dc02b6d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31424d88f9ebc081daff8ca0f9cbca6a1619d0db761a634eeb973b36cf7abd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31424d88f9ebc081daff8ca0f9cbca6a1619d0db761a634eeb973b36cf7abd67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:41Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.459250 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:41Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.473909 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:41Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.485464 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.485498 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.485510 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.485525 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.485536 4739 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:41Z","lastTransitionTime":"2025-10-08T21:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.486954 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:41Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.498210 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:41Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.508847 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:41Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.519265 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:41Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.531664 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:41Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.543862 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:41Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.587674 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.587726 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.587743 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.587765 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.587782 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:41Z","lastTransitionTime":"2025-10-08T21:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.690115 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.690204 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.690222 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.690247 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.690267 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:41Z","lastTransitionTime":"2025-10-08T21:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.792788 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.792824 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.792834 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.792851 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.792860 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:41Z","lastTransitionTime":"2025-10-08T21:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.821477 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:41 crc kubenswrapper[4739]: E1008 21:49:41.822459 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.853466 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:41Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.876542 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:41Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.896502 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.896760 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.896936 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.897246 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.897426 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:41Z","lastTransitionTime":"2025-10-08T21:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.901035 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f7e4c3293c3a262fc8a264ac8325f498cd4b53b50a53040e2ff57d6e6b8140b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:41Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.923518 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3ec9cc2ce1e0c0c740753d759e9f091e402d73cb1d4f896fe843f9bfb805ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T21:49:40Z\\\",\\\"message\\\":\\\"2025-10-08T21:48:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4600ae73-8bc0-4215-8ea3-86827f7556af\\\\n2025-10-08T21:48:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4600ae73-8bc0-4215-8ea3-86827f7556af to /host/opt/cni/bin/\\\\n2025-10-08T21:48:54Z [verbose] multus-daemon started\\\\n2025-10-08T21:48:54Z [verbose] Readiness Indicator file check\\\\n2025-10-08T21:49:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-
lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:41Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.939553 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8p5bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3d9049-8f29-4235-8d91-e565cb0d157c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392091423b110eec80dbbfa9c863d035369d2ae18786c7e1464220f884ef9af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nhxr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8p5bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:41Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.958070 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c9869b0-41e2-4d5b-9492-3067503ae6bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c64c86690073c971b824a50e11615dc8efbe2426bfa418579302f876f03f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bc84dd29fc85f8f711e45387600b4be0c89514fa5f2bdf14e0773a55db3330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7dswk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:41Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:41 crc kubenswrapper[4739]: I1008 21:49:41.978680 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4310c6bc-4bce-4766-a147-e5a96daa3ae6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53812375b387a71344e95f90d4f961e9e9f97f4999db1b49ff9dd111c813a69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a7577ad29caeb9249292640d53bb0a8206e4cc6859c49b973f64aca1ae98ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89bfa18678c8c402652687bafbeda4ee94a4c581be3aac3286dee2fa1555c3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e
0d8d10d62c8724a2940487ae2e6ec8e2260c313b9f2286730b563051af4ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e0d8d10d62c8724a2940487ae2e6ec8e2260c313b9f2286730b563051af4ba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:41Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.000048 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.000093 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.000110 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.000135 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.000288 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:42Z","lastTransitionTime":"2025-10-08T21:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.004193 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7389dea-44c5-4d76-8bc7-4d5d64df24b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a278298e9f47abde7822f1b805c6a3158710983862b6d47258e943c5dc02b6d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d
06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31424d88f9ebc081daff8ca0f9cbca6a1619d0db761a634eeb973b36cf7abd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31424d88f9ebc081daff8ca0f9cbca6a1619d0db761a634eeb973b36cf7abd67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:42Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:42 crc 
kubenswrapper[4739]: I1008 21:49:42.025385 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:42Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.050943 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:42Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.068590 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T21:49:42Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.086401 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:42Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.104546 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 
21:49:42.104830 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.104868 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.104945 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.105098 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:42Z","lastTransitionTime":"2025-10-08T21:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.106038 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:42Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.127463 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:42Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.148730 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:42Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.164251 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:42Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.188251 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:42Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.208717 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.208773 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.208790 4739 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.208816 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.208835 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:42Z","lastTransitionTime":"2025-10-08T21:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.220322 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d8bdaa74dcda5aa06db68419e215a652d096276e2fda19194ca7463283ba038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8bdaa74dcda5aa06db68419e215a652d096276e2fda19194ca7463283ba038\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T21:49:19Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 21:49:19.695004 6389 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1008 21:49:19.695051 6389 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1008 21:49:19.695057 6389 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1008 
21:49:19.695079 6389 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 21:49:19.695108 6389 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 21:49:19.695123 6389 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1008 21:49:19.695127 6389 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1008 21:49:19.695186 6389 factory.go:656] Stopping watch factory\\\\nI1008 21:49:19.695214 6389 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1008 21:49:19.695222 6389 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1008 21:49:19.695229 6389 handler.go:208] Removed *v1.Node event handler 2\\\\nI1008 21:49:19.695237 6389 handler.go:208] Removed *v1.Node event handler 7\\\\nI1008 21:49:19.695244 6389 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1008 21:49:19.695253 6389 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 21:49:19.695259 6389 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:49:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hfhrc_openshift-ovn-kubernetes(4c6641d9-9ccf-42aa-8a83-c52d850aa766)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b
5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:42Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.240777 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kdt6j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8629e121-2c64-4b46-adbd-ec1433ec0835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kdt6j\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:42Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.311516 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.311603 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.311626 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.311659 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.311683 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:42Z","lastTransitionTime":"2025-10-08T21:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.414698 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.414782 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.414806 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.414840 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.414863 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:42Z","lastTransitionTime":"2025-10-08T21:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.517762 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.517805 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.517816 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.517830 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.517840 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:42Z","lastTransitionTime":"2025-10-08T21:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.621385 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.621461 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.621484 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.621515 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.621539 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:42Z","lastTransitionTime":"2025-10-08T21:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.724857 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.724927 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.724948 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.724970 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.724988 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:42Z","lastTransitionTime":"2025-10-08T21:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.821404 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.821607 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:49:42 crc kubenswrapper[4739]: E1008 21:49:42.821729 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.821798 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:49:42 crc kubenswrapper[4739]: E1008 21:49:42.821968 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:49:42 crc kubenswrapper[4739]: E1008 21:49:42.822025 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.827816 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.827862 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.827880 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.827903 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.827920 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:42Z","lastTransitionTime":"2025-10-08T21:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.931343 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.931408 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.931435 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.931468 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:42 crc kubenswrapper[4739]: I1008 21:49:42.931493 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:42Z","lastTransitionTime":"2025-10-08T21:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.034902 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.034960 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.034977 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.035033 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.035051 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:43Z","lastTransitionTime":"2025-10-08T21:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.139232 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.139295 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.139314 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.139341 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.139360 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:43Z","lastTransitionTime":"2025-10-08T21:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.242582 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.243433 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.243526 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.243572 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.243601 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:43Z","lastTransitionTime":"2025-10-08T21:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.347089 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.347260 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.347292 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.347330 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.347356 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:43Z","lastTransitionTime":"2025-10-08T21:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.450065 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.450190 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.450211 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.450244 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.450268 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:43Z","lastTransitionTime":"2025-10-08T21:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.553695 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.553765 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.553787 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.553816 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.553840 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:43Z","lastTransitionTime":"2025-10-08T21:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.656141 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.656219 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.656235 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.656255 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.656275 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:43Z","lastTransitionTime":"2025-10-08T21:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.758852 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.758923 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.758940 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.758962 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.758979 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:43Z","lastTransitionTime":"2025-10-08T21:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.821710 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:43 crc kubenswrapper[4739]: E1008 21:49:43.821897 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.861700 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.861788 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.861805 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.861826 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.861843 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:43Z","lastTransitionTime":"2025-10-08T21:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.965421 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.965492 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.965512 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.965537 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:43 crc kubenswrapper[4739]: I1008 21:49:43.965556 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:43Z","lastTransitionTime":"2025-10-08T21:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.068872 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.068928 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.068943 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.068967 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.068985 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:44Z","lastTransitionTime":"2025-10-08T21:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.172409 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.172491 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.172510 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.172535 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.172555 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:44Z","lastTransitionTime":"2025-10-08T21:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.274800 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.274880 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.274899 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.274922 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.274939 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:44Z","lastTransitionTime":"2025-10-08T21:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.378431 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.378509 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.378532 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.378563 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.378586 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:44Z","lastTransitionTime":"2025-10-08T21:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.481231 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.481306 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.481328 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.481357 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.481380 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:44Z","lastTransitionTime":"2025-10-08T21:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.584814 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.584873 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.584890 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.584913 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.584930 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:44Z","lastTransitionTime":"2025-10-08T21:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.684791 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.684938 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:44 crc kubenswrapper[4739]: E1008 21:49:44.685035 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:48.685002553 +0000 UTC m=+148.510388303 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:49:44 crc kubenswrapper[4739]: E1008 21:49:44.685049 4739 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 21:49:44 crc kubenswrapper[4739]: E1008 21:49:44.685128 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 21:50:48.685105315 +0000 UTC m=+148.510491095 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.685213 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:44 crc kubenswrapper[4739]: E1008 21:49:44.685378 4739 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 21:49:44 crc kubenswrapper[4739]: E1008 21:49:44.685452 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 21:50:48.685435224 +0000 UTC m=+148.510821004 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.687274 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.687323 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.687340 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.687362 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.687378 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:44Z","lastTransitionTime":"2025-10-08T21:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.786000 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.786083 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:49:44 crc kubenswrapper[4739]: E1008 21:49:44.786288 4739 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 21:49:44 crc kubenswrapper[4739]: E1008 21:49:44.786318 4739 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 21:49:44 crc kubenswrapper[4739]: E1008 21:49:44.786337 4739 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 21:49:44 crc kubenswrapper[4739]: E1008 21:49:44.786338 4739 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 21:49:44 crc 
kubenswrapper[4739]: E1008 21:49:44.786408 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 21:50:48.786388295 +0000 UTC m=+148.611774085 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 21:49:44 crc kubenswrapper[4739]: E1008 21:49:44.786451 4739 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 21:49:44 crc kubenswrapper[4739]: E1008 21:49:44.786480 4739 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 21:49:44 crc kubenswrapper[4739]: E1008 21:49:44.786685 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 21:50:48.786646521 +0000 UTC m=+148.612032311 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.790198 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.790277 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.790302 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.790331 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.790353 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:44Z","lastTransitionTime":"2025-10-08T21:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.821182 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.821235 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.821197 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:49:44 crc kubenswrapper[4739]: E1008 21:49:44.821348 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:49:44 crc kubenswrapper[4739]: E1008 21:49:44.821412 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:49:44 crc kubenswrapper[4739]: E1008 21:49:44.821472 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.893648 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.893718 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.893736 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.893760 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.893777 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:44Z","lastTransitionTime":"2025-10-08T21:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.996999 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.997517 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.997544 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.997567 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:44 crc kubenswrapper[4739]: I1008 21:49:44.997586 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:44Z","lastTransitionTime":"2025-10-08T21:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.100855 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.100922 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.100942 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.101018 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.101045 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:45Z","lastTransitionTime":"2025-10-08T21:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.203955 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.204015 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.204032 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.204058 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.204077 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:45Z","lastTransitionTime":"2025-10-08T21:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.307274 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.307331 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.307346 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.307367 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.307382 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:45Z","lastTransitionTime":"2025-10-08T21:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.410004 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.410102 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.410119 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.410171 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.410190 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:45Z","lastTransitionTime":"2025-10-08T21:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.513928 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.513987 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.514005 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.514029 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.514046 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:45Z","lastTransitionTime":"2025-10-08T21:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.616463 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.616520 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.616570 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.616596 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.616613 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:45Z","lastTransitionTime":"2025-10-08T21:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.719894 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.719951 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.719967 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.719988 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.720005 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:45Z","lastTransitionTime":"2025-10-08T21:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.821279 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:45 crc kubenswrapper[4739]: E1008 21:49:45.821487 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.822765 4739 scope.go:117] "RemoveContainer" containerID="2d8bdaa74dcda5aa06db68419e215a652d096276e2fda19194ca7463283ba038" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.823394 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.823449 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.823465 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.823488 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.823505 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:45Z","lastTransitionTime":"2025-10-08T21:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.926064 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.926138 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.926194 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.926234 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:45 crc kubenswrapper[4739]: I1008 21:49:45.926256 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:45Z","lastTransitionTime":"2025-10-08T21:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.029763 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.029836 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.029858 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.029885 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.029906 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:46Z","lastTransitionTime":"2025-10-08T21:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.132889 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.132944 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.132962 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.132990 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.133009 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:46Z","lastTransitionTime":"2025-10-08T21:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.237432 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.237488 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.237505 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.237527 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.237544 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:46Z","lastTransitionTime":"2025-10-08T21:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.341400 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.341470 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.341494 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.341525 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.341550 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:46Z","lastTransitionTime":"2025-10-08T21:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.445100 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.445180 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.445206 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.445226 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.445237 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:46Z","lastTransitionTime":"2025-10-08T21:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.547327 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.547359 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.547369 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.547384 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.547395 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:46Z","lastTransitionTime":"2025-10-08T21:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.650354 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.650390 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.650398 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.650412 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.650423 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:46Z","lastTransitionTime":"2025-10-08T21:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.752637 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.752682 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.752693 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.752709 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.752722 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:46Z","lastTransitionTime":"2025-10-08T21:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.821561 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.821607 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:46 crc kubenswrapper[4739]: E1008 21:49:46.821683 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:49:46 crc kubenswrapper[4739]: E1008 21:49:46.821769 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.821874 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:49:46 crc kubenswrapper[4739]: E1008 21:49:46.821970 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.855503 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.855554 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.855566 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.855586 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.855598 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:46Z","lastTransitionTime":"2025-10-08T21:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.959770 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.959840 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.959856 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.959883 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:46 crc kubenswrapper[4739]: I1008 21:49:46.959900 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:46Z","lastTransitionTime":"2025-10-08T21:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.062729 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.062763 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.062773 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.062788 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.062799 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:47Z","lastTransitionTime":"2025-10-08T21:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.166242 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.166320 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.166344 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.166377 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.166400 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:47Z","lastTransitionTime":"2025-10-08T21:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.269360 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.269388 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.269397 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.269410 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.269418 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:47Z","lastTransitionTime":"2025-10-08T21:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.287211 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hfhrc_4c6641d9-9ccf-42aa-8a83-c52d850aa766/ovnkube-controller/3.log" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.287775 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hfhrc_4c6641d9-9ccf-42aa-8a83-c52d850aa766/ovnkube-controller/2.log" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.290562 4739 generic.go:334] "Generic (PLEG): container finished" podID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerID="1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a" exitCode=1 Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.290595 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" event={"ID":"4c6641d9-9ccf-42aa-8a83-c52d850aa766","Type":"ContainerDied","Data":"1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a"} Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.290623 4739 scope.go:117] "RemoveContainer" containerID="2d8bdaa74dcda5aa06db68419e215a652d096276e2fda19194ca7463283ba038" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.291464 4739 scope.go:117] "RemoveContainer" containerID="1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a" Oct 08 21:49:47 crc kubenswrapper[4739]: E1008 21:49:47.291647 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-hfhrc_openshift-ovn-kubernetes(4c6641d9-9ccf-42aa-8a83-c52d850aa766)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.306898 4739 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:47Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.322029 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:47Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.334079 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:47Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.343475 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:47Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.356309 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:47Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.372061 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.372098 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.372110 4739 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.372128 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.372140 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:47Z","lastTransitionTime":"2025-10-08T21:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.376591 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8bdaa74dcda5aa06db68419e215a652d096276e2fda19194ca7463283ba038\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T21:49:19Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 21:49:19.695004 6389 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1008 21:49:19.695051 6389 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1008 21:49:19.695057 6389 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1008 
21:49:19.695079 6389 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 21:49:19.695108 6389 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 21:49:19.695123 6389 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1008 21:49:19.695127 6389 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1008 21:49:19.695186 6389 factory.go:656] Stopping watch factory\\\\nI1008 21:49:19.695214 6389 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1008 21:49:19.695222 6389 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1008 21:49:19.695229 6389 handler.go:208] Removed *v1.Node event handler 2\\\\nI1008 21:49:19.695237 6389 handler.go:208] Removed *v1.Node event handler 7\\\\nI1008 21:49:19.695244 6389 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1008 21:49:19.695253 6389 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 21:49:19.695259 6389 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:49:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T21:49:47Z\\\",\\\"message\\\":\\\"ID: UUIDName:}]\\\\nI1008 21:49:47.080023 6784 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-image-registry/image-registry_TCP_cluster\\\\\\\", UUID:\\\\\\\"83c1e277-3d22-42ae-a355-f7a0ff0bd171\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}, Opts:services.LBOpts{Reject:false, 
EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-image-registry/image-registry_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.93\\\\\\\", Port:5000, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1008 21:49:47.080078 6784 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:49:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a
84c5e42b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:47Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.389824 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kdt6j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8629e121-2c64-4b46-adbd-ec1433ec0835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kdt6j\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:47Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.402737 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c9869b0-41e2-4d5b-9492-3067503ae6bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c64c86690073c971b824a50e11615dc8efbe2426bfa418579302f876f03f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2025-10-08T21:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bc84dd29fc85f8f711e45387600b4be0c89514fa5f2bdf14e0773a55db3330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7dswk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-08T21:49:47Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.430871 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},
{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"star
tedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:47Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.447048 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:47Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.460949 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f7e4c3293c3a262fc8a264ac8325f498cd4b53b50a53040e2ff57d6e6b8140b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:47Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.474615 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.474658 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.474670 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.474687 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.474700 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:47Z","lastTransitionTime":"2025-10-08T21:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.479332 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3ec9cc2ce1e0c0c740753d759e9f091e402d73cb1d4f896fe843f9bfb805ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T21:49:40Z\\\",\\\"message\\\":\\\"2025-10-08T21:48:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4600ae73-8bc0-4215-8ea3-86827f7556af\\\\n2025-10-08T21:48:52+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4600ae73-8bc0-4215-8ea3-86827f7556af to /host/opt/cni/bin/\\\\n2025-10-08T21:48:54Z [verbose] multus-daemon started\\\\n2025-10-08T21:48:54Z [verbose] Readiness Indicator file check\\\\n2025-10-08T21:49:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:47Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.490976 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8p5bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3d9049-8f29-4235-8d91-e565cb0d157c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392091423b110eec80dbbfa9c863d035369d2ae18786c7e1464220f884ef9af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nhxr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8p5bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:47Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.502973 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:47Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.517962 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4310c6bc-4bce-4766-a147-e5a96daa3ae6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53812375b387a71344e95f90d4f961e9e9f97f4999db1b49ff9dd111c813a69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"re
source-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a7577ad29caeb9249292640d53bb0a8206e4cc6859c49b973f64aca1ae98ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89bfa18678c8c402652687bafbeda4ee94a4c581be3aac3286dee2fa1555c3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e0d8d10d62c8724a2940487ae2e6ec8e2260c313b9f2286730b563051af4ba9\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e0d8d10d62c8724a2940487ae2e6ec8e2260c313b9f2286730b563051af4ba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:47Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.529723 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7389dea-44c5-4d76-8bc7-4d5d64df24b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a278298e9f47abde7822f1b805c6a3158710983862b6d47258e943c5dc02b6d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31424d88f9ebc081daff8ca0f9cbca6a1619d0db761a634eeb973b36cf7abd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31424d88f9ebc081daff8ca0f9cbca6a1619d0db761a634eeb973b36cf7abd67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:47Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.543220 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:47Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.559386 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:47Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.571683 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T21:49:47Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.578182 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.578249 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.578270 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.578300 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.578318 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:47Z","lastTransitionTime":"2025-10-08T21:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.680923 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.681000 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.681023 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.681054 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.681080 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:47Z","lastTransitionTime":"2025-10-08T21:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.784410 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.784485 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.784509 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.784536 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.784554 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:47Z","lastTransitionTime":"2025-10-08T21:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.820860 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:47 crc kubenswrapper[4739]: E1008 21:49:47.821059 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.888777 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.888877 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.888897 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.888959 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.888979 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:47Z","lastTransitionTime":"2025-10-08T21:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.991505 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.991561 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.991581 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.991608 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:47 crc kubenswrapper[4739]: I1008 21:49:47.991625 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:47Z","lastTransitionTime":"2025-10-08T21:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.094854 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.094918 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.094934 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.094959 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.094977 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:48Z","lastTransitionTime":"2025-10-08T21:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.198262 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.198329 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.198347 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.198370 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.198386 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:48Z","lastTransitionTime":"2025-10-08T21:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.296912 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hfhrc_4c6641d9-9ccf-42aa-8a83-c52d850aa766/ovnkube-controller/3.log" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.300363 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.300542 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.300755 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.300937 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.301080 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:48Z","lastTransitionTime":"2025-10-08T21:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.403602 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.403649 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.403662 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.403679 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.403691 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:48Z","lastTransitionTime":"2025-10-08T21:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.506087 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.506127 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.506138 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.506192 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.506216 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:48Z","lastTransitionTime":"2025-10-08T21:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.608813 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.608870 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.608892 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.608913 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.608930 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:48Z","lastTransitionTime":"2025-10-08T21:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.631626 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.631683 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.631701 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.631726 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.631746 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:48Z","lastTransitionTime":"2025-10-08T21:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:48 crc kubenswrapper[4739]: E1008 21:49:48.650945 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06df6f1f-7503-4cee-a52c-383dcfb4609d\\\",\\\"systemUUID\\\":\\\"e102a271-0593-45af-a6b3-5b473c7eebd3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:48Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.655831 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.655901 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.655920 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.655947 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.655966 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:48Z","lastTransitionTime":"2025-10-08T21:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:48 crc kubenswrapper[4739]: E1008 21:49:48.677282 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06df6f1f-7503-4cee-a52c-383dcfb4609d\\\",\\\"systemUUID\\\":\\\"e102a271-0593-45af-a6b3-5b473c7eebd3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:48Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.681582 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.681653 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.681736 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.681774 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.681799 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:48Z","lastTransitionTime":"2025-10-08T21:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:48 crc kubenswrapper[4739]: E1008 21:49:48.704389 4739 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T21:49:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06df6f1f-7503-4cee-a52c-383dcfb4609d\\\",\\\"systemUUID\\\":\\\"e102a271-0593-45af-a6b3-5b473c7eebd3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:48Z is after 2025-08-24T17:21:41Z"
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06df6f1f-7503-4cee-a52c-383dcfb4609d\\\",\\\"systemUUID\\\":\\\"e102a271-0593-45af-a6b3-5b473c7eebd3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:48Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:48 crc kubenswrapper[4739]: E1008 21:49:48.753623 4739 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.755478 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.755528 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.755549 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.755572 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.755589 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:48Z","lastTransitionTime":"2025-10-08T21:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.821578 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.821657 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.821590 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:49:48 crc kubenswrapper[4739]: E1008 21:49:48.821795 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:49:48 crc kubenswrapper[4739]: E1008 21:49:48.821898 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:49:48 crc kubenswrapper[4739]: E1008 21:49:48.822010 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.858695 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.858745 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.858756 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.858773 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.858788 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:48Z","lastTransitionTime":"2025-10-08T21:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.961981 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.962055 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.962078 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.962106 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:48 crc kubenswrapper[4739]: I1008 21:49:48.962127 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:48Z","lastTransitionTime":"2025-10-08T21:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.064676 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.064728 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.064745 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.064767 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.064784 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:49Z","lastTransitionTime":"2025-10-08T21:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.167379 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.167454 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.167487 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.167517 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.167537 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:49Z","lastTransitionTime":"2025-10-08T21:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.271135 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.271238 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.271256 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.271282 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.271299 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:49Z","lastTransitionTime":"2025-10-08T21:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.373809 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.373872 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.373888 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.373914 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.373933 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:49Z","lastTransitionTime":"2025-10-08T21:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.476704 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.476824 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.476860 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.476907 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.476933 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:49Z","lastTransitionTime":"2025-10-08T21:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.580919 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.580978 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.580994 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.581017 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.581078 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:49Z","lastTransitionTime":"2025-10-08T21:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.683732 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.683794 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.683812 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.683837 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.683855 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:49Z","lastTransitionTime":"2025-10-08T21:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.786231 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.786293 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.786309 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.786331 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.786348 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:49Z","lastTransitionTime":"2025-10-08T21:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.821048 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:49 crc kubenswrapper[4739]: E1008 21:49:49.821255 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.889008 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.889050 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.889062 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.889078 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.889091 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:49Z","lastTransitionTime":"2025-10-08T21:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.992624 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.992689 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.992701 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.992722 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:49 crc kubenswrapper[4739]: I1008 21:49:49.992735 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:49Z","lastTransitionTime":"2025-10-08T21:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.095695 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.095747 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.095759 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.095814 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.095828 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:50Z","lastTransitionTime":"2025-10-08T21:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.199245 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.199310 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.199330 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.199354 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.199372 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:50Z","lastTransitionTime":"2025-10-08T21:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.302813 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.302964 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.303026 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.303056 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.303074 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:50Z","lastTransitionTime":"2025-10-08T21:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.406202 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.406273 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.406295 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.406326 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.406354 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:50Z","lastTransitionTime":"2025-10-08T21:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.509255 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.509313 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.509354 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.509384 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.509409 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:50Z","lastTransitionTime":"2025-10-08T21:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.612612 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.612694 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.612719 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.612748 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.612771 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:50Z","lastTransitionTime":"2025-10-08T21:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.716272 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.716380 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.716397 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.716424 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.716440 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:50Z","lastTransitionTime":"2025-10-08T21:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.819498 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.819576 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.819598 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.819624 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.819645 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:50Z","lastTransitionTime":"2025-10-08T21:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.820701 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.820763 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.820779 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:50 crc kubenswrapper[4739]: E1008 21:49:50.820858 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:49:50 crc kubenswrapper[4739]: E1008 21:49:50.821013 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:49:50 crc kubenswrapper[4739]: E1008 21:49:50.821186 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.922431 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.922494 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.922511 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.922535 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:50 crc kubenswrapper[4739]: I1008 21:49:50.922554 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:50Z","lastTransitionTime":"2025-10-08T21:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.025450 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.025491 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.025499 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.025513 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.025523 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:51Z","lastTransitionTime":"2025-10-08T21:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.128722 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.128766 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.128782 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.128804 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.128820 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:51Z","lastTransitionTime":"2025-10-08T21:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.231613 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.231669 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.231687 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.231712 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.231732 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:51Z","lastTransitionTime":"2025-10-08T21:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.334765 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.334847 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.334870 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.334910 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.334929 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:51Z","lastTransitionTime":"2025-10-08T21:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.437449 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.437488 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.437501 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.437522 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.437545 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:51Z","lastTransitionTime":"2025-10-08T21:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.540790 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.540853 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.540871 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.540896 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.540913 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:51Z","lastTransitionTime":"2025-10-08T21:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.644083 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.644140 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.644186 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.644211 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.644229 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:51Z","lastTransitionTime":"2025-10-08T21:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.747212 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.747288 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.747310 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.747340 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.747364 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:51Z","lastTransitionTime":"2025-10-08T21:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.821361 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:51 crc kubenswrapper[4739]: E1008 21:49:51.822745 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.840939 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4310c6bc-4bce-4766-a147-e5a96daa3ae6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53812375b387a71344e95f90d4f961e9e9f97f4999db1b49ff9dd111c813a69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\
\\"containerID\\\":\\\"cri-o://e3a7577ad29caeb9249292640d53bb0a8206e4cc6859c49b973f64aca1ae98ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89bfa18678c8c402652687bafbeda4ee94a4c581be3aac3286dee2fa1555c3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e0d8d10d62c8724a2940487ae2e6ec8e2260c313b9f2286730b563051af4ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e0d8d10d62c8724a2940487ae2e6ec8e2260c313b9f2286730b563051af4ba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.851031 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.851134 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.851210 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.851240 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.851264 4739 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:51Z","lastTransitionTime":"2025-10-08T21:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.857806 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7389dea-44c5-4d76-8bc7-4d5d64df24b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a278298e9f47abde7822f1b805c6a3158710983862b6d47258e943c5dc02b6d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31424d88f9ebc081daff8ca0f9cbca6a1619d0db761a634eeb973b36cf7abd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31424d88f9ebc081daff8ca0f9cbca6a1619d0db761a634eeb973b36cf7abd67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.877876 4739 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.897014 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e56673121be92cb5663f088561d4924c6df7cb6f5ffefb0dd71d1efdca4bdc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fda9dad6f7fb791fe0e33063fc2c22fe56376d82f86a2e4c0b0eee0e35d4fc9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.914474 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bdb57654acd2433f2ee44a33b32c2411153687bbcbfb1e7743e070c9e36a1da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T21:49:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.932853 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9707b708-016c-4e06-86db-0332e2ca37db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec8c6b1474a27fc2f62348c05bf5db67869b581b1daefefd98d0c1771b99ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45972\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwvs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.953013 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c62d18-cfa9-4fa0-917e-5dc5bac085f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11bb432884e83cd38189b4d77b19e391ef8d9ba0c9504c419f50fc47cdbccaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2190e8189609681a8420d27217edc0e5d3b0f1ff8bfbe44661ff81cba4b5ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bb9f93bd7f2a920b8eb93e86918302b65226838a4b99ec6615885e1bb260a17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ec8252faadf6a7eb40c909925a03ab1a1ed2973f4c9e9ef89014cbe3153b9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.954282 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.954350 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.954368 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.954395 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.954413 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:51Z","lastTransitionTime":"2025-10-08T21:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.973974 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:51 crc kubenswrapper[4739]: I1008 21:49:51.994120 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:51Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.010794 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jh2pw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b697a648-053d-4e99-97a9-620dd8397aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a5879041b49c693931dee4befcf04f06dfcc2834487b9ef650081f16f40325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jh2pw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:52Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.033680 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa1c7d32-034f-414c-9252-ca30f9d961ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76916f5c7302d177118116c8364b1fd6b755be091ce4da3593480d7cdb7ddcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://267fd4c97c3fecaa07d012a1ab42d404728c4b6cba77dff6a1db9a506cbaf1d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://365848b77ebb2f46bde5351ceded0d6cacb1ae16a80cf6be407d49bf5ee29111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de4db3e998f57a23840defbf7b2920e387c1151b6ca4a1bd891e636ac527879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c516856f0b251ad56af70ffbed34d4386447582f9e8b8f6c2dd83d1c67b057b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1008 21:48:25.393659 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 21:48:25.397881 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2502808576/tls.crt::/tmp/serving-cert-2502808576/tls.key\\\\\\\"\\\\nI1008 21:48:40.380903 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 21:48:40.383358 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 21:48:40.383374 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 21:48:40.383391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 21:48:40.383396 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1008 21:48:40.392925 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1008 21:48:40.392950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW1008 21:48:40.392971 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392977 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1008 21:48:40.392981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1008 21:48:40.392985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1008 21:48:40.392989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1008 21:48:40.393007 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1008 21:48:40.395287 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f6e588f46b11f23e5ee02720cf7435bda75b14b9ef8005f2e97d5f6ea298a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ec5537cff63e79304556d67b81bf4efb194776e91d4b7c51c1d906077de922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:52Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.056784 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.056833 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.056848 4739 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.056873 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.056889 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:52Z","lastTransitionTime":"2025-10-08T21:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.074116 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c6641d9-9ccf-42aa-8a83-c52d850aa766\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d8bdaa74dcda5aa06db68419e215a652d096276e2fda19194ca7463283ba038\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T21:49:19Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 21:49:19.695004 6389 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1008 21:49:19.695051 6389 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1008 21:49:19.695057 6389 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1008 
21:49:19.695079 6389 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 21:49:19.695108 6389 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 21:49:19.695123 6389 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1008 21:49:19.695127 6389 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1008 21:49:19.695186 6389 factory.go:656] Stopping watch factory\\\\nI1008 21:49:19.695214 6389 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1008 21:49:19.695222 6389 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1008 21:49:19.695229 6389 handler.go:208] Removed *v1.Node event handler 2\\\\nI1008 21:49:19.695237 6389 handler.go:208] Removed *v1.Node event handler 7\\\\nI1008 21:49:19.695244 6389 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1008 21:49:19.695253 6389 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 21:49:19.695259 6389 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:49:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T21:49:47Z\\\",\\\"message\\\":\\\"ID: UUIDName:}]\\\\nI1008 21:49:47.080023 6784 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-image-registry/image-registry_TCP_cluster\\\\\\\", UUID:\\\\\\\"83c1e277-3d22-42ae-a355-f7a0ff0bd171\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}, Opts:services.LBOpts{Reject:false, 
EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-image-registry/image-registry_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.93\\\\\\\", Port:5000, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1008 21:49:47.080078 6784 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:49:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a
84c5e42b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrwfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hfhrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:52Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.092950 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kdt6j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8629e121-2c64-4b46-adbd-ec1433ec0835\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgc9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kdt6j\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:52Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.125581 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf66f00f-0f40-4f3d-8c91-e089580a7f1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c207cfd4786dc700d21fd4a853cac064df569280f6fc01c060ba75b452aaefe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67afdf5fc10fb73b3f0b39f93604c78486b4f875161e5e18bb37bc2ccec6051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1ee5ff9cad700b0eb83a7e34da4e84a1d219942ceb0cdc4625933e60a65011c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2f2bff65aad2faae81efc40b
168f33ebe1069f5865d7657f9763c9024bc45ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e077f0d455bc9a7687cd8491ebc499aea1b0b371dfc8d086cad435af6202ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44571fca28011f0123a7813a14d2a78308052d9d45ef835e72244474a85034a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d40a893a15d6aa38461d8611750a720c06f2066284104e5db2fed0b4d3c281\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://98571f2547d9da0480604d77508c12e0dfa6df8064b68244ca887003db53bc60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:52Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.146580 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d212e6f2f3c608e216ac6d34ec4367492734cd9602e050d6121788e640b8922a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:52Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.159285 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.159330 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.159346 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.159369 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.159426 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:52Z","lastTransitionTime":"2025-10-08T21:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.198373 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6074d7a-f433-42bf-8c80-71963ba57484\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f7e4c3293c3a262fc8a264ac8325f498cd4b53b50a53040e2ff57d6e6b8140b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78650619e5caa59b1b0244de6d2c07a06658daa682cc48ad85deb9670f2eaed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://ac9269202dafbae70a23058815a942127a2d4c91184e2359562e1a24a5b67e8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b56d0db87018c2583c83e2e17234c64ee436addef6975d93ddf8a4f31ecdafe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33da114ba174f1496ccae7709a723aabc0de8b73cda0d070eb7babc31be6db3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c2887ce10b884662ea3e6d866969dbe44eb6167ee85bf62f3563ebb2cfaa8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://767bda914b1e2157d19614fa35163df7b6af200f9ac9b98a0e11af100312ae4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T21:48:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkj9s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hjvjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:52Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.218684 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wwt88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17ed1d5a-5f21-4dcf-bdb9-09e715f57027\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3ec9cc2ce1e0c0c740753d759e9f091e402d73cb1d4f896fe843f9bfb805ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T21:49:40Z\\\",\\\"message\\\":\\\"2025-10-08T21:48:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4600ae73-8bc0-4215-8ea3-86827f7556af\\\\n2025-10-08T21:48:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4600ae73-8bc0-4215-8ea3-86827f7556af to /host/opt/cni/bin/\\\\n2025-10-08T21:48:54Z [verbose] multus-daemon started\\\\n2025-10-08T21:48:54Z [verbose] Readiness Indicator file check\\\\n2025-10-08T21:49:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T21:48:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-
lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95gzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wwt88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:52Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.232756 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8p5bp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3d9049-8f29-4235-8d91-e565cb0d157c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:48:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392091423b110eec80dbbfa9c863d035369d2ae18786c7e1464220f884ef9af8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:48:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nhxr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:48:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8p5bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:52Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.249096 4739 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c9869b0-41e2-4d5b-9492-3067503ae6bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T21:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33c64c86690073c971b824a50e11615dc8efbe2426bfa418579302f876f03f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bc84dd29fc85f8f711e45387600b4be0c89514fa5f2bdf14e0773a55db3330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T21:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8xzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T21:49:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7dswk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T21:49:52Z is after 2025-08-24T17:21:41Z" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.262478 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.262528 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.262546 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.262569 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.262589 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:52Z","lastTransitionTime":"2025-10-08T21:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.366054 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.366458 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.366622 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.366782 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.366915 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:52Z","lastTransitionTime":"2025-10-08T21:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.469198 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.469273 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.469299 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.469325 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.469345 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:52Z","lastTransitionTime":"2025-10-08T21:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.571808 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.571851 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.571867 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.571890 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.571906 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:52Z","lastTransitionTime":"2025-10-08T21:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.674099 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.674193 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.674213 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.674235 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.674252 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:52Z","lastTransitionTime":"2025-10-08T21:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.777098 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.777197 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.777216 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.777238 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.777254 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:52Z","lastTransitionTime":"2025-10-08T21:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.821547 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.821680 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.821675 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:49:52 crc kubenswrapper[4739]: E1008 21:49:52.821895 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:49:52 crc kubenswrapper[4739]: E1008 21:49:52.822312 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:49:52 crc kubenswrapper[4739]: E1008 21:49:52.822517 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.881254 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.881316 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.881335 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.881365 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.881383 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:52Z","lastTransitionTime":"2025-10-08T21:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.984322 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.984459 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.984486 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.984516 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:52 crc kubenswrapper[4739]: I1008 21:49:52.984539 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:52Z","lastTransitionTime":"2025-10-08T21:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.087486 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.087547 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.087563 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.087589 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.087606 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:53Z","lastTransitionTime":"2025-10-08T21:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.190673 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.190729 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.190749 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.190772 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.190790 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:53Z","lastTransitionTime":"2025-10-08T21:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.294294 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.294359 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.294381 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.294412 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.294433 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:53Z","lastTransitionTime":"2025-10-08T21:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.397088 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.397211 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.397235 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.397265 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.397287 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:53Z","lastTransitionTime":"2025-10-08T21:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.500683 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.500738 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.500751 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.500769 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.500782 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:53Z","lastTransitionTime":"2025-10-08T21:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.606462 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.606840 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.606867 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.606892 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.606916 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:53Z","lastTransitionTime":"2025-10-08T21:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.710296 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.710347 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.710363 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.710385 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.710403 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:53Z","lastTransitionTime":"2025-10-08T21:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.813720 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.813800 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.813818 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.813843 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.813861 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:53Z","lastTransitionTime":"2025-10-08T21:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.821190 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:53 crc kubenswrapper[4739]: E1008 21:49:53.821769 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.917041 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.917104 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.917121 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.917176 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:53 crc kubenswrapper[4739]: I1008 21:49:53.917195 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:53Z","lastTransitionTime":"2025-10-08T21:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.020497 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.020544 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.020554 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.020573 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.020583 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:54Z","lastTransitionTime":"2025-10-08T21:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.122885 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.122938 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.122954 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.122977 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.122996 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:54Z","lastTransitionTime":"2025-10-08T21:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.225893 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.225965 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.225990 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.226020 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.226041 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:54Z","lastTransitionTime":"2025-10-08T21:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.329216 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.329287 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.329311 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.329340 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.329361 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:54Z","lastTransitionTime":"2025-10-08T21:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.433439 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.433485 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.433496 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.433512 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.433525 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:54Z","lastTransitionTime":"2025-10-08T21:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.537344 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.537421 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.537434 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.537452 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.537466 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:54Z","lastTransitionTime":"2025-10-08T21:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.640045 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.640089 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.640098 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.640109 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.640123 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:54Z","lastTransitionTime":"2025-10-08T21:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.742194 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.742263 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.742286 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.742318 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.742338 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:54Z","lastTransitionTime":"2025-10-08T21:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.821001 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.821020 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.821040 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:49:54 crc kubenswrapper[4739]: E1008 21:49:54.822122 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:49:54 crc kubenswrapper[4739]: E1008 21:49:54.822312 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:49:54 crc kubenswrapper[4739]: E1008 21:49:54.822415 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.847129 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.847191 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.847203 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.847219 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.847231 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:54Z","lastTransitionTime":"2025-10-08T21:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.949273 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.949317 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.949328 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.949343 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:54 crc kubenswrapper[4739]: I1008 21:49:54.949354 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:54Z","lastTransitionTime":"2025-10-08T21:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.052370 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.052415 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.052427 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.052444 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.052456 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:55Z","lastTransitionTime":"2025-10-08T21:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.155081 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.155124 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.155135 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.155168 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.155181 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:55Z","lastTransitionTime":"2025-10-08T21:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.257945 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.257993 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.258004 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.258020 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.258032 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:55Z","lastTransitionTime":"2025-10-08T21:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.360330 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.360414 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.360431 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.360454 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.360470 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:55Z","lastTransitionTime":"2025-10-08T21:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.463239 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.463296 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.463320 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.463346 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.463368 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:55Z","lastTransitionTime":"2025-10-08T21:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.567000 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.567118 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.567140 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.567204 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.567222 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:55Z","lastTransitionTime":"2025-10-08T21:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.670701 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.670762 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.670778 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.670800 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.670816 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:55Z","lastTransitionTime":"2025-10-08T21:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.774069 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.774129 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.774207 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.774239 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.774262 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:55Z","lastTransitionTime":"2025-10-08T21:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.820948 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:55 crc kubenswrapper[4739]: E1008 21:49:55.821124 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.877086 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.877133 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.877195 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.877218 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.877235 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:55Z","lastTransitionTime":"2025-10-08T21:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.980234 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.980305 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.980326 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.980352 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:55 crc kubenswrapper[4739]: I1008 21:49:55.980369 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:55Z","lastTransitionTime":"2025-10-08T21:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.083946 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.084058 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.084083 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.084115 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.084134 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:56Z","lastTransitionTime":"2025-10-08T21:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.192823 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.192896 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.192920 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.192951 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.192975 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:56Z","lastTransitionTime":"2025-10-08T21:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.295928 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.295986 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.296003 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.296028 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.296050 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:56Z","lastTransitionTime":"2025-10-08T21:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.399700 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.399753 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.399765 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.399783 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.399795 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:56Z","lastTransitionTime":"2025-10-08T21:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.503726 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.503789 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.503811 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.503838 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.503859 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:56Z","lastTransitionTime":"2025-10-08T21:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.606895 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.607002 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.607026 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.607059 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.607080 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:56Z","lastTransitionTime":"2025-10-08T21:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.709939 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.709967 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.709975 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.709987 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.709996 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:56Z","lastTransitionTime":"2025-10-08T21:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.813305 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.813361 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.813383 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.813412 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.813435 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:56Z","lastTransitionTime":"2025-10-08T21:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.820742 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.820868 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:49:56 crc kubenswrapper[4739]: E1008 21:49:56.821042 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.821114 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:56 crc kubenswrapper[4739]: E1008 21:49:56.821307 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:49:56 crc kubenswrapper[4739]: E1008 21:49:56.821428 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.916497 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.916545 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.916561 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.916584 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:56 crc kubenswrapper[4739]: I1008 21:49:56.916603 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:56Z","lastTransitionTime":"2025-10-08T21:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.019919 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.019985 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.020007 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.020038 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.020062 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:57Z","lastTransitionTime":"2025-10-08T21:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.122658 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.122720 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.122737 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.122762 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.122779 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:57Z","lastTransitionTime":"2025-10-08T21:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.225892 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.225945 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.225960 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.225985 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.226002 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:57Z","lastTransitionTime":"2025-10-08T21:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.329216 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.329248 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.329260 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.329276 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.329288 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:57Z","lastTransitionTime":"2025-10-08T21:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.432826 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.432880 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.432891 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.432908 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.432925 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:57Z","lastTransitionTime":"2025-10-08T21:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.534878 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.534909 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.534917 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.534929 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.534936 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:57Z","lastTransitionTime":"2025-10-08T21:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.637358 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.637409 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.637421 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.637438 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.637450 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:57Z","lastTransitionTime":"2025-10-08T21:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.738982 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.739021 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.739032 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.739050 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.739063 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:57Z","lastTransitionTime":"2025-10-08T21:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.821168 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:57 crc kubenswrapper[4739]: E1008 21:49:57.821299 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.841066 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.841096 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.841103 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.841115 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.841125 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:57Z","lastTransitionTime":"2025-10-08T21:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.943127 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.943199 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.943209 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.943222 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:57 crc kubenswrapper[4739]: I1008 21:49:57.943232 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:57Z","lastTransitionTime":"2025-10-08T21:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.045192 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.045228 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.045239 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.045253 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.045263 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:58Z","lastTransitionTime":"2025-10-08T21:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.147972 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.148024 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.148045 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.148065 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.148079 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:58Z","lastTransitionTime":"2025-10-08T21:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.250507 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.250548 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.250558 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.250574 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.250585 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:58Z","lastTransitionTime":"2025-10-08T21:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.352937 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.352979 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.352994 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.353013 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.353027 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:58Z","lastTransitionTime":"2025-10-08T21:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.455820 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.455903 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.455929 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.455960 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.455982 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:58Z","lastTransitionTime":"2025-10-08T21:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.558568 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.558627 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.558643 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.558664 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.558681 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:58Z","lastTransitionTime":"2025-10-08T21:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.660963 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.661053 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.661083 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.661114 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.661136 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:58Z","lastTransitionTime":"2025-10-08T21:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.764092 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.764202 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.764228 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.764254 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.764289 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:58Z","lastTransitionTime":"2025-10-08T21:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.820844 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.820873 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.821444 4739 scope.go:117] "RemoveContainer" containerID="1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.821512 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:49:58 crc kubenswrapper[4739]: E1008 21:49:58.821583 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-hfhrc_openshift-ovn-kubernetes(4c6641d9-9ccf-42aa-8a83-c52d850aa766)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" Oct 08 21:49:58 crc kubenswrapper[4739]: E1008 21:49:58.821574 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:49:58 crc kubenswrapper[4739]: E1008 21:49:58.821680 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:49:58 crc kubenswrapper[4739]: E1008 21:49:58.821829 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.864010 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=78.863995328 podStartE2EDuration="1m18.863995328s" podCreationTimestamp="2025-10-08 21:48:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:49:58.861396664 +0000 UTC m=+98.686782454" watchObservedRunningTime="2025-10-08 21:49:58.863995328 +0000 UTC m=+98.689381078" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.871239 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.871289 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.871307 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.871327 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.871344 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:58Z","lastTransitionTime":"2025-10-08T21:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.914559 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-jh2pw" podStartSLOduration=72.914491774 podStartE2EDuration="1m12.914491774s" podCreationTimestamp="2025-10-08 21:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:49:58.914067664 +0000 UTC m=+98.739453464" watchObservedRunningTime="2025-10-08 21:49:58.914491774 +0000 UTC m=+98.739877564" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.930979 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=77.93095018 podStartE2EDuration="1m17.93095018s" podCreationTimestamp="2025-10-08 21:48:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:49:58.928944841 +0000 UTC m=+98.754330631" watchObservedRunningTime="2025-10-08 21:49:58.93095018 +0000 UTC m=+98.756335980" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.974951 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.975032 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.975058 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.975089 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.975113 4739 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:58Z","lastTransitionTime":"2025-10-08T21:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:58 crc kubenswrapper[4739]: I1008 21:49:58.997103 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=77.997076171 podStartE2EDuration="1m17.997076171s" podCreationTimestamp="2025-10-08 21:48:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:49:58.996270942 +0000 UTC m=+98.821656702" watchObservedRunningTime="2025-10-08 21:49:58.997076171 +0000 UTC m=+98.822461971" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.052437 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-hjvjs" podStartSLOduration=72.052415867 podStartE2EDuration="1m12.052415867s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:49:59.036489464 +0000 UTC m=+98.861875224" watchObservedRunningTime="2025-10-08 21:49:59.052415867 +0000 UTC m=+98.877801637" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.061832 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-wwt88" podStartSLOduration=72.061816739 podStartE2EDuration="1m12.061816739s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:49:59.052804337 +0000 UTC 
m=+98.878190087" watchObservedRunningTime="2025-10-08 21:49:59.061816739 +0000 UTC m=+98.887202489" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.075180 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-8p5bp" podStartSLOduration=73.075164088 podStartE2EDuration="1m13.075164088s" podCreationTimestamp="2025-10-08 21:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:49:59.062438294 +0000 UTC m=+98.887824044" watchObservedRunningTime="2025-10-08 21:49:59.075164088 +0000 UTC m=+98.900549838" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.076805 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.076826 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.076836 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.076846 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.076855 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:59Z","lastTransitionTime":"2025-10-08T21:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.088691 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7dswk" podStartSLOduration=72.088672302 podStartE2EDuration="1m12.088672302s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:49:59.077468035 +0000 UTC m=+98.902853785" watchObservedRunningTime="2025-10-08 21:49:59.088672302 +0000 UTC m=+98.914058052" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.089262 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=43.089256716 podStartE2EDuration="43.089256716s" podCreationTimestamp="2025-10-08 21:49:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:49:59.088987489 +0000 UTC m=+98.914373259" watchObservedRunningTime="2025-10-08 21:49:59.089256716 +0000 UTC m=+98.914642466" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.099877 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=26.099862528 podStartE2EDuration="26.099862528s" podCreationTimestamp="2025-10-08 21:49:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:49:59.099731045 +0000 UTC m=+98.925116795" watchObservedRunningTime="2025-10-08 21:49:59.099862528 +0000 UTC m=+98.925248278" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.127756 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 21:49:59 crc 
kubenswrapper[4739]: I1008 21:49:59.127805 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.127820 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.127842 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.127855 4739 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T21:49:59Z","lastTransitionTime":"2025-10-08T21:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.148608 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podStartSLOduration=72.14858919 podStartE2EDuration="1m12.14858919s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:49:59.148368464 +0000 UTC m=+98.973754234" watchObservedRunningTime="2025-10-08 21:49:59.14858919 +0000 UTC m=+98.973974950" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.168485 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-kl9v2"] Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.168910 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kl9v2" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.170885 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.170895 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.171266 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.172777 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.275214 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e84f09bd-54b7-44bc-8cf2-2f6ca54ee356-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kl9v2\" (UID: \"e84f09bd-54b7-44bc-8cf2-2f6ca54ee356\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kl9v2" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.275264 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e84f09bd-54b7-44bc-8cf2-2f6ca54ee356-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kl9v2\" (UID: \"e84f09bd-54b7-44bc-8cf2-2f6ca54ee356\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kl9v2" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.275282 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e84f09bd-54b7-44bc-8cf2-2f6ca54ee356-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kl9v2\" (UID: \"e84f09bd-54b7-44bc-8cf2-2f6ca54ee356\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kl9v2" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.275297 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e84f09bd-54b7-44bc-8cf2-2f6ca54ee356-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kl9v2\" (UID: \"e84f09bd-54b7-44bc-8cf2-2f6ca54ee356\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kl9v2" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.275366 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e84f09bd-54b7-44bc-8cf2-2f6ca54ee356-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kl9v2\" (UID: \"e84f09bd-54b7-44bc-8cf2-2f6ca54ee356\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kl9v2" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.376691 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e84f09bd-54b7-44bc-8cf2-2f6ca54ee356-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kl9v2\" (UID: \"e84f09bd-54b7-44bc-8cf2-2f6ca54ee356\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kl9v2" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.376737 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e84f09bd-54b7-44bc-8cf2-2f6ca54ee356-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kl9v2\" (UID: \"e84f09bd-54b7-44bc-8cf2-2f6ca54ee356\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kl9v2" Oct 
08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.376759 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e84f09bd-54b7-44bc-8cf2-2f6ca54ee356-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kl9v2\" (UID: \"e84f09bd-54b7-44bc-8cf2-2f6ca54ee356\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kl9v2" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.376782 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e84f09bd-54b7-44bc-8cf2-2f6ca54ee356-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kl9v2\" (UID: \"e84f09bd-54b7-44bc-8cf2-2f6ca54ee356\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kl9v2" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.376803 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e84f09bd-54b7-44bc-8cf2-2f6ca54ee356-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kl9v2\" (UID: \"e84f09bd-54b7-44bc-8cf2-2f6ca54ee356\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kl9v2" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.376973 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e84f09bd-54b7-44bc-8cf2-2f6ca54ee356-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kl9v2\" (UID: \"e84f09bd-54b7-44bc-8cf2-2f6ca54ee356\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kl9v2" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.376979 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e84f09bd-54b7-44bc-8cf2-2f6ca54ee356-etc-ssl-certs\") pod 
\"cluster-version-operator-5c965bbfc6-kl9v2\" (UID: \"e84f09bd-54b7-44bc-8cf2-2f6ca54ee356\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kl9v2" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.377638 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e84f09bd-54b7-44bc-8cf2-2f6ca54ee356-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kl9v2\" (UID: \"e84f09bd-54b7-44bc-8cf2-2f6ca54ee356\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kl9v2" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.384643 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e84f09bd-54b7-44bc-8cf2-2f6ca54ee356-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kl9v2\" (UID: \"e84f09bd-54b7-44bc-8cf2-2f6ca54ee356\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kl9v2" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.393210 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e84f09bd-54b7-44bc-8cf2-2f6ca54ee356-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kl9v2\" (UID: \"e84f09bd-54b7-44bc-8cf2-2f6ca54ee356\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kl9v2" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.488638 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kl9v2" Oct 08 21:49:59 crc kubenswrapper[4739]: I1008 21:49:59.821247 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:49:59 crc kubenswrapper[4739]: E1008 21:49:59.821648 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:50:00 crc kubenswrapper[4739]: I1008 21:50:00.342493 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kl9v2" event={"ID":"e84f09bd-54b7-44bc-8cf2-2f6ca54ee356","Type":"ContainerStarted","Data":"896ca8234b5be283cc436d85155a7f6d6be785844752354e2b8f8b0ee8f2573b"} Oct 08 21:50:00 crc kubenswrapper[4739]: I1008 21:50:00.342550 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kl9v2" event={"ID":"e84f09bd-54b7-44bc-8cf2-2f6ca54ee356","Type":"ContainerStarted","Data":"2870bc943425bec1e7c67ddcaa5f90c8f0e3a096fcc9ec4fb13001950a5e6cac"} Oct 08 21:50:00 crc kubenswrapper[4739]: I1008 21:50:00.355557 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kl9v2" podStartSLOduration=74.355535029 podStartE2EDuration="1m14.355535029s" podCreationTimestamp="2025-10-08 21:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:00.355286223 +0000 UTC m=+100.180672013" watchObservedRunningTime="2025-10-08 21:50:00.355535029 +0000 UTC m=+100.180920789" Oct 08 21:50:00 crc kubenswrapper[4739]: I1008 21:50:00.820881 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:50:00 crc kubenswrapper[4739]: I1008 21:50:00.820914 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:50:00 crc kubenswrapper[4739]: I1008 21:50:00.820924 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:50:00 crc kubenswrapper[4739]: E1008 21:50:00.821038 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:50:00 crc kubenswrapper[4739]: E1008 21:50:00.821354 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:50:00 crc kubenswrapper[4739]: E1008 21:50:00.821792 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:50:01 crc kubenswrapper[4739]: I1008 21:50:01.820703 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:50:01 crc kubenswrapper[4739]: E1008 21:50:01.828298 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:50:02 crc kubenswrapper[4739]: I1008 21:50:02.821365 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:50:02 crc kubenswrapper[4739]: E1008 21:50:02.821451 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:50:02 crc kubenswrapper[4739]: I1008 21:50:02.821533 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:50:02 crc kubenswrapper[4739]: I1008 21:50:02.821375 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:50:02 crc kubenswrapper[4739]: E1008 21:50:02.821883 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:50:02 crc kubenswrapper[4739]: E1008 21:50:02.821994 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:50:03 crc kubenswrapper[4739]: I1008 21:50:03.051838 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:50:03 crc kubenswrapper[4739]: I1008 21:50:03.052797 4739 scope.go:117] "RemoveContainer" containerID="1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a" Oct 08 21:50:03 crc kubenswrapper[4739]: E1008 21:50:03.052978 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-hfhrc_openshift-ovn-kubernetes(4c6641d9-9ccf-42aa-8a83-c52d850aa766)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" Oct 08 21:50:03 crc kubenswrapper[4739]: I1008 21:50:03.821593 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:50:03 crc kubenswrapper[4739]: E1008 21:50:03.821754 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:50:04 crc kubenswrapper[4739]: I1008 21:50:04.821701 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:50:04 crc kubenswrapper[4739]: I1008 21:50:04.821728 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:50:04 crc kubenswrapper[4739]: E1008 21:50:04.822479 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:50:04 crc kubenswrapper[4739]: E1008 21:50:04.822633 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:50:04 crc kubenswrapper[4739]: I1008 21:50:04.821777 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:50:04 crc kubenswrapper[4739]: E1008 21:50:04.822753 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:50:05 crc kubenswrapper[4739]: I1008 21:50:05.643329 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8629e121-2c64-4b46-adbd-ec1433ec0835-metrics-certs\") pod \"network-metrics-daemon-kdt6j\" (UID: \"8629e121-2c64-4b46-adbd-ec1433ec0835\") " pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:50:05 crc kubenswrapper[4739]: E1008 21:50:05.643532 4739 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 21:50:05 crc kubenswrapper[4739]: E1008 21:50:05.643606 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8629e121-2c64-4b46-adbd-ec1433ec0835-metrics-certs podName:8629e121-2c64-4b46-adbd-ec1433ec0835 nodeName:}" failed. No retries permitted until 2025-10-08 21:51:09.643586674 +0000 UTC m=+169.468972424 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8629e121-2c64-4b46-adbd-ec1433ec0835-metrics-certs") pod "network-metrics-daemon-kdt6j" (UID: "8629e121-2c64-4b46-adbd-ec1433ec0835") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 21:50:05 crc kubenswrapper[4739]: I1008 21:50:05.821087 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:50:05 crc kubenswrapper[4739]: E1008 21:50:05.821337 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:50:06 crc kubenswrapper[4739]: I1008 21:50:06.820773 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:50:06 crc kubenswrapper[4739]: I1008 21:50:06.820836 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:50:06 crc kubenswrapper[4739]: I1008 21:50:06.820797 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:50:06 crc kubenswrapper[4739]: E1008 21:50:06.821003 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:50:06 crc kubenswrapper[4739]: E1008 21:50:06.821077 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:50:06 crc kubenswrapper[4739]: E1008 21:50:06.821199 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:50:07 crc kubenswrapper[4739]: I1008 21:50:07.821665 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:50:07 crc kubenswrapper[4739]: E1008 21:50:07.822064 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:50:08 crc kubenswrapper[4739]: I1008 21:50:08.821460 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:50:08 crc kubenswrapper[4739]: I1008 21:50:08.821532 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:50:08 crc kubenswrapper[4739]: E1008 21:50:08.821655 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:50:08 crc kubenswrapper[4739]: I1008 21:50:08.821659 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:50:08 crc kubenswrapper[4739]: E1008 21:50:08.821670 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:50:08 crc kubenswrapper[4739]: E1008 21:50:08.821954 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:50:09 crc kubenswrapper[4739]: I1008 21:50:09.820898 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:50:09 crc kubenswrapper[4739]: E1008 21:50:09.821019 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:50:10 crc kubenswrapper[4739]: I1008 21:50:10.821307 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:50:10 crc kubenswrapper[4739]: I1008 21:50:10.821341 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:50:10 crc kubenswrapper[4739]: I1008 21:50:10.821350 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:50:10 crc kubenswrapper[4739]: E1008 21:50:10.821485 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:50:10 crc kubenswrapper[4739]: E1008 21:50:10.821673 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:50:10 crc kubenswrapper[4739]: E1008 21:50:10.821830 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:50:11 crc kubenswrapper[4739]: I1008 21:50:11.820713 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:50:11 crc kubenswrapper[4739]: E1008 21:50:11.821712 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:50:12 crc kubenswrapper[4739]: I1008 21:50:12.821571 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:50:12 crc kubenswrapper[4739]: I1008 21:50:12.821633 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:50:12 crc kubenswrapper[4739]: I1008 21:50:12.821598 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:50:12 crc kubenswrapper[4739]: E1008 21:50:12.821764 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:50:12 crc kubenswrapper[4739]: E1008 21:50:12.821996 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:50:12 crc kubenswrapper[4739]: E1008 21:50:12.822138 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:50:13 crc kubenswrapper[4739]: I1008 21:50:13.821673 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:50:13 crc kubenswrapper[4739]: E1008 21:50:13.821830 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:50:14 crc kubenswrapper[4739]: I1008 21:50:14.820675 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:50:14 crc kubenswrapper[4739]: I1008 21:50:14.820725 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:50:14 crc kubenswrapper[4739]: E1008 21:50:14.820810 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:50:14 crc kubenswrapper[4739]: I1008 21:50:14.820882 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:50:14 crc kubenswrapper[4739]: E1008 21:50:14.821072 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:50:14 crc kubenswrapper[4739]: E1008 21:50:14.821250 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:50:15 crc kubenswrapper[4739]: I1008 21:50:15.821501 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:50:15 crc kubenswrapper[4739]: E1008 21:50:15.822191 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:50:16 crc kubenswrapper[4739]: I1008 21:50:16.821436 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:50:16 crc kubenswrapper[4739]: I1008 21:50:16.821451 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:50:16 crc kubenswrapper[4739]: I1008 21:50:16.821732 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:50:16 crc kubenswrapper[4739]: E1008 21:50:16.821617 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:50:16 crc kubenswrapper[4739]: E1008 21:50:16.821817 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:50:16 crc kubenswrapper[4739]: E1008 21:50:16.821903 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:50:17 crc kubenswrapper[4739]: I1008 21:50:17.821504 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:50:17 crc kubenswrapper[4739]: E1008 21:50:17.821689 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:50:17 crc kubenswrapper[4739]: I1008 21:50:17.822946 4739 scope.go:117] "RemoveContainer" containerID="1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a" Oct 08 21:50:17 crc kubenswrapper[4739]: E1008 21:50:17.823212 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-hfhrc_openshift-ovn-kubernetes(4c6641d9-9ccf-42aa-8a83-c52d850aa766)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" Oct 08 21:50:18 crc kubenswrapper[4739]: I1008 21:50:18.821201 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:50:18 crc kubenswrapper[4739]: I1008 21:50:18.821235 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:50:18 crc kubenswrapper[4739]: E1008 21:50:18.821676 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:50:18 crc kubenswrapper[4739]: I1008 21:50:18.821407 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:50:18 crc kubenswrapper[4739]: E1008 21:50:18.821918 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:50:18 crc kubenswrapper[4739]: E1008 21:50:18.822040 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:50:19 crc kubenswrapper[4739]: I1008 21:50:19.821281 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:50:19 crc kubenswrapper[4739]: E1008 21:50:19.821550 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:50:20 crc kubenswrapper[4739]: I1008 21:50:20.821047 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:50:20 crc kubenswrapper[4739]: I1008 21:50:20.821202 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:50:20 crc kubenswrapper[4739]: E1008 21:50:20.821283 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:50:20 crc kubenswrapper[4739]: I1008 21:50:20.821047 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:50:20 crc kubenswrapper[4739]: E1008 21:50:20.821415 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:50:20 crc kubenswrapper[4739]: E1008 21:50:20.821555 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:50:21 crc kubenswrapper[4739]: I1008 21:50:21.821047 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:50:21 crc kubenswrapper[4739]: E1008 21:50:21.822778 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:50:21 crc kubenswrapper[4739]: E1008 21:50:21.854136 4739 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 08 21:50:22 crc kubenswrapper[4739]: E1008 21:50:22.039921 4739 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 08 21:50:22 crc kubenswrapper[4739]: I1008 21:50:22.821299 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:50:22 crc kubenswrapper[4739]: I1008 21:50:22.821426 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:50:22 crc kubenswrapper[4739]: E1008 21:50:22.821922 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:50:22 crc kubenswrapper[4739]: I1008 21:50:22.821446 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:50:22 crc kubenswrapper[4739]: E1008 21:50:22.822121 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:50:22 crc kubenswrapper[4739]: E1008 21:50:22.822298 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:50:23 crc kubenswrapper[4739]: I1008 21:50:23.821655 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:50:23 crc kubenswrapper[4739]: E1008 21:50:23.821849 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:50:24 crc kubenswrapper[4739]: I1008 21:50:24.821216 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:50:24 crc kubenswrapper[4739]: E1008 21:50:24.821359 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:50:24 crc kubenswrapper[4739]: I1008 21:50:24.821834 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:50:24 crc kubenswrapper[4739]: I1008 21:50:24.821893 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:50:24 crc kubenswrapper[4739]: E1008 21:50:24.822702 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:50:24 crc kubenswrapper[4739]: E1008 21:50:24.822934 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:50:25 crc kubenswrapper[4739]: I1008 21:50:25.820710 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:50:25 crc kubenswrapper[4739]: E1008 21:50:25.820922 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:50:26 crc kubenswrapper[4739]: I1008 21:50:26.429778 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wwt88_17ed1d5a-5f21-4dcf-bdb9-09e715f57027/kube-multus/1.log" Oct 08 21:50:26 crc kubenswrapper[4739]: I1008 21:50:26.430335 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wwt88_17ed1d5a-5f21-4dcf-bdb9-09e715f57027/kube-multus/0.log" Oct 08 21:50:26 crc kubenswrapper[4739]: I1008 21:50:26.430387 4739 generic.go:334] "Generic (PLEG): container finished" podID="17ed1d5a-5f21-4dcf-bdb9-09e715f57027" containerID="9a3ec9cc2ce1e0c0c740753d759e9f091e402d73cb1d4f896fe843f9bfb805ea" exitCode=1 Oct 08 21:50:26 crc kubenswrapper[4739]: I1008 21:50:26.430426 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wwt88" event={"ID":"17ed1d5a-5f21-4dcf-bdb9-09e715f57027","Type":"ContainerDied","Data":"9a3ec9cc2ce1e0c0c740753d759e9f091e402d73cb1d4f896fe843f9bfb805ea"} Oct 08 21:50:26 crc kubenswrapper[4739]: I1008 21:50:26.430479 4739 scope.go:117] "RemoveContainer" containerID="d18be3be9ee988fe9bb0de635f9c7a331785e4bc7c21f210c450360e5c35ed58" Oct 08 21:50:26 crc kubenswrapper[4739]: I1008 21:50:26.430882 4739 scope.go:117] "RemoveContainer" containerID="9a3ec9cc2ce1e0c0c740753d759e9f091e402d73cb1d4f896fe843f9bfb805ea" Oct 08 21:50:26 crc kubenswrapper[4739]: E1008 21:50:26.431052 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-wwt88_openshift-multus(17ed1d5a-5f21-4dcf-bdb9-09e715f57027)\"" pod="openshift-multus/multus-wwt88" podUID="17ed1d5a-5f21-4dcf-bdb9-09e715f57027" Oct 08 21:50:26 crc kubenswrapper[4739]: I1008 21:50:26.820721 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:50:26 crc kubenswrapper[4739]: I1008 21:50:26.820778 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:50:26 crc kubenswrapper[4739]: I1008 21:50:26.820842 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:50:26 crc kubenswrapper[4739]: E1008 21:50:26.820965 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:50:26 crc kubenswrapper[4739]: E1008 21:50:26.821087 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:50:26 crc kubenswrapper[4739]: E1008 21:50:26.821276 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:50:27 crc kubenswrapper[4739]: E1008 21:50:27.041404 4739 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 08 21:50:27 crc kubenswrapper[4739]: I1008 21:50:27.435681 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wwt88_17ed1d5a-5f21-4dcf-bdb9-09e715f57027/kube-multus/1.log" Oct 08 21:50:27 crc kubenswrapper[4739]: I1008 21:50:27.821713 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:50:27 crc kubenswrapper[4739]: E1008 21:50:27.821802 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:50:28 crc kubenswrapper[4739]: I1008 21:50:28.821044 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:50:28 crc kubenswrapper[4739]: E1008 21:50:28.821313 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:50:28 crc kubenswrapper[4739]: I1008 21:50:28.821340 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:50:28 crc kubenswrapper[4739]: I1008 21:50:28.821585 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:50:28 crc kubenswrapper[4739]: E1008 21:50:28.821712 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:50:28 crc kubenswrapper[4739]: E1008 21:50:28.821928 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:50:29 crc kubenswrapper[4739]: I1008 21:50:29.821166 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:50:29 crc kubenswrapper[4739]: E1008 21:50:29.821285 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:50:30 crc kubenswrapper[4739]: I1008 21:50:30.820723 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:50:30 crc kubenswrapper[4739]: I1008 21:50:30.820794 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:50:30 crc kubenswrapper[4739]: I1008 21:50:30.820886 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:50:30 crc kubenswrapper[4739]: E1008 21:50:30.821010 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:50:30 crc kubenswrapper[4739]: E1008 21:50:30.820923 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:50:30 crc kubenswrapper[4739]: E1008 21:50:30.821368 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:50:31 crc kubenswrapper[4739]: I1008 21:50:31.820979 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:50:31 crc kubenswrapper[4739]: E1008 21:50:31.822952 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:50:32 crc kubenswrapper[4739]: E1008 21:50:32.042424 4739 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 08 21:50:32 crc kubenswrapper[4739]: I1008 21:50:32.821277 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:50:32 crc kubenswrapper[4739]: I1008 21:50:32.821312 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:50:32 crc kubenswrapper[4739]: E1008 21:50:32.821756 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:50:32 crc kubenswrapper[4739]: I1008 21:50:32.821809 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:50:32 crc kubenswrapper[4739]: E1008 21:50:32.821919 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:50:32 crc kubenswrapper[4739]: E1008 21:50:32.822282 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:50:32 crc kubenswrapper[4739]: I1008 21:50:32.822565 4739 scope.go:117] "RemoveContainer" containerID="1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a" Oct 08 21:50:33 crc kubenswrapper[4739]: I1008 21:50:33.463814 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hfhrc_4c6641d9-9ccf-42aa-8a83-c52d850aa766/ovnkube-controller/3.log" Oct 08 21:50:33 crc kubenswrapper[4739]: I1008 21:50:33.466753 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" event={"ID":"4c6641d9-9ccf-42aa-8a83-c52d850aa766","Type":"ContainerStarted","Data":"c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5"} Oct 08 21:50:33 crc kubenswrapper[4739]: I1008 21:50:33.467773 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:50:33 crc kubenswrapper[4739]: I1008 21:50:33.821006 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:50:33 crc kubenswrapper[4739]: E1008 21:50:33.821131 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:50:33 crc kubenswrapper[4739]: I1008 21:50:33.875663 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" podStartSLOduration=106.875643544 podStartE2EDuration="1m46.875643544s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:33.499128951 +0000 UTC m=+133.324514701" watchObservedRunningTime="2025-10-08 21:50:33.875643544 +0000 UTC m=+133.701029294" Oct 08 21:50:33 crc kubenswrapper[4739]: I1008 21:50:33.875900 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kdt6j"] Oct 08 21:50:34 crc kubenswrapper[4739]: I1008 21:50:34.473588 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:50:34 crc kubenswrapper[4739]: E1008 21:50:34.474188 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:50:34 crc kubenswrapper[4739]: I1008 21:50:34.821635 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:50:34 crc kubenswrapper[4739]: E1008 21:50:34.821837 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:50:34 crc kubenswrapper[4739]: I1008 21:50:34.821917 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:50:34 crc kubenswrapper[4739]: E1008 21:50:34.822009 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:50:34 crc kubenswrapper[4739]: I1008 21:50:34.822446 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:50:34 crc kubenswrapper[4739]: E1008 21:50:34.822678 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:50:35 crc kubenswrapper[4739]: I1008 21:50:35.820911 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:50:35 crc kubenswrapper[4739]: E1008 21:50:35.821306 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:50:36 crc kubenswrapper[4739]: I1008 21:50:36.821133 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:50:36 crc kubenswrapper[4739]: I1008 21:50:36.821318 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:50:36 crc kubenswrapper[4739]: I1008 21:50:36.821393 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:50:36 crc kubenswrapper[4739]: E1008 21:50:36.821525 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:50:36 crc kubenswrapper[4739]: E1008 21:50:36.821668 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:50:36 crc kubenswrapper[4739]: E1008 21:50:36.821706 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:50:37 crc kubenswrapper[4739]: E1008 21:50:37.044281 4739 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 08 21:50:37 crc kubenswrapper[4739]: I1008 21:50:37.821391 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:50:37 crc kubenswrapper[4739]: E1008 21:50:37.822407 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:50:37 crc kubenswrapper[4739]: I1008 21:50:37.823380 4739 scope.go:117] "RemoveContainer" containerID="9a3ec9cc2ce1e0c0c740753d759e9f091e402d73cb1d4f896fe843f9bfb805ea" Oct 08 21:50:38 crc kubenswrapper[4739]: I1008 21:50:38.494524 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wwt88_17ed1d5a-5f21-4dcf-bdb9-09e715f57027/kube-multus/1.log" Oct 08 21:50:38 crc kubenswrapper[4739]: I1008 21:50:38.494616 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wwt88" event={"ID":"17ed1d5a-5f21-4dcf-bdb9-09e715f57027","Type":"ContainerStarted","Data":"93b79eb889387eed738d5f03a13377c9974599710eef3592e8e0024458f11d88"} Oct 08 21:50:38 crc kubenswrapper[4739]: I1008 21:50:38.821040 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:50:38 crc kubenswrapper[4739]: E1008 21:50:38.821282 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:50:38 crc kubenswrapper[4739]: I1008 21:50:38.821062 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:50:38 crc kubenswrapper[4739]: E1008 21:50:38.821416 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:50:38 crc kubenswrapper[4739]: I1008 21:50:38.821053 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:50:38 crc kubenswrapper[4739]: E1008 21:50:38.821522 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:50:39 crc kubenswrapper[4739]: I1008 21:50:39.820851 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:50:39 crc kubenswrapper[4739]: E1008 21:50:39.821074 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:50:40 crc kubenswrapper[4739]: I1008 21:50:40.821514 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:50:40 crc kubenswrapper[4739]: E1008 21:50:40.821729 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 21:50:40 crc kubenswrapper[4739]: I1008 21:50:40.821530 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:50:40 crc kubenswrapper[4739]: E1008 21:50:40.822131 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 21:50:40 crc kubenswrapper[4739]: I1008 21:50:40.822352 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:50:40 crc kubenswrapper[4739]: E1008 21:50:40.822547 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 21:50:41 crc kubenswrapper[4739]: I1008 21:50:41.821510 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:50:41 crc kubenswrapper[4739]: E1008 21:50:41.824834 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdt6j" podUID="8629e121-2c64-4b46-adbd-ec1433ec0835" Oct 08 21:50:42 crc kubenswrapper[4739]: I1008 21:50:42.821322 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:50:42 crc kubenswrapper[4739]: I1008 21:50:42.821437 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:50:42 crc kubenswrapper[4739]: I1008 21:50:42.821372 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:50:42 crc kubenswrapper[4739]: I1008 21:50:42.824661 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 08 21:50:42 crc kubenswrapper[4739]: I1008 21:50:42.824782 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 08 21:50:42 crc kubenswrapper[4739]: I1008 21:50:42.824663 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 08 21:50:42 crc kubenswrapper[4739]: I1008 21:50:42.825069 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 08 21:50:43 crc kubenswrapper[4739]: I1008 21:50:43.821580 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:50:43 crc kubenswrapper[4739]: I1008 21:50:43.824855 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 08 21:50:43 crc kubenswrapper[4739]: I1008 21:50:43.825041 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 08 21:50:48 crc kubenswrapper[4739]: I1008 21:50:48.731095 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:48 crc kubenswrapper[4739]: E1008 21:50:48.731315 4739 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:52:50.731277781 +0000 UTC m=+270.556663571 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:48 crc kubenswrapper[4739]: I1008 21:50:48.731403 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:50:48 crc kubenswrapper[4739]: I1008 21:50:48.731561 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:50:48 crc kubenswrapper[4739]: I1008 21:50:48.732712 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:50:48 crc kubenswrapper[4739]: I1008 21:50:48.748047 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:50:48 crc kubenswrapper[4739]: I1008 21:50:48.832627 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:50:48 crc kubenswrapper[4739]: I1008 21:50:48.832693 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:50:48 crc kubenswrapper[4739]: I1008 21:50:48.838375 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:50:48 crc kubenswrapper[4739]: I1008 21:50:48.838600 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:50:48 crc kubenswrapper[4739]: I1008 21:50:48.848927 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 21:50:48 crc kubenswrapper[4739]: I1008 21:50:48.864918 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 21:50:48 crc kubenswrapper[4739]: I1008 21:50:48.878528 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:50:49 crc kubenswrapper[4739]: W1008 21:50:49.196189 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-b52c763dfb046a83dbfd92212e81fc74f2b0ba1c66314f6f0c699037a25f748e WatchSource:0}: Error finding container b52c763dfb046a83dbfd92212e81fc74f2b0ba1c66314f6f0c699037a25f748e: Status 404 returned error can't find the container with id b52c763dfb046a83dbfd92212e81fc74f2b0ba1c66314f6f0c699037a25f748e Oct 08 21:50:49 crc kubenswrapper[4739]: W1008 21:50:49.205411 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-38f92120bace90e5ee3e5ae2cb58428429cbdf4b6960a0de85d5bb266e3219e1 WatchSource:0}: Error finding container 38f92120bace90e5ee3e5ae2cb58428429cbdf4b6960a0de85d5bb266e3219e1: Status 404 returned error can't find the container with id 38f92120bace90e5ee3e5ae2cb58428429cbdf4b6960a0de85d5bb266e3219e1 Oct 08 21:50:49 crc 
kubenswrapper[4739]: W1008 21:50:49.217653 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-dd0297a785cba6d3bb47ee5e6ab37f8370ad9a3ed002fcf8e943dd536327acbc WatchSource:0}: Error finding container dd0297a785cba6d3bb47ee5e6ab37f8370ad9a3ed002fcf8e943dd536327acbc: Status 404 returned error can't find the container with id dd0297a785cba6d3bb47ee5e6ab37f8370ad9a3ed002fcf8e943dd536327acbc Oct 08 21:50:49 crc kubenswrapper[4739]: I1008 21:50:49.539493 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"38f92120bace90e5ee3e5ae2cb58428429cbdf4b6960a0de85d5bb266e3219e1"} Oct 08 21:50:49 crc kubenswrapper[4739]: I1008 21:50:49.540971 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"dd0297a785cba6d3bb47ee5e6ab37f8370ad9a3ed002fcf8e943dd536327acbc"} Oct 08 21:50:49 crc kubenswrapper[4739]: I1008 21:50:49.542622 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"b52c763dfb046a83dbfd92212e81fc74f2b0ba1c66314f6f0c699037a25f748e"} Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.123303 4739 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.224358 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.224827 4739 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-apiserver/apiserver-76f77b778f-dv7wz"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.225201 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.225531 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.229280 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dq28h"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.229614 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dq28h" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.236370 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.236568 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.236640 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.236675 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.236640 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.236849 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.236917 4739 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.236916 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.237001 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.237119 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.238349 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.238469 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.240391 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8smnx"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.240798 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8smnx" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.262441 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.262641 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.262801 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.263002 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.263181 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.263363 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.264074 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.264323 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.264427 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.264561 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.264664 4739 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.264785 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.264921 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.265068 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.265705 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.265947 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.266077 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.266476 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nlxfq"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.266886 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nlxfq" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.266962 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-dv7wz"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.267559 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.267686 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.269894 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.270473 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.271201 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4mkhn"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.271232 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.271769 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4mkhn" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.274157 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.274808 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.277326 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-hm4g4"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.277666 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9pqtj"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.278286 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9pqtj" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.278816 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.278944 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-hm4g4" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.279543 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.279908 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.280129 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.280667 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.294400 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.294933 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-v296d"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.295307 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.295525 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.295826 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.295849 4739 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.296090 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.296491 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.297470 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.299280 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.314904 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.315350 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8c2dn"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.315700 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8c2dn" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.316094 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v296d" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.316772 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.319524 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fl6f2"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.319955 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-klt8b"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.320255 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-klt8b" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.320370 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.321097 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.329130 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.329258 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.329447 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.329484 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 08 21:50:50 crc 
kubenswrapper[4739]: I1008 21:50:50.329565 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.329582 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-lt4w6"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.329718 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.329768 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.329913 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.330004 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.329935 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.330091 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-zrm4p"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.329974 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.330210 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.330240 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 08 
21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.330400 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7fzvv"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.330406 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.330434 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-lt4w6" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.330480 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-zrm4p" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.330654 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.330908 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.331024 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.331213 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.331303 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.331461 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.331600 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.332112 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.332218 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.334582 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.334679 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.334742 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.334853 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.334881 4739 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.334855 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.335000 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.335061 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.336218 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-884f6"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.336720 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-884f6" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.337326 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.337396 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-qdvnh"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.337889 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-qdvnh" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.337919 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.339218 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-kv4v6"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.339592 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kv4v6" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.339625 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.340020 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.340634 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.340725 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.340755 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.340894 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.341000 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 
21:50:50.341015 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.341258 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.345709 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.345875 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.346585 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.346749 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.347420 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.348784 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mgssn"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.349345 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv7vm\" (UniqueName: \"kubernetes.io/projected/d5a65c5e-2929-4d8d-bebb-67af1702dbd4-kube-api-access-hv7vm\") pod \"apiserver-7bbb656c7d-hsz75\" (UID: \"d5a65c5e-2929-4d8d-bebb-67af1702dbd4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.349430 4739 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2b52465-5f11-4296-87b3-9254f036358f-serving-cert\") pod \"route-controller-manager-6576b87f9c-8smnx\" (UID: \"a2b52465-5f11-4296-87b3-9254f036358f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8smnx" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.349507 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05-client-ca\") pod \"controller-manager-879f6c89f-dq28h\" (UID: \"bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dq28h" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.349590 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff6946a6-cb0c-4be1-ba1d-319a64b6eba3-metrics-tls\") pod \"ingress-operator-5b745b69d9-4mkhn\" (UID: \"ff6946a6-cb0c-4be1-ba1d-319a64b6eba3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4mkhn" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.349663 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ce03f742-2910-4d6c-af9b-97abf28c6fbc-node-pullsecrets\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.349748 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ce03f742-2910-4d6c-af9b-97abf28c6fbc-image-import-ca\") pod \"apiserver-76f77b778f-dv7wz\" (UID: 
\"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.349818 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jxnc\" (UniqueName: \"kubernetes.io/projected/ff6946a6-cb0c-4be1-ba1d-319a64b6eba3-kube-api-access-9jxnc\") pod \"ingress-operator-5b745b69d9-4mkhn\" (UID: \"ff6946a6-cb0c-4be1-ba1d-319a64b6eba3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4mkhn" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.349906 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d5a65c5e-2929-4d8d-bebb-67af1702dbd4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hsz75\" (UID: \"d5a65c5e-2929-4d8d-bebb-67af1702dbd4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.349974 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d5a65c5e-2929-4d8d-bebb-67af1702dbd4-audit-dir\") pod \"apiserver-7bbb656c7d-hsz75\" (UID: \"d5a65c5e-2929-4d8d-bebb-67af1702dbd4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.350041 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzmdg\" (UniqueName: \"kubernetes.io/projected/ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc-kube-api-access-mzmdg\") pod \"authentication-operator-69f744f599-hm4g4\" (UID: \"ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hm4g4" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.350112 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2b52465-5f11-4296-87b3-9254f036358f-client-ca\") pod \"route-controller-manager-6576b87f9c-8smnx\" (UID: \"a2b52465-5f11-4296-87b3-9254f036358f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8smnx" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.350205 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ff6946a6-cb0c-4be1-ba1d-319a64b6eba3-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4mkhn\" (UID: \"ff6946a6-cb0c-4be1-ba1d-319a64b6eba3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4mkhn" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.350285 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d5a65c5e-2929-4d8d-bebb-67af1702dbd4-etcd-client\") pod \"apiserver-7bbb656c7d-hsz75\" (UID: \"d5a65c5e-2929-4d8d-bebb-67af1702dbd4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.350350 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ce03f742-2910-4d6c-af9b-97abf28c6fbc-etcd-client\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.350419 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce03f742-2910-4d6c-af9b-97abf28c6fbc-trusted-ca-bundle\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc 
kubenswrapper[4739]: I1008 21:50:50.350483 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2b52465-5f11-4296-87b3-9254f036358f-config\") pod \"route-controller-manager-6576b87f9c-8smnx\" (UID: \"a2b52465-5f11-4296-87b3-9254f036358f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8smnx" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.350569 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ce03f742-2910-4d6c-af9b-97abf28c6fbc-etcd-serving-ca\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.350647 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e2155710-7d43-4b2b-b619-cedb531ea612-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-nlxfq\" (UID: \"e2155710-7d43-4b2b-b619-cedb531ea612\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nlxfq" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.350723 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ff6946a6-cb0c-4be1-ba1d-319a64b6eba3-trusted-ca\") pod \"ingress-operator-5b745b69d9-4mkhn\" (UID: \"ff6946a6-cb0c-4be1-ba1d-319a64b6eba3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4mkhn" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.350805 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/d5a65c5e-2929-4d8d-bebb-67af1702dbd4-audit-policies\") pod \"apiserver-7bbb656c7d-hsz75\" (UID: \"d5a65c5e-2929-4d8d-bebb-67af1702dbd4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.350874 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce03f742-2910-4d6c-af9b-97abf28c6fbc-config\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.350943 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5a65c5e-2929-4d8d-bebb-67af1702dbd4-serving-cert\") pod \"apiserver-7bbb656c7d-hsz75\" (UID: \"d5a65c5e-2929-4d8d-bebb-67af1702dbd4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.351016 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce03f742-2910-4d6c-af9b-97abf28c6fbc-serving-cert\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.351084 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnsbp\" (UniqueName: \"kubernetes.io/projected/e2155710-7d43-4b2b-b619-cedb531ea612-kube-api-access-mnsbp\") pod \"cluster-image-registry-operator-dc59b4c8b-nlxfq\" (UID: \"e2155710-7d43-4b2b-b619-cedb531ea612\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nlxfq" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.351169 4739 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2155710-7d43-4b2b-b619-cedb531ea612-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-nlxfq\" (UID: \"e2155710-7d43-4b2b-b619-cedb531ea612\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nlxfq" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.351240 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ce03f742-2910-4d6c-af9b-97abf28c6fbc-audit\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.351317 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ce03f742-2910-4d6c-af9b-97abf28c6fbc-encryption-config\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.351383 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5a65c5e-2929-4d8d-bebb-67af1702dbd4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hsz75\" (UID: \"d5a65c5e-2929-4d8d-bebb-67af1702dbd4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.351451 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k4dg\" (UniqueName: \"kubernetes.io/projected/bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05-kube-api-access-8k4dg\") pod \"controller-manager-879f6c89f-dq28h\" (UID: 
\"bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dq28h" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.351516 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05-serving-cert\") pod \"controller-manager-879f6c89f-dq28h\" (UID: \"bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dq28h" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.351581 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2155710-7d43-4b2b-b619-cedb531ea612-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-nlxfq\" (UID: \"e2155710-7d43-4b2b-b619-cedb531ea612\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nlxfq" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.351652 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a3ae274-744f-43f4-a579-a11a36b3eee7-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-9pqtj\" (UID: \"7a3ae274-744f-43f4-a579-a11a36b3eee7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9pqtj" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.351721 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz2q4\" (UniqueName: \"kubernetes.io/projected/ce03f742-2910-4d6c-af9b-97abf28c6fbc-kube-api-access-rz2q4\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.351786 4739 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc-config\") pod \"authentication-operator-69f744f599-hm4g4\" (UID: \"ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hm4g4" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.351852 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc-service-ca-bundle\") pod \"authentication-operator-69f744f599-hm4g4\" (UID: \"ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hm4g4" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.351924 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc-serving-cert\") pod \"authentication-operator-69f744f599-hm4g4\" (UID: \"ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hm4g4" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.351992 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05-config\") pod \"controller-manager-879f6c89f-dq28h\" (UID: \"bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dq28h" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.352062 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05-proxy-ca-bundles\") pod 
\"controller-manager-879f6c89f-dq28h\" (UID: \"bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dq28h" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.349368 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mgssn" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.352226 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwl5j\" (UniqueName: \"kubernetes.io/projected/a2b52465-5f11-4296-87b3-9254f036358f-kube-api-access-vwl5j\") pod \"route-controller-manager-6576b87f9c-8smnx\" (UID: \"a2b52465-5f11-4296-87b3-9254f036358f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8smnx" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.352268 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-hm4g4\" (UID: \"ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hm4g4" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.352310 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ce03f742-2910-4d6c-af9b-97abf28c6fbc-audit-dir\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.352334 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2krm\" (UniqueName: 
\"kubernetes.io/projected/7a3ae274-744f-43f4-a579-a11a36b3eee7-kube-api-access-m2krm\") pod \"openshift-controller-manager-operator-756b6f6bc6-9pqtj\" (UID: \"7a3ae274-744f-43f4-a579-a11a36b3eee7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9pqtj" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.352350 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a3ae274-744f-43f4-a579-a11a36b3eee7-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-9pqtj\" (UID: \"7a3ae274-744f-43f4-a579-a11a36b3eee7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9pqtj" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.352380 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d5a65c5e-2929-4d8d-bebb-67af1702dbd4-encryption-config\") pod \"apiserver-7bbb656c7d-hsz75\" (UID: \"d5a65c5e-2929-4d8d-bebb-67af1702dbd4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.352190 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tqmgf"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.352540 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.353401 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tqmgf" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.354291 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.355056 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.355611 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-n8lrt"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.356243 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-n8lrt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.368507 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.375957 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-76btc"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.376627 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.381540 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-76btc" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.381953 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5zgjg"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.385664 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-m47gp"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.386229 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5zgjg" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.386293 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.386507 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m47gp" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.388109 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tcvn"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.388581 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tcvn" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.390046 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.394637 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rb8dx"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.395108 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rb8dx" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.396030 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-hmj4f"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.396638 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hmj4f" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.397665 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-dcmnq"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.398326 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dcmnq" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.399047 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-rkgxg"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.407988 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qpcfx"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.408228 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-rkgxg" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.409767 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-b6gdx"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.410246 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qpcfx" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.410704 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kw88q"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.410847 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b6gdx" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.411367 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bqplh"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.411885 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kw88q" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.413487 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hc74g"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.413851 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bqplh" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.414016 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.415207 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dq28h"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.415235 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7lngv"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.415466 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hc74g" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.416078 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-pc6pd"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.416199 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7lngv" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.416949 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmrdt"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.417097 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pc6pd" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.417647 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8679r"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.417764 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmrdt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.418748 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8fsdr"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.418915 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-8679r" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.419356 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8fsdr" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.420697 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332665-fv8l4"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.421291 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-fv8l4" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.422163 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8smnx"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.423767 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nlxfq"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.425319 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8c2dn"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.427279 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fl6f2"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.427286 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.428879 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4mkhn"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.430535 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9pqtj"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.432370 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-hm4g4"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.433764 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-klt8b"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.436277 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-dns-operator/dns-operator-744455d44c-lt4w6"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.437859 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-kv4v6"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.438862 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-884f6"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.440644 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-76btc"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.441932 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qdvnh"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.443071 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7fzvv"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.444249 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5zgjg"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.445499 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8679r"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.448319 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dcmnq"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.450795 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bqplh"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.450960 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 08 21:50:50 crc 
kubenswrapper[4739]: I1008 21:50:50.453286 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ce03f742-2910-4d6c-af9b-97abf28c6fbc-node-pullsecrets\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.453321 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ce03f742-2910-4d6c-af9b-97abf28c6fbc-image-import-ca\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.453346 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jxnc\" (UniqueName: \"kubernetes.io/projected/ff6946a6-cb0c-4be1-ba1d-319a64b6eba3-kube-api-access-9jxnc\") pod \"ingress-operator-5b745b69d9-4mkhn\" (UID: \"ff6946a6-cb0c-4be1-ba1d-319a64b6eba3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4mkhn" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.453370 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d5a65c5e-2929-4d8d-bebb-67af1702dbd4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hsz75\" (UID: \"d5a65c5e-2929-4d8d-bebb-67af1702dbd4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.453396 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d24rq\" (UniqueName: \"kubernetes.io/projected/44d375d6-9c45-466a-a423-b87b33bc63dd-kube-api-access-d24rq\") pod \"router-default-5444994796-zrm4p\" (UID: \"44d375d6-9c45-466a-a423-b87b33bc63dd\") " 
pod="openshift-ingress/router-default-5444994796-zrm4p" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.453431 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3738927-508b-431d-b0a9-8762d261f7f5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-5zgjg\" (UID: \"c3738927-508b-431d-b0a9-8762d261f7f5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5zgjg" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.453546 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ce03f742-2910-4d6c-af9b-97abf28c6fbc-node-pullsecrets\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.453577 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd2m7\" (UniqueName: \"kubernetes.io/projected/5205eeb0-f3c3-460a-8512-0af719bbf18a-kube-api-access-hd2m7\") pod \"dns-default-pc6pd\" (UID: \"5205eeb0-f3c3-460a-8512-0af719bbf18a\") " pod="openshift-dns/dns-default-pc6pd" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.453686 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d5a65c5e-2929-4d8d-bebb-67af1702dbd4-audit-dir\") pod \"apiserver-7bbb656c7d-hsz75\" (UID: \"d5a65c5e-2929-4d8d-bebb-67af1702dbd4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.453716 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzmdg\" (UniqueName: \"kubernetes.io/projected/ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc-kube-api-access-mzmdg\") pod 
\"authentication-operator-69f744f599-hm4g4\" (UID: \"ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hm4g4" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.453737 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2b52465-5f11-4296-87b3-9254f036358f-client-ca\") pod \"route-controller-manager-6576b87f9c-8smnx\" (UID: \"a2b52465-5f11-4296-87b3-9254f036358f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8smnx" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.453785 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ff6946a6-cb0c-4be1-ba1d-319a64b6eba3-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4mkhn\" (UID: \"ff6946a6-cb0c-4be1-ba1d-319a64b6eba3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4mkhn" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.453814 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d5a65c5e-2929-4d8d-bebb-67af1702dbd4-etcd-client\") pod \"apiserver-7bbb656c7d-hsz75\" (UID: \"d5a65c5e-2929-4d8d-bebb-67af1702dbd4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.453811 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d5a65c5e-2929-4d8d-bebb-67af1702dbd4-audit-dir\") pod \"apiserver-7bbb656c7d-hsz75\" (UID: \"d5a65c5e-2929-4d8d-bebb-67af1702dbd4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.454270 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mgssn"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.454566 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d5a65c5e-2929-4d8d-bebb-67af1702dbd4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hsz75\" (UID: \"d5a65c5e-2929-4d8d-bebb-67af1702dbd4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.454854 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ce03f742-2910-4d6c-af9b-97abf28c6fbc-image-import-ca\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.454078 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ce03f742-2910-4d6c-af9b-97abf28c6fbc-etcd-client\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.455350 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce03f742-2910-4d6c-af9b-97abf28c6fbc-trusted-ca-bundle\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.455393 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2b52465-5f11-4296-87b3-9254f036358f-config\") pod \"route-controller-manager-6576b87f9c-8smnx\" (UID: 
\"a2b52465-5f11-4296-87b3-9254f036358f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8smnx" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.455427 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ce03f742-2910-4d6c-af9b-97abf28c6fbc-etcd-serving-ca\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.455454 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e2155710-7d43-4b2b-b619-cedb531ea612-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-nlxfq\" (UID: \"e2155710-7d43-4b2b-b619-cedb531ea612\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nlxfq" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.455508 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0840280-c534-4e58-9095-a87e9acb799a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rb8dx\" (UID: \"a0840280-c534-4e58-9095-a87e9acb799a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rb8dx" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.455546 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ff6946a6-cb0c-4be1-ba1d-319a64b6eba3-trusted-ca\") pod \"ingress-operator-5b745b69d9-4mkhn\" (UID: \"ff6946a6-cb0c-4be1-ba1d-319a64b6eba3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4mkhn" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.455606 4739 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/44d375d6-9c45-466a-a423-b87b33bc63dd-stats-auth\") pod \"router-default-5444994796-zrm4p\" (UID: \"44d375d6-9c45-466a-a423-b87b33bc63dd\") " pod="openshift-ingress/router-default-5444994796-zrm4p" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.455644 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c03aab2-cb1d-463f-9a2b-2673df3ea8c4-config\") pod \"kube-apiserver-operator-766d6c64bb-76btc\" (UID: \"0c03aab2-cb1d-463f-9a2b-2673df3ea8c4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-76btc" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.455684 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2kpr\" (UniqueName: \"kubernetes.io/projected/a0840280-c534-4e58-9095-a87e9acb799a-kube-api-access-n2kpr\") pod \"control-plane-machine-set-operator-78cbb6b69f-rb8dx\" (UID: \"a0840280-c534-4e58-9095-a87e9acb799a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rb8dx" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.455725 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/44d375d6-9c45-466a-a423-b87b33bc63dd-default-certificate\") pod \"router-default-5444994796-zrm4p\" (UID: \"44d375d6-9c45-466a-a423-b87b33bc63dd\") " pod="openshift-ingress/router-default-5444994796-zrm4p" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.455764 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d5a65c5e-2929-4d8d-bebb-67af1702dbd4-audit-policies\") pod \"apiserver-7bbb656c7d-hsz75\" (UID: 
\"d5a65c5e-2929-4d8d-bebb-67af1702dbd4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.455788 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce03f742-2910-4d6c-af9b-97abf28c6fbc-config\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.455813 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5a29832f-5f50-4993-824c-50afa0213fb8-signing-cabundle\") pod \"service-ca-9c57cc56f-7lngv\" (UID: \"5a29832f-5f50-4993-824c-50afa0213fb8\") " pod="openshift-service-ca/service-ca-9c57cc56f-7lngv" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.455835 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c03aab2-cb1d-463f-9a2b-2673df3ea8c4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-76btc\" (UID: \"0c03aab2-cb1d-463f-9a2b-2673df3ea8c4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-76btc" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.455857 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5205eeb0-f3c3-460a-8512-0af719bbf18a-config-volume\") pod \"dns-default-pc6pd\" (UID: \"5205eeb0-f3c3-460a-8512-0af719bbf18a\") " pod="openshift-dns/dns-default-pc6pd" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.456116 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5a65c5e-2929-4d8d-bebb-67af1702dbd4-serving-cert\") 
pod \"apiserver-7bbb656c7d-hsz75\" (UID: \"d5a65c5e-2929-4d8d-bebb-67af1702dbd4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.456136 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce03f742-2910-4d6c-af9b-97abf28c6fbc-serving-cert\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.456183 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zngvt\" (UniqueName: \"kubernetes.io/projected/fac034e3-015b-4ac2-a698-9af0ac978f30-kube-api-access-zngvt\") pod \"olm-operator-6b444d44fb-hc74g\" (UID: \"fac034e3-015b-4ac2-a698-9af0ac978f30\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hc74g" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.456220 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnsbp\" (UniqueName: \"kubernetes.io/projected/e2155710-7d43-4b2b-b619-cedb531ea612-kube-api-access-mnsbp\") pod \"cluster-image-registry-operator-dc59b4c8b-nlxfq\" (UID: \"e2155710-7d43-4b2b-b619-cedb531ea612\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nlxfq" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.456245 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/34b38b7a-4e93-49f1-907e-24fc371f31e3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qpcfx\" (UID: \"34b38b7a-4e93-49f1-907e-24fc371f31e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-qpcfx" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.456269 4739 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2155710-7d43-4b2b-b619-cedb531ea612-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-nlxfq\" (UID: \"e2155710-7d43-4b2b-b619-cedb531ea612\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nlxfq" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.456289 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ce03f742-2910-4d6c-af9b-97abf28c6fbc-audit\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.456316 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ce03f742-2910-4d6c-af9b-97abf28c6fbc-encryption-config\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.456342 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fac034e3-015b-4ac2-a698-9af0ac978f30-srv-cert\") pod \"olm-operator-6b444d44fb-hc74g\" (UID: \"fac034e3-015b-4ac2-a698-9af0ac978f30\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hc74g" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.456366 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5a65c5e-2929-4d8d-bebb-67af1702dbd4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hsz75\" (UID: \"d5a65c5e-2929-4d8d-bebb-67af1702dbd4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 
21:50:50.456385 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5a29832f-5f50-4993-824c-50afa0213fb8-signing-key\") pod \"service-ca-9c57cc56f-7lngv\" (UID: \"5a29832f-5f50-4993-824c-50afa0213fb8\") " pod="openshift-service-ca/service-ca-9c57cc56f-7lngv" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.456406 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44d375d6-9c45-466a-a423-b87b33bc63dd-service-ca-bundle\") pod \"router-default-5444994796-zrm4p\" (UID: \"44d375d6-9c45-466a-a423-b87b33bc63dd\") " pod="openshift-ingress/router-default-5444994796-zrm4p" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.456430 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwzb6\" (UniqueName: \"kubernetes.io/projected/34b38b7a-4e93-49f1-907e-24fc371f31e3-kube-api-access-xwzb6\") pod \"marketplace-operator-79b997595-qpcfx\" (UID: \"34b38b7a-4e93-49f1-907e-24fc371f31e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-qpcfx" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.456457 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz7q6\" (UniqueName: \"kubernetes.io/projected/a454f893-33d3-4b64-af77-016e2bef05f2-kube-api-access-xz7q6\") pod \"machine-approver-56656f9798-v296d\" (UID: \"a454f893-33d3-4b64-af77-016e2bef05f2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v296d" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.456480 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k4dg\" (UniqueName: \"kubernetes.io/projected/bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05-kube-api-access-8k4dg\") pod 
\"controller-manager-879f6c89f-dq28h\" (UID: \"bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dq28h" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.456502 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5205eeb0-f3c3-460a-8512-0af719bbf18a-metrics-tls\") pod \"dns-default-pc6pd\" (UID: \"5205eeb0-f3c3-460a-8512-0af719bbf18a\") " pod="openshift-dns/dns-default-pc6pd" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.456521 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a454f893-33d3-4b64-af77-016e2bef05f2-auth-proxy-config\") pod \"machine-approver-56656f9798-v296d\" (UID: \"a454f893-33d3-4b64-af77-016e2bef05f2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v296d" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.456541 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05-serving-cert\") pod \"controller-manager-879f6c89f-dq28h\" (UID: \"bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dq28h" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.456563 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2155710-7d43-4b2b-b619-cedb531ea612-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-nlxfq\" (UID: \"e2155710-7d43-4b2b-b619-cedb531ea612\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nlxfq" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.456583 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/7a3ae274-744f-43f4-a579-a11a36b3eee7-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-9pqtj\" (UID: \"7a3ae274-744f-43f4-a579-a11a36b3eee7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9pqtj" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.456615 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c03aab2-cb1d-463f-9a2b-2673df3ea8c4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-76btc\" (UID: \"0c03aab2-cb1d-463f-9a2b-2673df3ea8c4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-76btc" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.456645 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a454f893-33d3-4b64-af77-016e2bef05f2-config\") pod \"machine-approver-56656f9798-v296d\" (UID: \"a454f893-33d3-4b64-af77-016e2bef05f2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v296d" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.456673 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34b38b7a-4e93-49f1-907e-24fc371f31e3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qpcfx\" (UID: \"34b38b7a-4e93-49f1-907e-24fc371f31e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-qpcfx" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.456698 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz2q4\" (UniqueName: \"kubernetes.io/projected/ce03f742-2910-4d6c-af9b-97abf28c6fbc-kube-api-access-rz2q4\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " 
pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.456719 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc-config\") pod \"authentication-operator-69f744f599-hm4g4\" (UID: \"ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hm4g4" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.456722 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ce03f742-2910-4d6c-af9b-97abf28c6fbc-etcd-serving-ca\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.457273 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5a65c5e-2929-4d8d-bebb-67af1702dbd4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hsz75\" (UID: \"d5a65c5e-2929-4d8d-bebb-67af1702dbd4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.456740 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shwpc\" (UniqueName: \"kubernetes.io/projected/c3738927-508b-431d-b0a9-8762d261f7f5-kube-api-access-shwpc\") pod \"kube-storage-version-migrator-operator-b67b599dd-5zgjg\" (UID: \"c3738927-508b-431d-b0a9-8762d261f7f5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5zgjg" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.457600 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tcvn"] Oct 08 
21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.457740 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2b52465-5f11-4296-87b3-9254f036358f-client-ca\") pod \"route-controller-manager-6576b87f9c-8smnx\" (UID: \"a2b52465-5f11-4296-87b3-9254f036358f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8smnx" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.456735 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce03f742-2910-4d6c-af9b-97abf28c6fbc-trusted-ca-bundle\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.458464 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d5a65c5e-2929-4d8d-bebb-67af1702dbd4-audit-policies\") pod \"apiserver-7bbb656c7d-hsz75\" (UID: \"d5a65c5e-2929-4d8d-bebb-67af1702dbd4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.458604 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7lngv"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.458651 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ce03f742-2910-4d6c-af9b-97abf28c6fbc-audit\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.459327 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a3ae274-744f-43f4-a579-a11a36b3eee7-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-9pqtj\" (UID: \"7a3ae274-744f-43f4-a579-a11a36b3eee7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9pqtj" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.459558 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ce03f742-2910-4d6c-af9b-97abf28c6fbc-etcd-client\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.459572 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d5a65c5e-2929-4d8d-bebb-67af1702dbd4-etcd-client\") pod \"apiserver-7bbb656c7d-hsz75\" (UID: \"d5a65c5e-2929-4d8d-bebb-67af1702dbd4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.459775 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc-service-ca-bundle\") pod \"authentication-operator-69f744f599-hm4g4\" (UID: \"ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hm4g4" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.460366 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc-config\") pod \"authentication-operator-69f744f599-hm4g4\" (UID: \"ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hm4g4" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.460380 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ce03f742-2910-4d6c-af9b-97abf28c6fbc-config\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.460611 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a454f893-33d3-4b64-af77-016e2bef05f2-machine-approver-tls\") pod \"machine-approver-56656f9798-v296d\" (UID: \"a454f893-33d3-4b64-af77-016e2bef05f2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v296d" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.460641 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc-service-ca-bundle\") pod \"authentication-operator-69f744f599-hm4g4\" (UID: \"ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hm4g4" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.460715 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc-serving-cert\") pod \"authentication-operator-69f744f599-hm4g4\" (UID: \"ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hm4g4" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.461681 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2155710-7d43-4b2b-b619-cedb531ea612-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-nlxfq\" (UID: \"e2155710-7d43-4b2b-b619-cedb531ea612\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nlxfq" Oct 08 21:50:50 crc kubenswrapper[4739]: 
I1008 21:50:50.461810 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fac034e3-015b-4ac2-a698-9af0ac978f30-profile-collector-cert\") pod \"olm-operator-6b444d44fb-hc74g\" (UID: \"fac034e3-015b-4ac2-a698-9af0ac978f30\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hc74g" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.461920 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05-config\") pod \"controller-manager-879f6c89f-dq28h\" (UID: \"bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dq28h" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.461984 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dq28h\" (UID: \"bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dq28h" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.462004 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2b52465-5f11-4296-87b3-9254f036358f-config\") pod \"route-controller-manager-6576b87f9c-8smnx\" (UID: \"a2b52465-5f11-4296-87b3-9254f036358f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8smnx" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.463160 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ce03f742-2910-4d6c-af9b-97abf28c6fbc-encryption-config\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " 
pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.463211 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-n8lrt"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.463210 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce03f742-2910-4d6c-af9b-97abf28c6fbc-serving-cert\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.463345 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwl5j\" (UniqueName: \"kubernetes.io/projected/a2b52465-5f11-4296-87b3-9254f036358f-kube-api-access-vwl5j\") pod \"route-controller-manager-6576b87f9c-8smnx\" (UID: \"a2b52465-5f11-4296-87b3-9254f036358f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8smnx" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.463351 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5a65c5e-2929-4d8d-bebb-67af1702dbd4-serving-cert\") pod \"apiserver-7bbb656c7d-hsz75\" (UID: \"d5a65c5e-2929-4d8d-bebb-67af1702dbd4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.463414 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-hm4g4\" (UID: \"ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hm4g4" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.463532 4739 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05-config\") pod \"controller-manager-879f6c89f-dq28h\" (UID: \"bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dq28h" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.463528 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ff6946a6-cb0c-4be1-ba1d-319a64b6eba3-trusted-ca\") pod \"ingress-operator-5b745b69d9-4mkhn\" (UID: \"ff6946a6-cb0c-4be1-ba1d-319a64b6eba3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4mkhn" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.463620 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ce03f742-2910-4d6c-af9b-97abf28c6fbc-audit-dir\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.463686 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2krm\" (UniqueName: \"kubernetes.io/projected/7a3ae274-744f-43f4-a579-a11a36b3eee7-kube-api-access-m2krm\") pod \"openshift-controller-manager-operator-756b6f6bc6-9pqtj\" (UID: \"7a3ae274-744f-43f4-a579-a11a36b3eee7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9pqtj" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.463753 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ce03f742-2910-4d6c-af9b-97abf28c6fbc-audit-dir\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.463793 4739 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a3ae274-744f-43f4-a579-a11a36b3eee7-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-9pqtj\" (UID: \"7a3ae274-744f-43f4-a579-a11a36b3eee7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9pqtj" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.464022 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d5a65c5e-2929-4d8d-bebb-67af1702dbd4-encryption-config\") pod \"apiserver-7bbb656c7d-hsz75\" (UID: \"d5a65c5e-2929-4d8d-bebb-67af1702dbd4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.464170 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv7vm\" (UniqueName: \"kubernetes.io/projected/d5a65c5e-2929-4d8d-bebb-67af1702dbd4-kube-api-access-hv7vm\") pod \"apiserver-7bbb656c7d-hsz75\" (UID: \"d5a65c5e-2929-4d8d-bebb-67af1702dbd4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.464228 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2b52465-5f11-4296-87b3-9254f036358f-serving-cert\") pod \"route-controller-manager-6576b87f9c-8smnx\" (UID: \"a2b52465-5f11-4296-87b3-9254f036358f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8smnx" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.464266 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44d375d6-9c45-466a-a423-b87b33bc63dd-metrics-certs\") pod \"router-default-5444994796-zrm4p\" (UID: 
\"44d375d6-9c45-466a-a423-b87b33bc63dd\") " pod="openshift-ingress/router-default-5444994796-zrm4p" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.464866 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc-serving-cert\") pod \"authentication-operator-69f744f599-hm4g4\" (UID: \"ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hm4g4" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.465134 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3738927-508b-431d-b0a9-8762d261f7f5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-5zgjg\" (UID: \"c3738927-508b-431d-b0a9-8762d261f7f5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5zgjg" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.466113 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05-client-ca\") pod \"controller-manager-879f6c89f-dq28h\" (UID: \"bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dq28h" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.466203 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff6946a6-cb0c-4be1-ba1d-319a64b6eba3-metrics-tls\") pod \"ingress-operator-5b745b69d9-4mkhn\" (UID: \"ff6946a6-cb0c-4be1-ba1d-319a64b6eba3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4mkhn" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.466240 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-mprtp\" (UniqueName: \"kubernetes.io/projected/5a29832f-5f50-4993-824c-50afa0213fb8-kube-api-access-mprtp\") pod \"service-ca-9c57cc56f-7lngv\" (UID: \"5a29832f-5f50-4993-824c-50afa0213fb8\") " pod="openshift-service-ca/service-ca-9c57cc56f-7lngv" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.466368 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a3ae274-744f-43f4-a579-a11a36b3eee7-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-9pqtj\" (UID: \"7a3ae274-744f-43f4-a579-a11a36b3eee7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9pqtj" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.466428 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qpcfx"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.467884 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dq28h\" (UID: \"bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dq28h" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.468206 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05-client-ca\") pod \"controller-manager-879f6c89f-dq28h\" (UID: \"bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dq28h" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.468681 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.468909 4739 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-hm4g4\" (UID: \"ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hm4g4" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.470111 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-hmj4f"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.470212 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff6946a6-cb0c-4be1-ba1d-319a64b6eba3-metrics-tls\") pod \"ingress-operator-5b745b69d9-4mkhn\" (UID: \"ff6946a6-cb0c-4be1-ba1d-319a64b6eba3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4mkhn" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.470275 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d5a65c5e-2929-4d8d-bebb-67af1702dbd4-encryption-config\") pod \"apiserver-7bbb656c7d-hsz75\" (UID: \"d5a65c5e-2929-4d8d-bebb-67af1702dbd4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.470406 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05-serving-cert\") pod \"controller-manager-879f6c89f-dq28h\" (UID: \"bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dq28h" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.471669 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tqmgf"] Oct 08 21:50:50 crc 
kubenswrapper[4739]: I1008 21:50:50.471784 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2b52465-5f11-4296-87b3-9254f036358f-serving-cert\") pod \"route-controller-manager-6576b87f9c-8smnx\" (UID: \"a2b52465-5f11-4296-87b3-9254f036358f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8smnx" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.472558 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e2155710-7d43-4b2b-b619-cedb531ea612-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-nlxfq\" (UID: \"e2155710-7d43-4b2b-b619-cedb531ea612\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nlxfq" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.472767 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-bdt5r"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.473554 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-bdt5r" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.473979 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bdrcp"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.475039 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-bdrcp" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.475233 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bdrcp"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.476522 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8fsdr"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.477871 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pc6pd"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.479077 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hc74g"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.480232 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmrdt"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.481387 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rb8dx"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.482377 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-m47gp"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.483493 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-rkgxg"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.484586 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kw88q"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.486003 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332665-fv8l4"] Oct 08 21:50:50 crc 
kubenswrapper[4739]: I1008 21:50:50.486876 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-b6gdx"] Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.488547 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.507401 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.527415 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.545995 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"7bc2bbeb16f1990e331778ee0cb30f96323cd342e96e925d05f10782cdaf809b"} Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.546764 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.548224 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"bac2bcbaf26d2e87f2d9500197bd65625ed77ec22620941e73001c830d7ea9b0"} Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.548502 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.549442 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e36bb5f4dc1777ef772569a1216ae9fbb62d68056ccf4b5d25682d7e85628281"} Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.567046 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5a29832f-5f50-4993-824c-50afa0213fb8-signing-cabundle\") pod \"service-ca-9c57cc56f-7lngv\" (UID: \"5a29832f-5f50-4993-824c-50afa0213fb8\") " pod="openshift-service-ca/service-ca-9c57cc56f-7lngv" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.567085 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c03aab2-cb1d-463f-9a2b-2673df3ea8c4-config\") pod \"kube-apiserver-operator-766d6c64bb-76btc\" (UID: \"0c03aab2-cb1d-463f-9a2b-2673df3ea8c4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-76btc" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.567110 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2kpr\" (UniqueName: \"kubernetes.io/projected/a0840280-c534-4e58-9095-a87e9acb799a-kube-api-access-n2kpr\") pod \"control-plane-machine-set-operator-78cbb6b69f-rb8dx\" (UID: \"a0840280-c534-4e58-9095-a87e9acb799a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rb8dx" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.567130 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/44d375d6-9c45-466a-a423-b87b33bc63dd-default-certificate\") pod \"router-default-5444994796-zrm4p\" (UID: \"44d375d6-9c45-466a-a423-b87b33bc63dd\") " pod="openshift-ingress/router-default-5444994796-zrm4p" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.567172 4739 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-config-operator"/"kube-root-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.567242 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c03aab2-cb1d-463f-9a2b-2673df3ea8c4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-76btc\" (UID: \"0c03aab2-cb1d-463f-9a2b-2673df3ea8c4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-76btc" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.567273 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5205eeb0-f3c3-460a-8512-0af719bbf18a-config-volume\") pod \"dns-default-pc6pd\" (UID: \"5205eeb0-f3c3-460a-8512-0af719bbf18a\") " pod="openshift-dns/dns-default-pc6pd" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.567328 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/34b38b7a-4e93-49f1-907e-24fc371f31e3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qpcfx\" (UID: \"34b38b7a-4e93-49f1-907e-24fc371f31e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-qpcfx" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.567350 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zngvt\" (UniqueName: \"kubernetes.io/projected/fac034e3-015b-4ac2-a698-9af0ac978f30-kube-api-access-zngvt\") pod \"olm-operator-6b444d44fb-hc74g\" (UID: \"fac034e3-015b-4ac2-a698-9af0ac978f30\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hc74g" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.567423 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fac034e3-015b-4ac2-a698-9af0ac978f30-srv-cert\") pod 
\"olm-operator-6b444d44fb-hc74g\" (UID: \"fac034e3-015b-4ac2-a698-9af0ac978f30\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hc74g" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.567445 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5a29832f-5f50-4993-824c-50afa0213fb8-signing-key\") pod \"service-ca-9c57cc56f-7lngv\" (UID: \"5a29832f-5f50-4993-824c-50afa0213fb8\") " pod="openshift-service-ca/service-ca-9c57cc56f-7lngv" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.567462 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44d375d6-9c45-466a-a423-b87b33bc63dd-service-ca-bundle\") pod \"router-default-5444994796-zrm4p\" (UID: \"44d375d6-9c45-466a-a423-b87b33bc63dd\") " pod="openshift-ingress/router-default-5444994796-zrm4p" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.567503 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz7q6\" (UniqueName: \"kubernetes.io/projected/a454f893-33d3-4b64-af77-016e2bef05f2-kube-api-access-xz7q6\") pod \"machine-approver-56656f9798-v296d\" (UID: \"a454f893-33d3-4b64-af77-016e2bef05f2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v296d" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.567522 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwzb6\" (UniqueName: \"kubernetes.io/projected/34b38b7a-4e93-49f1-907e-24fc371f31e3-kube-api-access-xwzb6\") pod \"marketplace-operator-79b997595-qpcfx\" (UID: \"34b38b7a-4e93-49f1-907e-24fc371f31e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-qpcfx" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.567546 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/5205eeb0-f3c3-460a-8512-0af719bbf18a-metrics-tls\") pod \"dns-default-pc6pd\" (UID: \"5205eeb0-f3c3-460a-8512-0af719bbf18a\") " pod="openshift-dns/dns-default-pc6pd" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.567582 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a454f893-33d3-4b64-af77-016e2bef05f2-auth-proxy-config\") pod \"machine-approver-56656f9798-v296d\" (UID: \"a454f893-33d3-4b64-af77-016e2bef05f2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v296d" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.567610 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c03aab2-cb1d-463f-9a2b-2673df3ea8c4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-76btc\" (UID: \"0c03aab2-cb1d-463f-9a2b-2673df3ea8c4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-76btc" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.567635 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shwpc\" (UniqueName: \"kubernetes.io/projected/c3738927-508b-431d-b0a9-8762d261f7f5-kube-api-access-shwpc\") pod \"kube-storage-version-migrator-operator-b67b599dd-5zgjg\" (UID: \"c3738927-508b-431d-b0a9-8762d261f7f5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5zgjg" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.567674 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a454f893-33d3-4b64-af77-016e2bef05f2-config\") pod \"machine-approver-56656f9798-v296d\" (UID: \"a454f893-33d3-4b64-af77-016e2bef05f2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v296d" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 
21:50:50.567690 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34b38b7a-4e93-49f1-907e-24fc371f31e3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qpcfx\" (UID: \"34b38b7a-4e93-49f1-907e-24fc371f31e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-qpcfx" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.567711 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a454f893-33d3-4b64-af77-016e2bef05f2-machine-approver-tls\") pod \"machine-approver-56656f9798-v296d\" (UID: \"a454f893-33d3-4b64-af77-016e2bef05f2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v296d" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.567755 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fac034e3-015b-4ac2-a698-9af0ac978f30-profile-collector-cert\") pod \"olm-operator-6b444d44fb-hc74g\" (UID: \"fac034e3-015b-4ac2-a698-9af0ac978f30\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hc74g" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.567815 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44d375d6-9c45-466a-a423-b87b33bc63dd-metrics-certs\") pod \"router-default-5444994796-zrm4p\" (UID: \"44d375d6-9c45-466a-a423-b87b33bc63dd\") " pod="openshift-ingress/router-default-5444994796-zrm4p" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.567836 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3738927-508b-431d-b0a9-8762d261f7f5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-5zgjg\" (UID: 
\"c3738927-508b-431d-b0a9-8762d261f7f5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5zgjg" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.567856 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mprtp\" (UniqueName: \"kubernetes.io/projected/5a29832f-5f50-4993-824c-50afa0213fb8-kube-api-access-mprtp\") pod \"service-ca-9c57cc56f-7lngv\" (UID: \"5a29832f-5f50-4993-824c-50afa0213fb8\") " pod="openshift-service-ca/service-ca-9c57cc56f-7lngv" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.567904 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd2m7\" (UniqueName: \"kubernetes.io/projected/5205eeb0-f3c3-460a-8512-0af719bbf18a-kube-api-access-hd2m7\") pod \"dns-default-pc6pd\" (UID: \"5205eeb0-f3c3-460a-8512-0af719bbf18a\") " pod="openshift-dns/dns-default-pc6pd" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.567926 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d24rq\" (UniqueName: \"kubernetes.io/projected/44d375d6-9c45-466a-a423-b87b33bc63dd-kube-api-access-d24rq\") pod \"router-default-5444994796-zrm4p\" (UID: \"44d375d6-9c45-466a-a423-b87b33bc63dd\") " pod="openshift-ingress/router-default-5444994796-zrm4p" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.567942 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3738927-508b-431d-b0a9-8762d261f7f5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-5zgjg\" (UID: \"c3738927-508b-431d-b0a9-8762d261f7f5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5zgjg" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.567997 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0840280-c534-4e58-9095-a87e9acb799a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rb8dx\" (UID: \"a0840280-c534-4e58-9095-a87e9acb799a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rb8dx" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.568025 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/44d375d6-9c45-466a-a423-b87b33bc63dd-stats-auth\") pod \"router-default-5444994796-zrm4p\" (UID: \"44d375d6-9c45-466a-a423-b87b33bc63dd\") " pod="openshift-ingress/router-default-5444994796-zrm4p" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.568172 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a454f893-33d3-4b64-af77-016e2bef05f2-auth-proxy-config\") pod \"machine-approver-56656f9798-v296d\" (UID: \"a454f893-33d3-4b64-af77-016e2bef05f2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v296d" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.568205 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44d375d6-9c45-466a-a423-b87b33bc63dd-service-ca-bundle\") pod \"router-default-5444994796-zrm4p\" (UID: \"44d375d6-9c45-466a-a423-b87b33bc63dd\") " pod="openshift-ingress/router-default-5444994796-zrm4p" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.568714 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a454f893-33d3-4b64-af77-016e2bef05f2-config\") pod \"machine-approver-56656f9798-v296d\" (UID: \"a454f893-33d3-4b64-af77-016e2bef05f2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v296d" Oct 08 21:50:50 crc 
kubenswrapper[4739]: I1008 21:50:50.570271 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/44d375d6-9c45-466a-a423-b87b33bc63dd-default-certificate\") pod \"router-default-5444994796-zrm4p\" (UID: \"44d375d6-9c45-466a-a423-b87b33bc63dd\") " pod="openshift-ingress/router-default-5444994796-zrm4p" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.570430 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/44d375d6-9c45-466a-a423-b87b33bc63dd-stats-auth\") pod \"router-default-5444994796-zrm4p\" (UID: \"44d375d6-9c45-466a-a423-b87b33bc63dd\") " pod="openshift-ingress/router-default-5444994796-zrm4p" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.571508 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44d375d6-9c45-466a-a423-b87b33bc63dd-metrics-certs\") pod \"router-default-5444994796-zrm4p\" (UID: \"44d375d6-9c45-466a-a423-b87b33bc63dd\") " pod="openshift-ingress/router-default-5444994796-zrm4p" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.571715 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a454f893-33d3-4b64-af77-016e2bef05f2-machine-approver-tls\") pod \"machine-approver-56656f9798-v296d\" (UID: \"a454f893-33d3-4b64-af77-016e2bef05f2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v296d" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.606944 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.628204 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 08 
21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.652604 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.668115 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.688088 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.707505 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.727813 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.748349 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.767933 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.787975 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.807823 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.836022 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.847470 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.867796 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.888097 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.908180 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.927763 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.947847 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.953039 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c03aab2-cb1d-463f-9a2b-2673df3ea8c4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-76btc\" (UID: \"0c03aab2-cb1d-463f-9a2b-2673df3ea8c4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-76btc"
Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.968416 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.978196 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c03aab2-cb1d-463f-9a2b-2673df3ea8c4-config\") pod \"kube-apiserver-operator-766d6c64bb-76btc\" (UID: \"0c03aab2-cb1d-463f-9a2b-2673df3ea8c4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-76btc"
Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.988306 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Oct 08 21:50:50 crc kubenswrapper[4739]: I1008 21:50:50.991906 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3738927-508b-431d-b0a9-8762d261f7f5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-5zgjg\" (UID: \"c3738927-508b-431d-b0a9-8762d261f7f5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5zgjg"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.007457 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.028060 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.047562 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.049908 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3738927-508b-431d-b0a9-8762d261f7f5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-5zgjg\" (UID: \"c3738927-508b-431d-b0a9-8762d261f7f5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5zgjg"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.068510 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.087755 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.107808 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.127453 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.148067 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.167138 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.188104 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.208057 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.228121 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.233258 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0840280-c534-4e58-9095-a87e9acb799a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rb8dx\" (UID: \"a0840280-c534-4e58-9095-a87e9acb799a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rb8dx"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.247992 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.267831 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.288213 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.307681 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.327841 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.348098 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.368359 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.387833 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.406012 4739 request.go:700] Waited for 1.007481753s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.407671 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.428109 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.448645 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.468504 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.488808 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.501603 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/34b38b7a-4e93-49f1-907e-24fc371f31e3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qpcfx\" (UID: \"34b38b7a-4e93-49f1-907e-24fc371f31e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-qpcfx"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.507980 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.536675 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.545408 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34b38b7a-4e93-49f1-907e-24fc371f31e3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qpcfx\" (UID: \"34b38b7a-4e93-49f1-907e-24fc371f31e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-qpcfx"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.547652 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Oct 08 21:50:51 crc kubenswrapper[4739]: E1008 21:50:51.567280 4739 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.567341 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Oct 08 21:50:51 crc kubenswrapper[4739]: E1008 21:50:51.567352 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5a29832f-5f50-4993-824c-50afa0213fb8-signing-cabundle podName:5a29832f-5f50-4993-824c-50afa0213fb8 nodeName:}" failed. No retries permitted until 2025-10-08 21:50:52.067334867 +0000 UTC m=+151.892720617 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/5a29832f-5f50-4993-824c-50afa0213fb8-signing-cabundle") pod "service-ca-9c57cc56f-7lngv" (UID: "5a29832f-5f50-4993-824c-50afa0213fb8") : failed to sync configmap cache: timed out waiting for the condition
Oct 08 21:50:51 crc kubenswrapper[4739]: E1008 21:50:51.567465 4739 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition
Oct 08 21:50:51 crc kubenswrapper[4739]: E1008 21:50:51.567507 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5205eeb0-f3c3-460a-8512-0af719bbf18a-config-volume podName:5205eeb0-f3c3-460a-8512-0af719bbf18a nodeName:}" failed. No retries permitted until 2025-10-08 21:50:52.067492531 +0000 UTC m=+151.892878281 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/5205eeb0-f3c3-460a-8512-0af719bbf18a-config-volume") pod "dns-default-pc6pd" (UID: "5205eeb0-f3c3-460a-8512-0af719bbf18a") : failed to sync configmap cache: timed out waiting for the condition
Oct 08 21:50:51 crc kubenswrapper[4739]: E1008 21:50:51.567530 4739 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition
Oct 08 21:50:51 crc kubenswrapper[4739]: E1008 21:50:51.567558 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a29832f-5f50-4993-824c-50afa0213fb8-signing-key podName:5a29832f-5f50-4993-824c-50afa0213fb8 nodeName:}" failed. No retries permitted until 2025-10-08 21:50:52.067549922 +0000 UTC m=+151.892935792 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/5a29832f-5f50-4993-824c-50afa0213fb8-signing-key") pod "service-ca-9c57cc56f-7lngv" (UID: "5a29832f-5f50-4993-824c-50afa0213fb8") : failed to sync secret cache: timed out waiting for the condition
Oct 08 21:50:51 crc kubenswrapper[4739]: E1008 21:50:51.567571 4739 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Oct 08 21:50:51 crc kubenswrapper[4739]: E1008 21:50:51.567600 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fac034e3-015b-4ac2-a698-9af0ac978f30-srv-cert podName:fac034e3-015b-4ac2-a698-9af0ac978f30 nodeName:}" failed. No retries permitted until 2025-10-08 21:50:52.067594664 +0000 UTC m=+151.892980414 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/fac034e3-015b-4ac2-a698-9af0ac978f30-srv-cert") pod "olm-operator-6b444d44fb-hc74g" (UID: "fac034e3-015b-4ac2-a698-9af0ac978f30") : failed to sync secret cache: timed out waiting for the condition
Oct 08 21:50:51 crc kubenswrapper[4739]: E1008 21:50:51.568453 4739 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition
Oct 08 21:50:51 crc kubenswrapper[4739]: E1008 21:50:51.568572 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5205eeb0-f3c3-460a-8512-0af719bbf18a-metrics-tls podName:5205eeb0-f3c3-460a-8512-0af719bbf18a nodeName:}" failed. No retries permitted until 2025-10-08 21:50:52.068541317 +0000 UTC m=+151.893927097 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5205eeb0-f3c3-460a-8512-0af719bbf18a-metrics-tls") pod "dns-default-pc6pd" (UID: "5205eeb0-f3c3-460a-8512-0af719bbf18a") : failed to sync secret cache: timed out waiting for the condition
Oct 08 21:50:51 crc kubenswrapper[4739]: E1008 21:50:51.569588 4739 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition
Oct 08 21:50:51 crc kubenswrapper[4739]: E1008 21:50:51.569676 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fac034e3-015b-4ac2-a698-9af0ac978f30-profile-collector-cert podName:fac034e3-015b-4ac2-a698-9af0ac978f30 nodeName:}" failed. No retries permitted until 2025-10-08 21:50:52.069657085 +0000 UTC m=+151.895042835 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/fac034e3-015b-4ac2-a698-9af0ac978f30-profile-collector-cert") pod "olm-operator-6b444d44fb-hc74g" (UID: "fac034e3-015b-4ac2-a698-9af0ac978f30") : failed to sync secret cache: timed out waiting for the condition
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.587944 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.608239 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.628042 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.647750 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.667851 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.688608 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.708181 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.727820 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.747692 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.766805 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.766879 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.766921 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.787891 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.808362 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.828571 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.847994 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.867826 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.888328 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.908001 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.927575 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.948745 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.969528 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Oct 08 21:50:51 crc kubenswrapper[4739]: I1008 21:50:51.988625 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.007574 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.047209 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.068602 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.084708 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5a29832f-5f50-4993-824c-50afa0213fb8-signing-cabundle\") pod \"service-ca-9c57cc56f-7lngv\" (UID: \"5a29832f-5f50-4993-824c-50afa0213fb8\") " pod="openshift-service-ca/service-ca-9c57cc56f-7lngv"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.084754 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5205eeb0-f3c3-460a-8512-0af719bbf18a-config-volume\") pod \"dns-default-pc6pd\" (UID: \"5205eeb0-f3c3-460a-8512-0af719bbf18a\") " pod="openshift-dns/dns-default-pc6pd"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.084790 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fac034e3-015b-4ac2-a698-9af0ac978f30-srv-cert\") pod \"olm-operator-6b444d44fb-hc74g\" (UID: \"fac034e3-015b-4ac2-a698-9af0ac978f30\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hc74g"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.084809 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5a29832f-5f50-4993-824c-50afa0213fb8-signing-key\") pod \"service-ca-9c57cc56f-7lngv\" (UID: \"5a29832f-5f50-4993-824c-50afa0213fb8\") " pod="openshift-service-ca/service-ca-9c57cc56f-7lngv"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.084839 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5205eeb0-f3c3-460a-8512-0af719bbf18a-metrics-tls\") pod \"dns-default-pc6pd\" (UID: \"5205eeb0-f3c3-460a-8512-0af719bbf18a\") " pod="openshift-dns/dns-default-pc6pd"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.084877 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fac034e3-015b-4ac2-a698-9af0ac978f30-profile-collector-cert\") pod \"olm-operator-6b444d44fb-hc74g\" (UID: \"fac034e3-015b-4ac2-a698-9af0ac978f30\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hc74g"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.085657 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5205eeb0-f3c3-460a-8512-0af719bbf18a-config-volume\") pod \"dns-default-pc6pd\" (UID: \"5205eeb0-f3c3-460a-8512-0af719bbf18a\") " pod="openshift-dns/dns-default-pc6pd"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.086054 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5a29832f-5f50-4993-824c-50afa0213fb8-signing-cabundle\") pod \"service-ca-9c57cc56f-7lngv\" (UID: \"5a29832f-5f50-4993-824c-50afa0213fb8\") " pod="openshift-service-ca/service-ca-9c57cc56f-7lngv"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.086915 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.088450 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fac034e3-015b-4ac2-a698-9af0ac978f30-profile-collector-cert\") pod \"olm-operator-6b444d44fb-hc74g\" (UID: \"fac034e3-015b-4ac2-a698-9af0ac978f30\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hc74g"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.088742 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5205eeb0-f3c3-460a-8512-0af719bbf18a-metrics-tls\") pod \"dns-default-pc6pd\" (UID: \"5205eeb0-f3c3-460a-8512-0af719bbf18a\") " pod="openshift-dns/dns-default-pc6pd"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.089662 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fac034e3-015b-4ac2-a698-9af0ac978f30-srv-cert\") pod \"olm-operator-6b444d44fb-hc74g\" (UID: \"fac034e3-015b-4ac2-a698-9af0ac978f30\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hc74g"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.090351 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5a29832f-5f50-4993-824c-50afa0213fb8-signing-key\") pod \"service-ca-9c57cc56f-7lngv\" (UID: \"5a29832f-5f50-4993-824c-50afa0213fb8\") " pod="openshift-service-ca/service-ca-9c57cc56f-7lngv"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.108308 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.127629 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.147394 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.168279 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.201684 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jxnc\" (UniqueName: \"kubernetes.io/projected/ff6946a6-cb0c-4be1-ba1d-319a64b6eba3-kube-api-access-9jxnc\") pod \"ingress-operator-5b745b69d9-4mkhn\" (UID: \"ff6946a6-cb0c-4be1-ba1d-319a64b6eba3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4mkhn"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.220014 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzmdg\" (UniqueName: \"kubernetes.io/projected/ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc-kube-api-access-mzmdg\") pod \"authentication-operator-69f744f599-hm4g4\" (UID: \"ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hm4g4"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.251318 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ff6946a6-cb0c-4be1-ba1d-319a64b6eba3-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4mkhn\" (UID: \"ff6946a6-cb0c-4be1-ba1d-319a64b6eba3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4mkhn"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.261675 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnsbp\" (UniqueName: \"kubernetes.io/projected/e2155710-7d43-4b2b-b619-cedb531ea612-kube-api-access-mnsbp\") pod \"cluster-image-registry-operator-dc59b4c8b-nlxfq\" (UID: \"e2155710-7d43-4b2b-b619-cedb531ea612\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nlxfq"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.293969 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz2q4\" (UniqueName: \"kubernetes.io/projected/ce03f742-2910-4d6c-af9b-97abf28c6fbc-kube-api-access-rz2q4\") pod \"apiserver-76f77b778f-dv7wz\" (UID: \"ce03f742-2910-4d6c-af9b-97abf28c6fbc\") " pod="openshift-apiserver/apiserver-76f77b778f-dv7wz"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.302596 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k4dg\" (UniqueName: \"kubernetes.io/projected/bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05-kube-api-access-8k4dg\") pod \"controller-manager-879f6c89f-dq28h\" (UID: \"bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dq28h"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.320536 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2155710-7d43-4b2b-b619-cedb531ea612-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-nlxfq\" (UID: \"e2155710-7d43-4b2b-b619-cedb531ea612\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nlxfq"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.341677 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwl5j\" (UniqueName: \"kubernetes.io/projected/a2b52465-5f11-4296-87b3-9254f036358f-kube-api-access-vwl5j\") pod \"route-controller-manager-6576b87f9c-8smnx\" (UID: \"a2b52465-5f11-4296-87b3-9254f036358f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8smnx"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.345224 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-dv7wz"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.364079 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2krm\" (UniqueName: \"kubernetes.io/projected/7a3ae274-744f-43f4-a579-a11a36b3eee7-kube-api-access-m2krm\") pod \"openshift-controller-manager-operator-756b6f6bc6-9pqtj\" (UID: \"7a3ae274-744f-43f4-a579-a11a36b3eee7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9pqtj"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.373886 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dq28h"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.383885 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8smnx"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.387941 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv7vm\" (UniqueName: \"kubernetes.io/projected/d5a65c5e-2929-4d8d-bebb-67af1702dbd4-kube-api-access-hv7vm\") pod \"apiserver-7bbb656c7d-hsz75\" (UID: \"d5a65c5e-2929-4d8d-bebb-67af1702dbd4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.388407 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.406169 4739 request.go:700] Waited for 1.932340099s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-server-tls&limit=500&resourceVersion=0
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.407751 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.429302 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.429646 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nlxfq"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.443772 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4mkhn"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.448756 4739 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.457530 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9pqtj"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.468485 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.476924 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-hm4g4"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.487832 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.525345 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2kpr\" (UniqueName: \"kubernetes.io/projected/a0840280-c534-4e58-9095-a87e9acb799a-kube-api-access-n2kpr\") pod \"control-plane-machine-set-operator-78cbb6b69f-rb8dx\" (UID: \"a0840280-c534-4e58-9095-a87e9acb799a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rb8dx"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.546702 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c03aab2-cb1d-463f-9a2b-2673df3ea8c4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-76btc\" (UID: \"0c03aab2-cb1d-463f-9a2b-2673df3ea8c4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-76btc"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.569740 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zngvt\" (UniqueName: \"kubernetes.io/projected/fac034e3-015b-4ac2-a698-9af0ac978f30-kube-api-access-zngvt\") pod \"olm-operator-6b444d44fb-hc74g\" (UID: \"fac034e3-015b-4ac2-a698-9af0ac978f30\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hc74g"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.575094 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-dv7wz"]
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.587824 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz7q6\" (UniqueName: \"kubernetes.io/projected/a454f893-33d3-4b64-af77-016e2bef05f2-kube-api-access-xz7q6\") pod \"machine-approver-56656f9798-v296d\" (UID: \"a454f893-33d3-4b64-af77-016e2bef05f2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v296d"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.589191 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dq28h"]
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.612276 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-76btc"
Oct 08 21:50:52 crc kubenswrapper[4739]: W1008 21:50:52.612700 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb6a9ed0_3db4_44cf_aa70_1933ae5f7e05.slice/crio-c0d5e19729c154f130360d4cbb454eed8134e0db9e0803b11afa08ac5670f2ec WatchSource:0}: Error finding container c0d5e19729c154f130360d4cbb454eed8134e0db9e0803b11afa08ac5670f2ec: Status 404 returned error can't find the container with id c0d5e19729c154f130360d4cbb454eed8134e0db9e0803b11afa08ac5670f2ec
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.614030 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwzb6\" (UniqueName: \"kubernetes.io/projected/34b38b7a-4e93-49f1-907e-24fc371f31e3-kube-api-access-xwzb6\") pod \"marketplace-operator-79b997595-qpcfx\" (UID: \"34b38b7a-4e93-49f1-907e-24fc371f31e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-qpcfx"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.626989 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd2m7\" (UniqueName: \"kubernetes.io/projected/5205eeb0-f3c3-460a-8512-0af719bbf18a-kube-api-access-hd2m7\") pod \"dns-default-pc6pd\" (UID: \"5205eeb0-f3c3-460a-8512-0af719bbf18a\") " pod="openshift-dns/dns-default-pc6pd"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.627556 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8smnx"]
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.641189 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rb8dx"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.642001 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mprtp\" (UniqueName: \"kubernetes.io/projected/5a29832f-5f50-4993-824c-50afa0213fb8-kube-api-access-mprtp\") pod \"service-ca-9c57cc56f-7lngv\" (UID: \"5a29832f-5f50-4993-824c-50afa0213fb8\") " pod="openshift-service-ca/service-ca-9c57cc56f-7lngv"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.653478 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.660096 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shwpc\" (UniqueName: \"kubernetes.io/projected/c3738927-508b-431d-b0a9-8762d261f7f5-kube-api-access-shwpc\") pod \"kube-storage-version-migrator-operator-b67b599dd-5zgjg\" (UID: \"c3738927-508b-431d-b0a9-8762d261f7f5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5zgjg"
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.682445 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d24rq\" (UniqueName: \"kubernetes.io/projected/44d375d6-9c45-466a-a423-b87b33bc63dd-kube-api-access-d24rq\") pod \"router-default-5444994796-zrm4p\" (UID: \"44d375d6-9c45-466a-a423-b87b33bc63dd\") " pod="openshift-ingress/router-default-5444994796-zrm4p"
Oct 08 21:50:52 crc kubenswrapper[4739]:
I1008 21:50:52.689276 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qpcfx" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.720741 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hc74g" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.726615 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7lngv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.734087 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pc6pd" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.791948 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d8b872b-6b5c-4091-8a35-8f2bd3257243-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8c2dn\" (UID: \"9d8b872b-6b5c-4091-8a35-8f2bd3257243\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8c2dn" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.791985 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e3d665b0-ab57-47b7-9a58-9c6c150d6105-console-oauth-config\") pod \"console-f9d7485db-n8lrt\" (UID: \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\") " pod="openshift-console/console-f9d7485db-n8lrt" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792002 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/295df770-129a-4aca-a189-3f4d77e97fc9-serving-cert\") pod \"console-operator-58897d9998-884f6\" (UID: \"295df770-129a-4aca-a189-3f4d77e97fc9\") " 
pod="openshift-console-operator/console-operator-58897d9998-884f6" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792017 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d8b872b-6b5c-4091-8a35-8f2bd3257243-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8c2dn\" (UID: \"9d8b872b-6b5c-4091-8a35-8f2bd3257243\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8c2dn" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792047 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0e25dea3-95bd-4316-baef-f9bf7726b8e7-available-featuregates\") pod \"openshift-config-operator-7777fb866f-kv4v6\" (UID: \"0e25dea3-95bd-4316-baef-f9bf7726b8e7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kv4v6" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792205 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27ccq\" (UniqueName: \"kubernetes.io/projected/f6fdf42e-8623-4102-8d38-c22c6c3d6978-kube-api-access-27ccq\") pod \"downloads-7954f5f757-qdvnh\" (UID: \"f6fdf42e-8623-4102-8d38-c22c6c3d6978\") " pod="openshift-console/downloads-7954f5f757-qdvnh" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792266 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792290 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c8c70167-b995-419b-afbe-a58dd78ebe48-etcd-client\") pod \"etcd-operator-b45778765-klt8b\" (UID: \"c8c70167-b995-419b-afbe-a58dd78ebe48\") " pod="openshift-etcd-operator/etcd-operator-b45778765-klt8b" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792324 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/295df770-129a-4aca-a189-3f4d77e97fc9-trusted-ca\") pod \"console-operator-58897d9998-884f6\" (UID: \"295df770-129a-4aca-a189-3f4d77e97fc9\") " pod="openshift-console-operator/console-operator-58897d9998-884f6" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792350 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vgps\" (UniqueName: \"kubernetes.io/projected/015086a5-5aff-4732-a198-26d0b29d1253-kube-api-access-9vgps\") pod \"migrator-59844c95c7-hmj4f\" (UID: \"015086a5-5aff-4732-a198-26d0b29d1253\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hmj4f" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792392 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b07e93ff-e5b0-4dba-b705-3b50c5f977f0-config\") pod \"service-ca-operator-777779d784-b6gdx\" (UID: \"b07e93ff-e5b0-4dba-b705-3b50c5f977f0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b6gdx" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792410 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1ed5fea0-a39d-4416-8391-612dd5149de4-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kw88q\" (UID: 
\"1ed5fea0-a39d-4416-8391-612dd5149de4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kw88q" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792428 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7prc\" (UniqueName: \"kubernetes.io/projected/595d6e92-80ce-40bf-8409-b50226a672ab-kube-api-access-c7prc\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792455 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-audit-dir\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792493 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792523 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/602cffd4-e6ff-442c-80e1-2122748972d1-profile-collector-cert\") pod \"catalog-operator-68c6474976-dmrdt\" (UID: \"602cffd4-e6ff-442c-80e1-2122748972d1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmrdt" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792540 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6r6c\" (UniqueName: \"kubernetes.io/projected/295df770-129a-4aca-a189-3f4d77e97fc9-kube-api-access-d6r6c\") pod \"console-operator-58897d9998-884f6\" (UID: \"295df770-129a-4aca-a189-3f4d77e97fc9\") " pod="openshift-console-operator/console-operator-58897d9998-884f6" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792554 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d29np\" (UniqueName: \"kubernetes.io/projected/26f8967c-81df-4899-b6d5-825e39644e01-kube-api-access-d29np\") pod \"package-server-manager-789f6589d5-bqplh\" (UID: \"26f8967c-81df-4899-b6d5-825e39644e01\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bqplh" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792570 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e3d665b0-ab57-47b7-9a58-9c6c150d6105-console-config\") pod \"console-f9d7485db-n8lrt\" (UID: \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\") " pod="openshift-console/console-f9d7485db-n8lrt" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792584 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ed5fea0-a39d-4416-8391-612dd5149de4-proxy-tls\") pod \"machine-config-operator-74547568cd-kw88q\" (UID: \"1ed5fea0-a39d-4416-8391-612dd5149de4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kw88q" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792607 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/595d6e92-80ce-40bf-8409-b50226a672ab-trusted-ca\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: 
\"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792625 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c8c70167-b995-419b-afbe-a58dd78ebe48-etcd-service-ca\") pod \"etcd-operator-b45778765-klt8b\" (UID: \"c8c70167-b995-419b-afbe-a58dd78ebe48\") " pod="openshift-etcd-operator/etcd-operator-b45778765-klt8b" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792644 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltj62\" (UniqueName: \"kubernetes.io/projected/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-kube-api-access-ltj62\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792680 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792711 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4w28\" (UniqueName: \"kubernetes.io/projected/e3d665b0-ab57-47b7-9a58-9c6c150d6105-kube-api-access-w4w28\") pod \"console-f9d7485db-n8lrt\" (UID: \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\") " pod="openshift-console/console-f9d7485db-n8lrt" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792732 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/99d4be11-0ed9-432b-8300-51e96e354634-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-9tcvn\" (UID: \"99d4be11-0ed9-432b-8300-51e96e354634\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tcvn" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792753 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99d4be11-0ed9-432b-8300-51e96e354634-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-9tcvn\" (UID: \"99d4be11-0ed9-432b-8300-51e96e354634\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tcvn" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792771 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3d665b0-ab57-47b7-9a58-9c6c150d6105-trusted-ca-bundle\") pod \"console-f9d7485db-n8lrt\" (UID: \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\") " pod="openshift-console/console-f9d7485db-n8lrt" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792792 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6eab7118-dbb8-4fae-a999-f50fe9cafd3d-proxy-tls\") pod \"machine-config-controller-84d6567774-m47gp\" (UID: \"6eab7118-dbb8-4fae-a999-f50fe9cafd3d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m47gp" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792808 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml7k7\" (UniqueName: 
\"kubernetes.io/projected/9d8b872b-6b5c-4091-8a35-8f2bd3257243-kube-api-access-ml7k7\") pod \"openshift-apiserver-operator-796bbdcf4f-8c2dn\" (UID: \"9d8b872b-6b5c-4091-8a35-8f2bd3257243\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8c2dn" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792836 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk9xf\" (UniqueName: \"kubernetes.io/projected/c8c70167-b995-419b-afbe-a58dd78ebe48-kube-api-access-fk9xf\") pod \"etcd-operator-b45778765-klt8b\" (UID: \"c8c70167-b995-419b-afbe-a58dd78ebe48\") " pod="openshift-etcd-operator/etcd-operator-b45778765-klt8b" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792852 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/595d6e92-80ce-40bf-8409-b50226a672ab-installation-pull-secrets\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792869 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e3d665b0-ab57-47b7-9a58-9c6c150d6105-service-ca\") pod \"console-f9d7485db-n8lrt\" (UID: \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\") " pod="openshift-console/console-f9d7485db-n8lrt" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792897 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-949d8\" (UniqueName: \"kubernetes.io/projected/c88feee0-f87d-4234-a9f7-15bd3e2c8d0f-kube-api-access-949d8\") pod \"multus-admission-controller-857f4d67dd-rkgxg\" (UID: \"c88feee0-f87d-4234-a9f7-15bd3e2c8d0f\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-rkgxg" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792913 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/595d6e92-80ce-40bf-8409-b50226a672ab-ca-trust-extracted\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792931 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sft2k\" (UniqueName: \"kubernetes.io/projected/b07e93ff-e5b0-4dba-b705-3b50c5f977f0-kube-api-access-sft2k\") pod \"service-ca-operator-777779d784-b6gdx\" (UID: \"b07e93ff-e5b0-4dba-b705-3b50c5f977f0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b6gdx" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792945 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792963 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx8nv\" (UniqueName: \"kubernetes.io/projected/9cc05cba-c864-4438-9740-d9be599131d7-kube-api-access-jx8nv\") pod \"cluster-samples-operator-665b6dd947-tqmgf\" (UID: \"9cc05cba-c864-4438-9740-d9be599131d7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tqmgf" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792979 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.792998 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e25dea3-95bd-4316-baef-f9bf7726b8e7-serving-cert\") pod \"openshift-config-operator-7777fb866f-kv4v6\" (UID: \"0e25dea3-95bd-4316-baef-f9bf7726b8e7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kv4v6" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793020 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793076 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/602cffd4-e6ff-442c-80e1-2122748972d1-srv-cert\") pod \"catalog-operator-68c6474976-dmrdt\" (UID: \"602cffd4-e6ff-442c-80e1-2122748972d1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmrdt" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793090 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfr4f\" (UniqueName: \"kubernetes.io/projected/602cffd4-e6ff-442c-80e1-2122748972d1-kube-api-access-sfr4f\") pod 
\"catalog-operator-68c6474976-dmrdt\" (UID: \"602cffd4-e6ff-442c-80e1-2122748972d1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmrdt" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793113 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/26f8967c-81df-4899-b6d5-825e39644e01-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-bqplh\" (UID: \"26f8967c-81df-4899-b6d5-825e39644e01\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bqplh" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793165 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41b4cb00-53d9-4481-9d7c-3368a9d8d9f3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mgssn\" (UID: \"41b4cb00-53d9-4481-9d7c-3368a9d8d9f3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mgssn" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793191 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8c70167-b995-419b-afbe-a58dd78ebe48-serving-cert\") pod \"etcd-operator-b45778765-klt8b\" (UID: \"c8c70167-b995-419b-afbe-a58dd78ebe48\") " pod="openshift-etcd-operator/etcd-operator-b45778765-klt8b" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793209 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793225 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793254 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793269 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs9vj\" (UniqueName: \"kubernetes.io/projected/1ed5fea0-a39d-4416-8391-612dd5149de4-kube-api-access-qs9vj\") pod \"machine-config-operator-74547568cd-kw88q\" (UID: \"1ed5fea0-a39d-4416-8391-612dd5149de4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kw88q" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793283 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c88feee0-f87d-4234-a9f7-15bd3e2c8d0f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-rkgxg\" (UID: \"c88feee0-f87d-4234-a9f7-15bd3e2c8d0f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rkgxg" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793298 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793325 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/595d6e92-80ce-40bf-8409-b50226a672ab-bound-sa-token\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793340 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793354 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-audit-policies\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793371 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8c70167-b995-419b-afbe-a58dd78ebe48-config\") pod \"etcd-operator-b45778765-klt8b\" 
(UID: \"c8c70167-b995-419b-afbe-a58dd78ebe48\") " pod="openshift-etcd-operator/etcd-operator-b45778765-klt8b" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793389 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b7pg\" (UniqueName: \"kubernetes.io/projected/1d2e6b08-794d-4c6e-8dd5-1197ce2175a2-kube-api-access-4b7pg\") pod \"dns-operator-744455d44c-lt4w6\" (UID: \"1d2e6b08-794d-4c6e-8dd5-1197ce2175a2\") " pod="openshift-dns-operator/dns-operator-744455d44c-lt4w6" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793410 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1ed5fea0-a39d-4416-8391-612dd5149de4-images\") pod \"machine-config-operator-74547568cd-kw88q\" (UID: \"1ed5fea0-a39d-4416-8391-612dd5149de4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kw88q" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793431 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6eab7118-dbb8-4fae-a999-f50fe9cafd3d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-m47gp\" (UID: \"6eab7118-dbb8-4fae-a999-f50fe9cafd3d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m47gp" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793449 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b07e93ff-e5b0-4dba-b705-3b50c5f977f0-serving-cert\") pod \"service-ca-operator-777779d784-b6gdx\" (UID: \"b07e93ff-e5b0-4dba-b705-3b50c5f977f0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b6gdx" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793463 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e3d665b0-ab57-47b7-9a58-9c6c150d6105-console-serving-cert\") pod \"console-f9d7485db-n8lrt\" (UID: \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\") " pod="openshift-console/console-f9d7485db-n8lrt" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793479 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c8c70167-b995-419b-afbe-a58dd78ebe48-etcd-ca\") pod \"etcd-operator-b45778765-klt8b\" (UID: \"c8c70167-b995-419b-afbe-a58dd78ebe48\") " pod="openshift-etcd-operator/etcd-operator-b45778765-klt8b" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793515 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/222ec19b-ce6c-4fe1-af9b-b5e297296171-cert\") pod \"ingress-canary-dcmnq\" (UID: \"222ec19b-ce6c-4fe1-af9b-b5e297296171\") " pod="openshift-ingress-canary/ingress-canary-dcmnq" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793529 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99d4be11-0ed9-432b-8300-51e96e354634-config\") pod \"kube-controller-manager-operator-78b949d7b-9tcvn\" (UID: \"99d4be11-0ed9-432b-8300-51e96e354634\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tcvn" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793544 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/295df770-129a-4aca-a189-3f4d77e97fc9-config\") pod \"console-operator-58897d9998-884f6\" (UID: \"295df770-129a-4aca-a189-3f4d77e97fc9\") " pod="openshift-console-operator/console-operator-58897d9998-884f6" 
Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793559 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/595d6e92-80ce-40bf-8409-b50226a672ab-registry-tls\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793573 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmscr\" (UniqueName: \"kubernetes.io/projected/6eab7118-dbb8-4fae-a999-f50fe9cafd3d-kube-api-access-tmscr\") pod \"machine-config-controller-84d6567774-m47gp\" (UID: \"6eab7118-dbb8-4fae-a999-f50fe9cafd3d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m47gp" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793601 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7msl7\" (UniqueName: \"kubernetes.io/projected/222ec19b-ce6c-4fe1-af9b-b5e297296171-kube-api-access-7msl7\") pod \"ingress-canary-dcmnq\" (UID: \"222ec19b-ce6c-4fe1-af9b-b5e297296171\") " pod="openshift-ingress-canary/ingress-canary-dcmnq" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793620 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1d2e6b08-794d-4c6e-8dd5-1197ce2175a2-metrics-tls\") pod \"dns-operator-744455d44c-lt4w6\" (UID: \"1d2e6b08-794d-4c6e-8dd5-1197ce2175a2\") " pod="openshift-dns-operator/dns-operator-744455d44c-lt4w6" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793675 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/595d6e92-80ce-40bf-8409-b50226a672ab-registry-certificates\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793695 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9cc05cba-c864-4438-9740-d9be599131d7-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tqmgf\" (UID: \"9cc05cba-c864-4438-9740-d9be599131d7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tqmgf" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793714 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e3d665b0-ab57-47b7-9a58-9c6c150d6105-oauth-serving-cert\") pod \"console-f9d7485db-n8lrt\" (UID: \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\") " pod="openshift-console/console-f9d7485db-n8lrt" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793733 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41b4cb00-53d9-4481-9d7c-3368a9d8d9f3-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mgssn\" (UID: \"41b4cb00-53d9-4481-9d7c-3368a9d8d9f3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mgssn" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793749 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41b4cb00-53d9-4481-9d7c-3368a9d8d9f3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mgssn\" (UID: \"41b4cb00-53d9-4481-9d7c-3368a9d8d9f3\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mgssn" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793764 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.793779 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lljws\" (UniqueName: \"kubernetes.io/projected/0e25dea3-95bd-4316-baef-f9bf7726b8e7-kube-api-access-lljws\") pod \"openshift-config-operator-7777fb866f-kv4v6\" (UID: \"0e25dea3-95bd-4316-baef-f9bf7726b8e7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kv4v6" Oct 08 21:50:52 crc kubenswrapper[4739]: E1008 21:50:52.794437 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:53.294425406 +0000 UTC m=+153.119811156 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.796614 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v296d" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.822340 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-zrm4p" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.846178 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-76btc"] Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.894264 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:52 crc kubenswrapper[4739]: E1008 21:50:52.894462 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:53.394429938 +0000 UTC m=+153.219815688 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.894509 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.894948 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/6752e659-e823-46c0-bf41-a65c51d20b88-certs\") pod \"machine-config-server-bdt5r\" (UID: \"6752e659-e823-46c0-bf41-a65c51d20b88\") " pod="openshift-machine-config-operator/machine-config-server-bdt5r" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.895011 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sft2k\" (UniqueName: \"kubernetes.io/projected/b07e93ff-e5b0-4dba-b705-3b50c5f977f0-kube-api-access-sft2k\") pod \"service-ca-operator-777779d784-b6gdx\" (UID: \"b07e93ff-e5b0-4dba-b705-3b50c5f977f0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b6gdx" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.895060 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/998da9f7-3775-47d2-a5ab-6848f9ecc779-registration-dir\") pod \"csi-hostpathplugin-bdrcp\" (UID: \"998da9f7-3775-47d2-a5ab-6848f9ecc779\") " pod="hostpath-provisioner/csi-hostpathplugin-bdrcp" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.895088 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx8nv\" (UniqueName: \"kubernetes.io/projected/9cc05cba-c864-4438-9740-d9be599131d7-kube-api-access-jx8nv\") pod \"cluster-samples-operator-665b6dd947-tqmgf\" (UID: \"9cc05cba-c864-4438-9740-d9be599131d7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tqmgf" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.895129 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.895376 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e25dea3-95bd-4316-baef-f9bf7726b8e7-serving-cert\") pod \"openshift-config-operator-7777fb866f-kv4v6\" (UID: \"0e25dea3-95bd-4316-baef-f9bf7726b8e7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kv4v6" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.895446 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 
crc kubenswrapper[4739]: I1008 21:50:52.895477 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dbfbbfca-3095-4a7b-869e-70b1a86046c4-config-volume\") pod \"collect-profiles-29332665-fv8l4\" (UID: \"dbfbbfca-3095-4a7b-869e-70b1a86046c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-fv8l4" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.895500 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/602cffd4-e6ff-442c-80e1-2122748972d1-srv-cert\") pod \"catalog-operator-68c6474976-dmrdt\" (UID: \"602cffd4-e6ff-442c-80e1-2122748972d1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmrdt" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.895525 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfr4f\" (UniqueName: \"kubernetes.io/projected/602cffd4-e6ff-442c-80e1-2122748972d1-kube-api-access-sfr4f\") pod \"catalog-operator-68c6474976-dmrdt\" (UID: \"602cffd4-e6ff-442c-80e1-2122748972d1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmrdt" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.895548 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/26f8967c-81df-4899-b6d5-825e39644e01-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-bqplh\" (UID: \"26f8967c-81df-4899-b6d5-825e39644e01\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bqplh" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.895572 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/998da9f7-3775-47d2-a5ab-6848f9ecc779-plugins-dir\") pod \"csi-hostpathplugin-bdrcp\" (UID: \"998da9f7-3775-47d2-a5ab-6848f9ecc779\") " pod="hostpath-provisioner/csi-hostpathplugin-bdrcp" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.895618 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41b4cb00-53d9-4481-9d7c-3368a9d8d9f3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mgssn\" (UID: \"41b4cb00-53d9-4481-9d7c-3368a9d8d9f3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mgssn" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.895649 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8c70167-b995-419b-afbe-a58dd78ebe48-serving-cert\") pod \"etcd-operator-b45778765-klt8b\" (UID: \"c8c70167-b995-419b-afbe-a58dd78ebe48\") " pod="openshift-etcd-operator/etcd-operator-b45778765-klt8b" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.895674 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abd1c1de-f12b-48d3-9687-54025a7daa56-config\") pod \"machine-api-operator-5694c8668f-8679r\" (UID: \"abd1c1de-f12b-48d3-9687-54025a7daa56\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8679r" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.895696 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.895748 
4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/998da9f7-3775-47d2-a5ab-6848f9ecc779-mountpoint-dir\") pod \"csi-hostpathplugin-bdrcp\" (UID: \"998da9f7-3775-47d2-a5ab-6848f9ecc779\") " pod="hostpath-provisioner/csi-hostpathplugin-bdrcp" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.895779 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fss4\" (UniqueName: \"kubernetes.io/projected/6752e659-e823-46c0-bf41-a65c51d20b88-kube-api-access-7fss4\") pod \"machine-config-server-bdt5r\" (UID: \"6752e659-e823-46c0-bf41-a65c51d20b88\") " pod="openshift-machine-config-operator/machine-config-server-bdt5r" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.895808 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.895832 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/6752e659-e823-46c0-bf41-a65c51d20b88-node-bootstrap-token\") pod \"machine-config-server-bdt5r\" (UID: \"6752e659-e823-46c0-bf41-a65c51d20b88\") " pod="openshift-machine-config-operator/machine-config-server-bdt5r" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.895895 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: 
\"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.895946 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c88feee0-f87d-4234-a9f7-15bd3e2c8d0f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-rkgxg\" (UID: \"c88feee0-f87d-4234-a9f7-15bd3e2c8d0f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rkgxg" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.895967 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs9vj\" (UniqueName: \"kubernetes.io/projected/1ed5fea0-a39d-4416-8391-612dd5149de4-kube-api-access-qs9vj\") pod \"machine-config-operator-74547568cd-kw88q\" (UID: \"1ed5fea0-a39d-4416-8391-612dd5149de4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kw88q" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.895993 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.896018 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/595d6e92-80ce-40bf-8409-b50226a672ab-bound-sa-token\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.896042 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.896069 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-audit-policies\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.896088 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwp94\" (UniqueName: \"kubernetes.io/projected/9bab544b-7e6a-4781-a7ce-73a6a40fe752-kube-api-access-gwp94\") pod \"packageserver-d55dfcdfc-8fsdr\" (UID: \"9bab544b-7e6a-4781-a7ce-73a6a40fe752\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8fsdr" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.896131 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8c70167-b995-419b-afbe-a58dd78ebe48-config\") pod \"etcd-operator-b45778765-klt8b\" (UID: \"c8c70167-b995-419b-afbe-a58dd78ebe48\") " pod="openshift-etcd-operator/etcd-operator-b45778765-klt8b" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.896172 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnkbw\" (UniqueName: \"kubernetes.io/projected/abd1c1de-f12b-48d3-9687-54025a7daa56-kube-api-access-dnkbw\") pod \"machine-api-operator-5694c8668f-8679r\" (UID: \"abd1c1de-f12b-48d3-9687-54025a7daa56\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-8679r" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.896199 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b7pg\" (UniqueName: \"kubernetes.io/projected/1d2e6b08-794d-4c6e-8dd5-1197ce2175a2-kube-api-access-4b7pg\") pod \"dns-operator-744455d44c-lt4w6\" (UID: \"1d2e6b08-794d-4c6e-8dd5-1197ce2175a2\") " pod="openshift-dns-operator/dns-operator-744455d44c-lt4w6" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.896221 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1ed5fea0-a39d-4416-8391-612dd5149de4-images\") pod \"machine-config-operator-74547568cd-kw88q\" (UID: \"1ed5fea0-a39d-4416-8391-612dd5149de4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kw88q" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.896246 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6eab7118-dbb8-4fae-a999-f50fe9cafd3d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-m47gp\" (UID: \"6eab7118-dbb8-4fae-a999-f50fe9cafd3d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m47gp" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.896270 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnnbf\" (UniqueName: \"kubernetes.io/projected/dbfbbfca-3095-4a7b-869e-70b1a86046c4-kube-api-access-pnnbf\") pod \"collect-profiles-29332665-fv8l4\" (UID: \"dbfbbfca-3095-4a7b-869e-70b1a86046c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-fv8l4" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.896338 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b07e93ff-e5b0-4dba-b705-3b50c5f977f0-serving-cert\") pod \"service-ca-operator-777779d784-b6gdx\" (UID: \"b07e93ff-e5b0-4dba-b705-3b50c5f977f0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b6gdx" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.896373 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e3d665b0-ab57-47b7-9a58-9c6c150d6105-console-serving-cert\") pod \"console-f9d7485db-n8lrt\" (UID: \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\") " pod="openshift-console/console-f9d7485db-n8lrt" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.896411 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c8c70167-b995-419b-afbe-a58dd78ebe48-etcd-ca\") pod \"etcd-operator-b45778765-klt8b\" (UID: \"c8c70167-b995-419b-afbe-a58dd78ebe48\") " pod="openshift-etcd-operator/etcd-operator-b45778765-klt8b" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.896439 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/abd1c1de-f12b-48d3-9687-54025a7daa56-images\") pod \"machine-api-operator-5694c8668f-8679r\" (UID: \"abd1c1de-f12b-48d3-9687-54025a7daa56\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8679r" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.896474 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/222ec19b-ce6c-4fe1-af9b-b5e297296171-cert\") pod \"ingress-canary-dcmnq\" (UID: \"222ec19b-ce6c-4fe1-af9b-b5e297296171\") " pod="openshift-ingress-canary/ingress-canary-dcmnq" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.896504 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/99d4be11-0ed9-432b-8300-51e96e354634-config\") pod \"kube-controller-manager-operator-78b949d7b-9tcvn\" (UID: \"99d4be11-0ed9-432b-8300-51e96e354634\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tcvn" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.896531 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/295df770-129a-4aca-a189-3f4d77e97fc9-config\") pod \"console-operator-58897d9998-884f6\" (UID: \"295df770-129a-4aca-a189-3f4d77e97fc9\") " pod="openshift-console-operator/console-operator-58897d9998-884f6" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.896554 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmscr\" (UniqueName: \"kubernetes.io/projected/6eab7118-dbb8-4fae-a999-f50fe9cafd3d-kube-api-access-tmscr\") pod \"machine-config-controller-84d6567774-m47gp\" (UID: \"6eab7118-dbb8-4fae-a999-f50fe9cafd3d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m47gp" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.896584 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/595d6e92-80ce-40bf-8409-b50226a672ab-registry-tls\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.896609 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7msl7\" (UniqueName: \"kubernetes.io/projected/222ec19b-ce6c-4fe1-af9b-b5e297296171-kube-api-access-7msl7\") pod \"ingress-canary-dcmnq\" (UID: \"222ec19b-ce6c-4fe1-af9b-b5e297296171\") " pod="openshift-ingress-canary/ingress-canary-dcmnq" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 
21:50:52.896633 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1d2e6b08-794d-4c6e-8dd5-1197ce2175a2-metrics-tls\") pod \"dns-operator-744455d44c-lt4w6\" (UID: \"1d2e6b08-794d-4c6e-8dd5-1197ce2175a2\") " pod="openshift-dns-operator/dns-operator-744455d44c-lt4w6" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.896661 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9cc05cba-c864-4438-9740-d9be599131d7-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tqmgf\" (UID: \"9cc05cba-c864-4438-9740-d9be599131d7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tqmgf" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.896684 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e3d665b0-ab57-47b7-9a58-9c6c150d6105-oauth-serving-cert\") pod \"console-f9d7485db-n8lrt\" (UID: \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\") " pod="openshift-console/console-f9d7485db-n8lrt" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.896709 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/595d6e92-80ce-40bf-8409-b50226a672ab-registry-certificates\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.896732 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41b4cb00-53d9-4481-9d7c-3368a9d8d9f3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mgssn\" (UID: \"41b4cb00-53d9-4481-9d7c-3368a9d8d9f3\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mgssn" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.896756 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.896775 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lljws\" (UniqueName: \"kubernetes.io/projected/0e25dea3-95bd-4316-baef-f9bf7726b8e7-kube-api-access-lljws\") pod \"openshift-config-operator-7777fb866f-kv4v6\" (UID: \"0e25dea3-95bd-4316-baef-f9bf7726b8e7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kv4v6" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.896803 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41b4cb00-53d9-4481-9d7c-3368a9d8d9f3-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mgssn\" (UID: \"41b4cb00-53d9-4481-9d7c-3368a9d8d9f3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mgssn" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.896828 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9bab544b-7e6a-4781-a7ce-73a6a40fe752-tmpfs\") pod \"packageserver-d55dfcdfc-8fsdr\" (UID: \"9bab544b-7e6a-4781-a7ce-73a6a40fe752\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8fsdr" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.896857 4739 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d8b872b-6b5c-4091-8a35-8f2bd3257243-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8c2dn\" (UID: \"9d8b872b-6b5c-4091-8a35-8f2bd3257243\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8c2dn" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.896878 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e3d665b0-ab57-47b7-9a58-9c6c150d6105-console-oauth-config\") pod \"console-f9d7485db-n8lrt\" (UID: \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\") " pod="openshift-console/console-f9d7485db-n8lrt" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.897053 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/295df770-129a-4aca-a189-3f4d77e97fc9-serving-cert\") pod \"console-operator-58897d9998-884f6\" (UID: \"295df770-129a-4aca-a189-3f4d77e97fc9\") " pod="openshift-console-operator/console-operator-58897d9998-884f6" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.897076 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d8b872b-6b5c-4091-8a35-8f2bd3257243-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8c2dn\" (UID: \"9d8b872b-6b5c-4091-8a35-8f2bd3257243\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8c2dn" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.897103 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0e25dea3-95bd-4316-baef-f9bf7726b8e7-available-featuregates\") pod \"openshift-config-operator-7777fb866f-kv4v6\" (UID: \"0e25dea3-95bd-4316-baef-f9bf7726b8e7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kv4v6" Oct 08 21:50:52 
crc kubenswrapper[4739]: I1008 21:50:52.897128 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27ccq\" (UniqueName: \"kubernetes.io/projected/f6fdf42e-8623-4102-8d38-c22c6c3d6978-kube-api-access-27ccq\") pod \"downloads-7954f5f757-qdvnh\" (UID: \"f6fdf42e-8623-4102-8d38-c22c6c3d6978\") " pod="openshift-console/downloads-7954f5f757-qdvnh" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.897187 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.897214 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c8c70167-b995-419b-afbe-a58dd78ebe48-etcd-client\") pod \"etcd-operator-b45778765-klt8b\" (UID: \"c8c70167-b995-419b-afbe-a58dd78ebe48\") " pod="openshift-etcd-operator/etcd-operator-b45778765-klt8b" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.897237 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/295df770-129a-4aca-a189-3f4d77e97fc9-trusted-ca\") pod \"console-operator-58897d9998-884f6\" (UID: \"295df770-129a-4aca-a189-3f4d77e97fc9\") " pod="openshift-console-operator/console-operator-58897d9998-884f6" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.897261 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vgps\" (UniqueName: \"kubernetes.io/projected/015086a5-5aff-4732-a198-26d0b29d1253-kube-api-access-9vgps\") pod \"migrator-59844c95c7-hmj4f\" (UID: \"015086a5-5aff-4732-a198-26d0b29d1253\") 
" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hmj4f" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.897319 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1ed5fea0-a39d-4416-8391-612dd5149de4-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kw88q\" (UID: \"1ed5fea0-a39d-4416-8391-612dd5149de4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kw88q" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.897355 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b07e93ff-e5b0-4dba-b705-3b50c5f977f0-config\") pod \"service-ca-operator-777779d784-b6gdx\" (UID: \"b07e93ff-e5b0-4dba-b705-3b50c5f977f0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b6gdx" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.897376 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7prc\" (UniqueName: \"kubernetes.io/projected/595d6e92-80ce-40bf-8409-b50226a672ab-kube-api-access-c7prc\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.897521 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.897549 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-audit-dir\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.897573 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/602cffd4-e6ff-442c-80e1-2122748972d1-profile-collector-cert\") pod \"catalog-operator-68c6474976-dmrdt\" (UID: \"602cffd4-e6ff-442c-80e1-2122748972d1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmrdt" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.897594 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6r6c\" (UniqueName: \"kubernetes.io/projected/295df770-129a-4aca-a189-3f4d77e97fc9-kube-api-access-d6r6c\") pod \"console-operator-58897d9998-884f6\" (UID: \"295df770-129a-4aca-a189-3f4d77e97fc9\") " pod="openshift-console-operator/console-operator-58897d9998-884f6" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.897619 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d29np\" (UniqueName: \"kubernetes.io/projected/26f8967c-81df-4899-b6d5-825e39644e01-kube-api-access-d29np\") pod \"package-server-manager-789f6589d5-bqplh\" (UID: \"26f8967c-81df-4899-b6d5-825e39644e01\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bqplh" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.897646 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e3d665b0-ab57-47b7-9a58-9c6c150d6105-console-config\") pod \"console-f9d7485db-n8lrt\" (UID: \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\") " pod="openshift-console/console-f9d7485db-n8lrt" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.897672 
4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ed5fea0-a39d-4416-8391-612dd5149de4-proxy-tls\") pod \"machine-config-operator-74547568cd-kw88q\" (UID: \"1ed5fea0-a39d-4416-8391-612dd5149de4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kw88q" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.897783 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c8c70167-b995-419b-afbe-a58dd78ebe48-etcd-service-ca\") pod \"etcd-operator-b45778765-klt8b\" (UID: \"c8c70167-b995-419b-afbe-a58dd78ebe48\") " pod="openshift-etcd-operator/etcd-operator-b45778765-klt8b" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.897803 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltj62\" (UniqueName: \"kubernetes.io/projected/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-kube-api-access-ltj62\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.897831 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/abd1c1de-f12b-48d3-9687-54025a7daa56-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8679r\" (UID: \"abd1c1de-f12b-48d3-9687-54025a7daa56\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8679r" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.897854 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dbfbbfca-3095-4a7b-869e-70b1a86046c4-secret-volume\") pod \"collect-profiles-29332665-fv8l4\" (UID: 
\"dbfbbfca-3095-4a7b-869e-70b1a86046c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-fv8l4" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.897881 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/595d6e92-80ce-40bf-8409-b50226a672ab-trusted-ca\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.897900 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.897935 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4w28\" (UniqueName: \"kubernetes.io/projected/e3d665b0-ab57-47b7-9a58-9c6c150d6105-kube-api-access-w4w28\") pod \"console-f9d7485db-n8lrt\" (UID: \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\") " pod="openshift-console/console-f9d7485db-n8lrt" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.898033 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/998da9f7-3775-47d2-a5ab-6848f9ecc779-socket-dir\") pod \"csi-hostpathplugin-bdrcp\" (UID: \"998da9f7-3775-47d2-a5ab-6848f9ecc779\") " pod="hostpath-provisioner/csi-hostpathplugin-bdrcp" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.898057 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2chs\" (UniqueName: 
\"kubernetes.io/projected/998da9f7-3775-47d2-a5ab-6848f9ecc779-kube-api-access-k2chs\") pod \"csi-hostpathplugin-bdrcp\" (UID: \"998da9f7-3775-47d2-a5ab-6848f9ecc779\") " pod="hostpath-provisioner/csi-hostpathplugin-bdrcp" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.898116 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/99d4be11-0ed9-432b-8300-51e96e354634-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-9tcvn\" (UID: \"99d4be11-0ed9-432b-8300-51e96e354634\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tcvn" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.898176 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99d4be11-0ed9-432b-8300-51e96e354634-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-9tcvn\" (UID: \"99d4be11-0ed9-432b-8300-51e96e354634\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tcvn" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.898292 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3d665b0-ab57-47b7-9a58-9c6c150d6105-trusted-ca-bundle\") pod \"console-f9d7485db-n8lrt\" (UID: \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\") " pod="openshift-console/console-f9d7485db-n8lrt" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.898321 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9bab544b-7e6a-4781-a7ce-73a6a40fe752-webhook-cert\") pod \"packageserver-d55dfcdfc-8fsdr\" (UID: \"9bab544b-7e6a-4781-a7ce-73a6a40fe752\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8fsdr" Oct 08 21:50:52 crc 
kubenswrapper[4739]: I1008 21:50:52.898388 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6eab7118-dbb8-4fae-a999-f50fe9cafd3d-proxy-tls\") pod \"machine-config-controller-84d6567774-m47gp\" (UID: \"6eab7118-dbb8-4fae-a999-f50fe9cafd3d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m47gp" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.898413 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml7k7\" (UniqueName: \"kubernetes.io/projected/9d8b872b-6b5c-4091-8a35-8f2bd3257243-kube-api-access-ml7k7\") pod \"openshift-apiserver-operator-796bbdcf4f-8c2dn\" (UID: \"9d8b872b-6b5c-4091-8a35-8f2bd3257243\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8c2dn" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.898442 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk9xf\" (UniqueName: \"kubernetes.io/projected/c8c70167-b995-419b-afbe-a58dd78ebe48-kube-api-access-fk9xf\") pod \"etcd-operator-b45778765-klt8b\" (UID: \"c8c70167-b995-419b-afbe-a58dd78ebe48\") " pod="openshift-etcd-operator/etcd-operator-b45778765-klt8b" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.898467 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/595d6e92-80ce-40bf-8409-b50226a672ab-installation-pull-secrets\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.898494 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9bab544b-7e6a-4781-a7ce-73a6a40fe752-apiservice-cert\") pod 
\"packageserver-d55dfcdfc-8fsdr\" (UID: \"9bab544b-7e6a-4781-a7ce-73a6a40fe752\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8fsdr" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.898565 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e3d665b0-ab57-47b7-9a58-9c6c150d6105-service-ca\") pod \"console-f9d7485db-n8lrt\" (UID: \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\") " pod="openshift-console/console-f9d7485db-n8lrt" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.898616 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-949d8\" (UniqueName: \"kubernetes.io/projected/c88feee0-f87d-4234-a9f7-15bd3e2c8d0f-kube-api-access-949d8\") pod \"multus-admission-controller-857f4d67dd-rkgxg\" (UID: \"c88feee0-f87d-4234-a9f7-15bd3e2c8d0f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rkgxg" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.898639 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/998da9f7-3775-47d2-a5ab-6848f9ecc779-csi-data-dir\") pod \"csi-hostpathplugin-bdrcp\" (UID: \"998da9f7-3775-47d2-a5ab-6848f9ecc779\") " pod="hostpath-provisioner/csi-hostpathplugin-bdrcp" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.898677 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/595d6e92-80ce-40bf-8409-b50226a672ab-ca-trust-extracted\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.896129 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rb8dx"] Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.900296 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1ed5fea0-a39d-4416-8391-612dd5149de4-images\") pod \"machine-config-operator-74547568cd-kw88q\" (UID: \"1ed5fea0-a39d-4416-8391-612dd5149de4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kw88q" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.900313 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8c70167-b995-419b-afbe-a58dd78ebe48-config\") pod \"etcd-operator-b45778765-klt8b\" (UID: \"c8c70167-b995-419b-afbe-a58dd78ebe48\") " pod="openshift-etcd-operator/etcd-operator-b45778765-klt8b" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.900976 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6eab7118-dbb8-4fae-a999-f50fe9cafd3d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-m47gp\" (UID: \"6eab7118-dbb8-4fae-a999-f50fe9cafd3d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m47gp" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.901158 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.901454 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c8c70167-b995-419b-afbe-a58dd78ebe48-etcd-ca\") pod 
\"etcd-operator-b45778765-klt8b\" (UID: \"c8c70167-b995-419b-afbe-a58dd78ebe48\") " pod="openshift-etcd-operator/etcd-operator-b45778765-klt8b" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.902286 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.903907 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-audit-policies\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.905908 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.907095 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/295df770-129a-4aca-a189-3f4d77e97fc9-config\") pod \"console-operator-58897d9998-884f6\" (UID: \"295df770-129a-4aca-a189-3f4d77e97fc9\") " pod="openshift-console-operator/console-operator-58897d9998-884f6" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.908610 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4mkhn"] Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.909058 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99d4be11-0ed9-432b-8300-51e96e354634-config\") pod \"kube-controller-manager-operator-78b949d7b-9tcvn\" (UID: \"99d4be11-0ed9-432b-8300-51e96e354634\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tcvn" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.909820 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e3d665b0-ab57-47b7-9a58-9c6c150d6105-console-config\") pod \"console-f9d7485db-n8lrt\" (UID: \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\") " pod="openshift-console/console-f9d7485db-n8lrt" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.913526 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e3d665b0-ab57-47b7-9a58-9c6c150d6105-service-ca\") pod \"console-f9d7485db-n8lrt\" (UID: \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\") " pod="openshift-console/console-f9d7485db-n8lrt" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.913659 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e3d665b0-ab57-47b7-9a58-9c6c150d6105-oauth-serving-cert\") pod \"console-f9d7485db-n8lrt\" (UID: \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\") " pod="openshift-console/console-f9d7485db-n8lrt" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.915288 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1d2e6b08-794d-4c6e-8dd5-1197ce2175a2-metrics-tls\") pod \"dns-operator-744455d44c-lt4w6\" (UID: \"1d2e6b08-794d-4c6e-8dd5-1197ce2175a2\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-lt4w6" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.916882 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/595d6e92-80ce-40bf-8409-b50226a672ab-ca-trust-extracted\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.916886 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c8c70167-b995-419b-afbe-a58dd78ebe48-etcd-service-ca\") pod \"etcd-operator-b45778765-klt8b\" (UID: \"c8c70167-b995-419b-afbe-a58dd78ebe48\") " pod="openshift-etcd-operator/etcd-operator-b45778765-klt8b" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.917323 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1ed5fea0-a39d-4416-8391-612dd5149de4-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kw88q\" (UID: \"1ed5fea0-a39d-4416-8391-612dd5149de4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kw88q" Oct 08 21:50:52 crc kubenswrapper[4739]: E1008 21:50:52.917346 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:53.417326712 +0000 UTC m=+153.242712462 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.917727 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.918047 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/595d6e92-80ce-40bf-8409-b50226a672ab-trusted-ca\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.918102 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/295df770-129a-4aca-a189-3f4d77e97fc9-trusted-ca\") pod \"console-operator-58897d9998-884f6\" (UID: \"295df770-129a-4aca-a189-3f4d77e97fc9\") " pod="openshift-console-operator/console-operator-58897d9998-884f6" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.918319 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/26f8967c-81df-4899-b6d5-825e39644e01-package-server-manager-serving-cert\") pod 
\"package-server-manager-789f6589d5-bqplh\" (UID: \"26f8967c-81df-4899-b6d5-825e39644e01\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bqplh" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.918389 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0e25dea3-95bd-4316-baef-f9bf7726b8e7-available-featuregates\") pod \"openshift-config-operator-7777fb866f-kv4v6\" (UID: \"0e25dea3-95bd-4316-baef-f9bf7726b8e7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kv4v6" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.919057 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b07e93ff-e5b0-4dba-b705-3b50c5f977f0-config\") pod \"service-ca-operator-777779d784-b6gdx\" (UID: \"b07e93ff-e5b0-4dba-b705-3b50c5f977f0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b6gdx" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.920084 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nlxfq"] Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.920615 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-audit-dir\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.920936 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5zgjg" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.920997 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b07e93ff-e5b0-4dba-b705-3b50c5f977f0-serving-cert\") pod \"service-ca-operator-777779d784-b6gdx\" (UID: \"b07e93ff-e5b0-4dba-b705-3b50c5f977f0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b6gdx" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.921924 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3d665b0-ab57-47b7-9a58-9c6c150d6105-trusted-ca-bundle\") pod \"console-f9d7485db-n8lrt\" (UID: \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\") " pod="openshift-console/console-f9d7485db-n8lrt" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.922840 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/595d6e92-80ce-40bf-8409-b50226a672ab-registry-certificates\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.923521 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41b4cb00-53d9-4481-9d7c-3368a9d8d9f3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mgssn\" (UID: \"41b4cb00-53d9-4481-9d7c-3368a9d8d9f3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mgssn" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.923768 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9d8b872b-6b5c-4091-8a35-8f2bd3257243-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8c2dn\" (UID: \"9d8b872b-6b5c-4091-8a35-8f2bd3257243\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8c2dn" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.923891 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e25dea3-95bd-4316-baef-f9bf7726b8e7-serving-cert\") pod \"openshift-config-operator-7777fb866f-kv4v6\" (UID: \"0e25dea3-95bd-4316-baef-f9bf7726b8e7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kv4v6" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.924698 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.925679 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.926480 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/222ec19b-ce6c-4fe1-af9b-b5e297296171-cert\") pod \"ingress-canary-dcmnq\" (UID: \"222ec19b-ce6c-4fe1-af9b-b5e297296171\") " pod="openshift-ingress-canary/ingress-canary-dcmnq" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.927087 4739 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/295df770-129a-4aca-a189-3f4d77e97fc9-serving-cert\") pod \"console-operator-58897d9998-884f6\" (UID: \"295df770-129a-4aca-a189-3f4d77e97fc9\") " pod="openshift-console-operator/console-operator-58897d9998-884f6" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.927478 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c88feee0-f87d-4234-a9f7-15bd3e2c8d0f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-rkgxg\" (UID: \"c88feee0-f87d-4234-a9f7-15bd3e2c8d0f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rkgxg" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.928230 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/602cffd4-e6ff-442c-80e1-2122748972d1-srv-cert\") pod \"catalog-operator-68c6474976-dmrdt\" (UID: \"602cffd4-e6ff-442c-80e1-2122748972d1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmrdt" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.928433 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8c70167-b995-419b-afbe-a58dd78ebe48-serving-cert\") pod \"etcd-operator-b45778765-klt8b\" (UID: \"c8c70167-b995-419b-afbe-a58dd78ebe48\") " pod="openshift-etcd-operator/etcd-operator-b45778765-klt8b" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.928522 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 
21:50:52.928942 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/595d6e92-80ce-40bf-8409-b50226a672ab-registry-tls\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.930342 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.946064 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.950040 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c8c70167-b995-419b-afbe-a58dd78ebe48-etcd-client\") pod \"etcd-operator-b45778765-klt8b\" (UID: \"c8c70167-b995-419b-afbe-a58dd78ebe48\") " pod="openshift-etcd-operator/etcd-operator-b45778765-klt8b" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.950306 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: 
\"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.950677 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sft2k\" (UniqueName: \"kubernetes.io/projected/b07e93ff-e5b0-4dba-b705-3b50c5f977f0-kube-api-access-sft2k\") pod \"service-ca-operator-777779d784-b6gdx\" (UID: \"b07e93ff-e5b0-4dba-b705-3b50c5f977f0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b6gdx" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.951081 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/602cffd4-e6ff-442c-80e1-2122748972d1-profile-collector-cert\") pod \"catalog-operator-68c6474976-dmrdt\" (UID: \"602cffd4-e6ff-442c-80e1-2122748972d1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmrdt" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.951656 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6eab7118-dbb8-4fae-a999-f50fe9cafd3d-proxy-tls\") pod \"machine-config-controller-84d6567774-m47gp\" (UID: \"6eab7118-dbb8-4fae-a999-f50fe9cafd3d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m47gp" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.952240 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-hm4g4"] Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.952477 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ed5fea0-a39d-4416-8391-612dd5149de4-proxy-tls\") pod \"machine-config-operator-74547568cd-kw88q\" (UID: \"1ed5fea0-a39d-4416-8391-612dd5149de4\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kw88q" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.953434 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9pqtj"] Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.954610 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e3d665b0-ab57-47b7-9a58-9c6c150d6105-console-serving-cert\") pod \"console-f9d7485db-n8lrt\" (UID: \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\") " pod="openshift-console/console-f9d7485db-n8lrt" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.955948 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9cc05cba-c864-4438-9740-d9be599131d7-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tqmgf\" (UID: \"9cc05cba-c864-4438-9740-d9be599131d7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tqmgf" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.956068 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.956395 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e3d665b0-ab57-47b7-9a58-9c6c150d6105-console-oauth-config\") pod \"console-f9d7485db-n8lrt\" (UID: \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\") " pod="openshift-console/console-f9d7485db-n8lrt" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 
21:50:52.957948 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/595d6e92-80ce-40bf-8409-b50226a672ab-installation-pull-secrets\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.957961 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41b4cb00-53d9-4481-9d7c-3368a9d8d9f3-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mgssn\" (UID: \"41b4cb00-53d9-4481-9d7c-3368a9d8d9f3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mgssn" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.958651 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99d4be11-0ed9-432b-8300-51e96e354634-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-9tcvn\" (UID: \"99d4be11-0ed9-432b-8300-51e96e354634\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tcvn" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.959896 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d8b872b-6b5c-4091-8a35-8f2bd3257243-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8c2dn\" (UID: \"9d8b872b-6b5c-4091-8a35-8f2bd3257243\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8c2dn" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.964195 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b7pg\" (UniqueName: \"kubernetes.io/projected/1d2e6b08-794d-4c6e-8dd5-1197ce2175a2-kube-api-access-4b7pg\") pod \"dns-operator-744455d44c-lt4w6\" (UID: 
\"1d2e6b08-794d-4c6e-8dd5-1197ce2175a2\") " pod="openshift-dns-operator/dns-operator-744455d44c-lt4w6" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.987178 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs9vj\" (UniqueName: \"kubernetes.io/projected/1ed5fea0-a39d-4416-8391-612dd5149de4-kube-api-access-qs9vj\") pod \"machine-config-operator-74547568cd-kw88q\" (UID: \"1ed5fea0-a39d-4416-8391-612dd5149de4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kw88q" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.995768 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b6gdx" Oct 08 21:50:52 crc kubenswrapper[4739]: W1008 21:50:52.996331 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c03aab2_cb1d_463f_9a2b_2673df3ea8c4.slice/crio-adbc03075c2f44e0792f281b178d5255ea9269a5e87267109d236e28b37bbc79 WatchSource:0}: Error finding container adbc03075c2f44e0792f281b178d5255ea9269a5e87267109d236e28b37bbc79: Status 404 returned error can't find the container with id adbc03075c2f44e0792f281b178d5255ea9269a5e87267109d236e28b37bbc79 Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.999323 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:52 crc kubenswrapper[4739]: E1008 21:50:52.999548 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-08 21:50:53.499521838 +0000 UTC m=+153.324907588 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.999649 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/abd1c1de-f12b-48d3-9687-54025a7daa56-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8679r\" (UID: \"abd1c1de-f12b-48d3-9687-54025a7daa56\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8679r" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.999672 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dbfbbfca-3095-4a7b-869e-70b1a86046c4-secret-volume\") pod \"collect-profiles-29332665-fv8l4\" (UID: \"dbfbbfca-3095-4a7b-869e-70b1a86046c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-fv8l4" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.999697 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/998da9f7-3775-47d2-a5ab-6848f9ecc779-socket-dir\") pod \"csi-hostpathplugin-bdrcp\" (UID: \"998da9f7-3775-47d2-a5ab-6848f9ecc779\") " pod="hostpath-provisioner/csi-hostpathplugin-bdrcp" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.999716 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2chs\" (UniqueName: 
\"kubernetes.io/projected/998da9f7-3775-47d2-a5ab-6848f9ecc779-kube-api-access-k2chs\") pod \"csi-hostpathplugin-bdrcp\" (UID: \"998da9f7-3775-47d2-a5ab-6848f9ecc779\") " pod="hostpath-provisioner/csi-hostpathplugin-bdrcp" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.999754 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9bab544b-7e6a-4781-a7ce-73a6a40fe752-webhook-cert\") pod \"packageserver-d55dfcdfc-8fsdr\" (UID: \"9bab544b-7e6a-4781-a7ce-73a6a40fe752\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8fsdr" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.999785 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9bab544b-7e6a-4781-a7ce-73a6a40fe752-apiservice-cert\") pod \"packageserver-d55dfcdfc-8fsdr\" (UID: \"9bab544b-7e6a-4781-a7ce-73a6a40fe752\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8fsdr" Oct 08 21:50:52 crc kubenswrapper[4739]: I1008 21:50:52.999811 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/998da9f7-3775-47d2-a5ab-6848f9ecc779-csi-data-dir\") pod \"csi-hostpathplugin-bdrcp\" (UID: \"998da9f7-3775-47d2-a5ab-6848f9ecc779\") " pod="hostpath-provisioner/csi-hostpathplugin-bdrcp" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:52.999848 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/6752e659-e823-46c0-bf41-a65c51d20b88-certs\") pod \"machine-config-server-bdt5r\" (UID: \"6752e659-e823-46c0-bf41-a65c51d20b88\") " pod="openshift-machine-config-operator/machine-config-server-bdt5r" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:52.999867 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" 
(UniqueName: \"kubernetes.io/host-path/998da9f7-3775-47d2-a5ab-6848f9ecc779-registration-dir\") pod \"csi-hostpathplugin-bdrcp\" (UID: \"998da9f7-3775-47d2-a5ab-6848f9ecc779\") " pod="hostpath-provisioner/csi-hostpathplugin-bdrcp" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:52.999898 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dbfbbfca-3095-4a7b-869e-70b1a86046c4-config-volume\") pod \"collect-profiles-29332665-fv8l4\" (UID: \"dbfbbfca-3095-4a7b-869e-70b1a86046c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-fv8l4" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.000060 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/998da9f7-3775-47d2-a5ab-6848f9ecc779-plugins-dir\") pod \"csi-hostpathplugin-bdrcp\" (UID: \"998da9f7-3775-47d2-a5ab-6848f9ecc779\") " pod="hostpath-provisioner/csi-hostpathplugin-bdrcp" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.000094 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abd1c1de-f12b-48d3-9687-54025a7daa56-config\") pod \"machine-api-operator-5694c8668f-8679r\" (UID: \"abd1c1de-f12b-48d3-9687-54025a7daa56\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8679r" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.000109 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/998da9f7-3775-47d2-a5ab-6848f9ecc779-mountpoint-dir\") pod \"csi-hostpathplugin-bdrcp\" (UID: \"998da9f7-3775-47d2-a5ab-6848f9ecc779\") " pod="hostpath-provisioner/csi-hostpathplugin-bdrcp" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.000124 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7fss4\" (UniqueName: \"kubernetes.io/projected/6752e659-e823-46c0-bf41-a65c51d20b88-kube-api-access-7fss4\") pod \"machine-config-server-bdt5r\" (UID: \"6752e659-e823-46c0-bf41-a65c51d20b88\") " pod="openshift-machine-config-operator/machine-config-server-bdt5r" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.000956 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/998da9f7-3775-47d2-a5ab-6848f9ecc779-plugins-dir\") pod \"csi-hostpathplugin-bdrcp\" (UID: \"998da9f7-3775-47d2-a5ab-6848f9ecc779\") " pod="hostpath-provisioner/csi-hostpathplugin-bdrcp" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.001018 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/998da9f7-3775-47d2-a5ab-6848f9ecc779-registration-dir\") pod \"csi-hostpathplugin-bdrcp\" (UID: \"998da9f7-3775-47d2-a5ab-6848f9ecc779\") " pod="hostpath-provisioner/csi-hostpathplugin-bdrcp" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.001700 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dbfbbfca-3095-4a7b-869e-70b1a86046c4-config-volume\") pod \"collect-profiles-29332665-fv8l4\" (UID: \"dbfbbfca-3095-4a7b-869e-70b1a86046c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-fv8l4" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.001940 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/998da9f7-3775-47d2-a5ab-6848f9ecc779-mountpoint-dir\") pod \"csi-hostpathplugin-bdrcp\" (UID: \"998da9f7-3775-47d2-a5ab-6848f9ecc779\") " pod="hostpath-provisioner/csi-hostpathplugin-bdrcp" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.002073 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" 
(UniqueName: \"kubernetes.io/host-path/998da9f7-3775-47d2-a5ab-6848f9ecc779-socket-dir\") pod \"csi-hostpathplugin-bdrcp\" (UID: \"998da9f7-3775-47d2-a5ab-6848f9ecc779\") " pod="hostpath-provisioner/csi-hostpathplugin-bdrcp" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.002088 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/6752e659-e823-46c0-bf41-a65c51d20b88-node-bootstrap-token\") pod \"machine-config-server-bdt5r\" (UID: \"6752e659-e823-46c0-bf41-a65c51d20b88\") " pod="openshift-machine-config-operator/machine-config-server-bdt5r" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.002152 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwp94\" (UniqueName: \"kubernetes.io/projected/9bab544b-7e6a-4781-a7ce-73a6a40fe752-kube-api-access-gwp94\") pod \"packageserver-d55dfcdfc-8fsdr\" (UID: \"9bab544b-7e6a-4781-a7ce-73a6a40fe752\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8fsdr" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.002173 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnkbw\" (UniqueName: \"kubernetes.io/projected/abd1c1de-f12b-48d3-9687-54025a7daa56-kube-api-access-dnkbw\") pod \"machine-api-operator-5694c8668f-8679r\" (UID: \"abd1c1de-f12b-48d3-9687-54025a7daa56\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8679r" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.002198 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnnbf\" (UniqueName: \"kubernetes.io/projected/dbfbbfca-3095-4a7b-869e-70b1a86046c4-kube-api-access-pnnbf\") pod \"collect-profiles-29332665-fv8l4\" (UID: \"dbfbbfca-3095-4a7b-869e-70b1a86046c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-fv8l4" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 
21:50:53.002253 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/abd1c1de-f12b-48d3-9687-54025a7daa56-images\") pod \"machine-api-operator-5694c8668f-8679r\" (UID: \"abd1c1de-f12b-48d3-9687-54025a7daa56\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8679r" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.002453 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/998da9f7-3775-47d2-a5ab-6848f9ecc779-csi-data-dir\") pod \"csi-hostpathplugin-bdrcp\" (UID: \"998da9f7-3775-47d2-a5ab-6848f9ecc779\") " pod="hostpath-provisioner/csi-hostpathplugin-bdrcp" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.005458 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abd1c1de-f12b-48d3-9687-54025a7daa56-config\") pod \"machine-api-operator-5694c8668f-8679r\" (UID: \"abd1c1de-f12b-48d3-9687-54025a7daa56\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8679r" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.005969 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/abd1c1de-f12b-48d3-9687-54025a7daa56-images\") pod \"machine-api-operator-5694c8668f-8679r\" (UID: \"abd1c1de-f12b-48d3-9687-54025a7daa56\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8679r" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.006261 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kw88q" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.011901 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dbfbbfca-3095-4a7b-869e-70b1a86046c4-secret-volume\") pod \"collect-profiles-29332665-fv8l4\" (UID: \"dbfbbfca-3095-4a7b-869e-70b1a86046c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-fv8l4" Oct 08 21:50:53 crc kubenswrapper[4739]: W1008 21:50:53.014309 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff6946a6_cb0c_4be1_ba1d_319a64b6eba3.slice/crio-e5e7f332d24e08fd421a579a7f0d4001db82db08f7a3f4151207054d1413816a WatchSource:0}: Error finding container e5e7f332d24e08fd421a579a7f0d4001db82db08f7a3f4151207054d1413816a: Status 404 returned error can't find the container with id e5e7f332d24e08fd421a579a7f0d4001db82db08f7a3f4151207054d1413816a Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.016049 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9bab544b-7e6a-4781-a7ce-73a6a40fe752-tmpfs\") pod \"packageserver-d55dfcdfc-8fsdr\" (UID: \"9bab544b-7e6a-4781-a7ce-73a6a40fe752\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8fsdr" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.016116 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.016418 4739 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9bab544b-7e6a-4781-a7ce-73a6a40fe752-tmpfs\") pod \"packageserver-d55dfcdfc-8fsdr\" (UID: \"9bab544b-7e6a-4781-a7ce-73a6a40fe752\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8fsdr" Oct 08 21:50:53 crc kubenswrapper[4739]: E1008 21:50:53.016492 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:53.516476093 +0000 UTC m=+153.341861843 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.018071 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/abd1c1de-f12b-48d3-9687-54025a7daa56-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8679r\" (UID: \"abd1c1de-f12b-48d3-9687-54025a7daa56\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8679r" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.025173 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/6752e659-e823-46c0-bf41-a65c51d20b88-node-bootstrap-token\") pod \"machine-config-server-bdt5r\" (UID: \"6752e659-e823-46c0-bf41-a65c51d20b88\") " pod="openshift-machine-config-operator/machine-config-server-bdt5r" Oct 08 21:50:53 
crc kubenswrapper[4739]: I1008 21:50:53.026571 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfr4f\" (UniqueName: \"kubernetes.io/projected/602cffd4-e6ff-442c-80e1-2122748972d1-kube-api-access-sfr4f\") pod \"catalog-operator-68c6474976-dmrdt\" (UID: \"602cffd4-e6ff-442c-80e1-2122748972d1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmrdt" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.027329 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx8nv\" (UniqueName: \"kubernetes.io/projected/9cc05cba-c864-4438-9740-d9be599131d7-kube-api-access-jx8nv\") pod \"cluster-samples-operator-665b6dd947-tqmgf\" (UID: \"9cc05cba-c864-4438-9740-d9be599131d7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tqmgf" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.027834 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/6752e659-e823-46c0-bf41-a65c51d20b88-certs\") pod \"machine-config-server-bdt5r\" (UID: \"6752e659-e823-46c0-bf41-a65c51d20b88\") " pod="openshift-machine-config-operator/machine-config-server-bdt5r" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.028435 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9bab544b-7e6a-4781-a7ce-73a6a40fe752-webhook-cert\") pod \"packageserver-d55dfcdfc-8fsdr\" (UID: \"9bab544b-7e6a-4781-a7ce-73a6a40fe752\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8fsdr" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.028855 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9bab544b-7e6a-4781-a7ce-73a6a40fe752-apiservice-cert\") pod \"packageserver-d55dfcdfc-8fsdr\" (UID: \"9bab544b-7e6a-4781-a7ce-73a6a40fe752\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8fsdr" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.044955 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/595d6e92-80ce-40bf-8409-b50226a672ab-bound-sa-token\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.054059 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmrdt" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.074898 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6r6c\" (UniqueName: \"kubernetes.io/projected/295df770-129a-4aca-a189-3f4d77e97fc9-kube-api-access-d6r6c\") pod \"console-operator-58897d9998-884f6\" (UID: \"295df770-129a-4aca-a189-3f4d77e97fc9\") " pod="openshift-console-operator/console-operator-58897d9998-884f6" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.081444 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmscr\" (UniqueName: \"kubernetes.io/projected/6eab7118-dbb8-4fae-a999-f50fe9cafd3d-kube-api-access-tmscr\") pod \"machine-config-controller-84d6567774-m47gp\" (UID: \"6eab7118-dbb8-4fae-a999-f50fe9cafd3d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m47gp" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.108467 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41b4cb00-53d9-4481-9d7c-3368a9d8d9f3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mgssn\" (UID: \"41b4cb00-53d9-4481-9d7c-3368a9d8d9f3\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mgssn" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.117467 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.117513 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-lt4w6" Oct 08 21:50:53 crc kubenswrapper[4739]: E1008 21:50:53.117979 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:53.617956922 +0000 UTC m=+153.443342672 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.135556 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d29np\" (UniqueName: \"kubernetes.io/projected/26f8967c-81df-4899-b6d5-825e39644e01-kube-api-access-d29np\") pod \"package-server-manager-789f6589d5-bqplh\" (UID: \"26f8967c-81df-4899-b6d5-825e39644e01\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bqplh" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.137896 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-884f6" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.149135 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27ccq\" (UniqueName: \"kubernetes.io/projected/f6fdf42e-8623-4102-8d38-c22c6c3d6978-kube-api-access-27ccq\") pod \"downloads-7954f5f757-qdvnh\" (UID: \"f6fdf42e-8623-4102-8d38-c22c6c3d6978\") " pod="openshift-console/downloads-7954f5f757-qdvnh" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.164184 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pc6pd"] Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.172876 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7msl7\" (UniqueName: \"kubernetes.io/projected/222ec19b-ce6c-4fe1-af9b-b5e297296171-kube-api-access-7msl7\") pod \"ingress-canary-dcmnq\" (UID: \"222ec19b-ce6c-4fe1-af9b-b5e297296171\") " pod="openshift-ingress-canary/ingress-canary-dcmnq" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.188059 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-949d8\" (UniqueName: \"kubernetes.io/projected/c88feee0-f87d-4234-a9f7-15bd3e2c8d0f-kube-api-access-949d8\") pod \"multus-admission-controller-857f4d67dd-rkgxg\" (UID: \"c88feee0-f87d-4234-a9f7-15bd3e2c8d0f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rkgxg" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.190941 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mgssn" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.194723 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qpcfx"] Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.198450 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tqmgf" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.210045 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltj62\" (UniqueName: \"kubernetes.io/projected/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-kube-api-access-ltj62\") pod \"oauth-openshift-558db77b4-7fzvv\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.219126 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:53 crc kubenswrapper[4739]: E1008 21:50:53.219532 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:53.719518915 +0000 UTC m=+153.544904665 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.227575 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m47gp" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.241168 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vgps\" (UniqueName: \"kubernetes.io/projected/015086a5-5aff-4732-a198-26d0b29d1253-kube-api-access-9vgps\") pod \"migrator-59844c95c7-hmj4f\" (UID: \"015086a5-5aff-4732-a198-26d0b29d1253\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hmj4f" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.253487 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hmj4f" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.253949 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7lngv"] Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.254515 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4w28\" (UniqueName: \"kubernetes.io/projected/e3d665b0-ab57-47b7-9a58-9c6c150d6105-kube-api-access-w4w28\") pod \"console-f9d7485db-n8lrt\" (UID: \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\") " pod="openshift-console/console-f9d7485db-n8lrt" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.260572 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dcmnq" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.273409 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-rkgxg" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.276714 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/99d4be11-0ed9-432b-8300-51e96e354634-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-9tcvn\" (UID: \"99d4be11-0ed9-432b-8300-51e96e354634\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tcvn" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.293465 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk9xf\" (UniqueName: \"kubernetes.io/projected/c8c70167-b995-419b-afbe-a58dd78ebe48-kube-api-access-fk9xf\") pod \"etcd-operator-b45778765-klt8b\" (UID: \"c8c70167-b995-419b-afbe-a58dd78ebe48\") " pod="openshift-etcd-operator/etcd-operator-b45778765-klt8b" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.298755 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5zgjg"] Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.299077 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75"] Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.309392 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7prc\" (UniqueName: \"kubernetes.io/projected/595d6e92-80ce-40bf-8409-b50226a672ab-kube-api-access-c7prc\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:53 crc 
kubenswrapper[4739]: I1008 21:50:53.314018 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bqplh" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.320008 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:53 crc kubenswrapper[4739]: W1008 21:50:53.326103 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a29832f_5f50_4993_824c_50afa0213fb8.slice/crio-fc19dfe59292a895d889c60edd9886b9ae39296dfc81ef81052d7bd4169b0a72 WatchSource:0}: Error finding container fc19dfe59292a895d889c60edd9886b9ae39296dfc81ef81052d7bd4169b0a72: Status 404 returned error can't find the container with id fc19dfe59292a895d889c60edd9886b9ae39296dfc81ef81052d7bd4169b0a72 Oct 08 21:50:53 crc kubenswrapper[4739]: E1008 21:50:53.327184 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:53.827142577 +0000 UTC m=+153.652528367 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.329730 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hc74g"] Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.333789 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-b6gdx"] Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.339321 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml7k7\" (UniqueName: \"kubernetes.io/projected/9d8b872b-6b5c-4091-8a35-8f2bd3257243-kube-api-access-ml7k7\") pod \"openshift-apiserver-operator-796bbdcf4f-8c2dn\" (UID: \"9d8b872b-6b5c-4091-8a35-8f2bd3257243\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8c2dn" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.350376 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kw88q"] Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.352951 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lljws\" (UniqueName: \"kubernetes.io/projected/0e25dea3-95bd-4316-baef-f9bf7726b8e7-kube-api-access-lljws\") pod \"openshift-config-operator-7777fb866f-kv4v6\" (UID: \"0e25dea3-95bd-4316-baef-f9bf7726b8e7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kv4v6" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.389284 4739 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fss4\" (UniqueName: \"kubernetes.io/projected/6752e659-e823-46c0-bf41-a65c51d20b88-kube-api-access-7fss4\") pod \"machine-config-server-bdt5r\" (UID: \"6752e659-e823-46c0-bf41-a65c51d20b88\") " pod="openshift-machine-config-operator/machine-config-server-bdt5r" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.389635 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8c2dn" Oct 08 21:50:53 crc kubenswrapper[4739]: W1008 21:50:53.402801 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3738927_508b_431d_b0a9_8762d261f7f5.slice/crio-b84e5983e46d3df46a4bfbc0fd1ce7c191e082aca326119c376255778f460093 WatchSource:0}: Error finding container b84e5983e46d3df46a4bfbc0fd1ce7c191e082aca326119c376255778f460093: Status 404 returned error can't find the container with id b84e5983e46d3df46a4bfbc0fd1ce7c191e082aca326119c376255778f460093 Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.403271 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-klt8b" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.423007 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:53 crc kubenswrapper[4739]: E1008 21:50:53.423839 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:53.923819517 +0000 UTC m=+153.749205267 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.425108 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2chs\" (UniqueName: \"kubernetes.io/projected/998da9f7-3775-47d2-a5ab-6848f9ecc779-kube-api-access-k2chs\") pod \"csi-hostpathplugin-bdrcp\" (UID: \"998da9f7-3775-47d2-a5ab-6848f9ecc779\") " pod="hostpath-provisioner/csi-hostpathplugin-bdrcp" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.429076 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.429207 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnkbw\" (UniqueName: \"kubernetes.io/projected/abd1c1de-f12b-48d3-9687-54025a7daa56-kube-api-access-dnkbw\") pod \"machine-api-operator-5694c8668f-8679r\" (UID: \"abd1c1de-f12b-48d3-9687-54025a7daa56\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8679r" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.444659 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qdvnh" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.445401 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwp94\" (UniqueName: \"kubernetes.io/projected/9bab544b-7e6a-4781-a7ce-73a6a40fe752-kube-api-access-gwp94\") pod \"packageserver-d55dfcdfc-8fsdr\" (UID: \"9bab544b-7e6a-4781-a7ce-73a6a40fe752\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8fsdr" Oct 08 21:50:53 crc kubenswrapper[4739]: W1008 21:50:53.449543 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb07e93ff_e5b0_4dba_b705_3b50c5f977f0.slice/crio-5f1e10384b628190a5f272fb7f2e122ab46e637f685807eee64477c3426c270c WatchSource:0}: Error finding container 5f1e10384b628190a5f272fb7f2e122ab46e637f685807eee64477c3426c270c: Status 404 returned error can't find the container with id 5f1e10384b628190a5f272fb7f2e122ab46e637f685807eee64477c3426c270c Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.455357 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kv4v6" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.463543 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmrdt"] Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.464542 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnnbf\" (UniqueName: \"kubernetes.io/projected/dbfbbfca-3095-4a7b-869e-70b1a86046c4-kube-api-access-pnnbf\") pod \"collect-profiles-29332665-fv8l4\" (UID: \"dbfbbfca-3095-4a7b-869e-70b1a86046c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-fv8l4" Oct 08 21:50:53 crc kubenswrapper[4739]: W1008 21:50:53.477471 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac034e3_015b_4ac2_a698_9af0ac978f30.slice/crio-dbcf731200cb78575f50873d975e110915f4b39f40e476a0816cedb262297c5d WatchSource:0}: Error finding container dbcf731200cb78575f50873d975e110915f4b39f40e476a0816cedb262297c5d: Status 404 returned error can't find the container with id dbcf731200cb78575f50873d975e110915f4b39f40e476a0816cedb262297c5d Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.506464 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-n8lrt" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.526486 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:53 crc kubenswrapper[4739]: E1008 21:50:53.526951 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:54.026931177 +0000 UTC m=+153.852316927 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:53 crc kubenswrapper[4739]: W1008 21:50:53.529888 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod602cffd4_e6ff_442c_80e1_2122748972d1.slice/crio-ea1f71afe63ef75eac83bd08fb27eed3b674a0608ebd5ba4ba74c7d180d7ea9e WatchSource:0}: Error finding container ea1f71afe63ef75eac83bd08fb27eed3b674a0608ebd5ba4ba74c7d180d7ea9e: Status 404 returned error can't find the container with id ea1f71afe63ef75eac83bd08fb27eed3b674a0608ebd5ba4ba74c7d180d7ea9e Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.534578 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tcvn" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.581406 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-lt4w6"] Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.589087 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nlxfq" event={"ID":"e2155710-7d43-4b2b-b619-cedb531ea612","Type":"ContainerStarted","Data":"c21d8082c5c087fa4e8aaeda3c47492d2daa18b8d5bf35712b0c41bc4f80ab5e"} Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.589159 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nlxfq" event={"ID":"e2155710-7d43-4b2b-b619-cedb531ea612","Type":"ContainerStarted","Data":"a3ffc7da90052d8aa229c3ed9eea494fb93026627367c3e2b375ca088e16275a"} Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.597469 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5zgjg" event={"ID":"c3738927-508b-431d-b0a9-8762d261f7f5","Type":"ContainerStarted","Data":"b84e5983e46d3df46a4bfbc0fd1ce7c191e082aca326119c376255778f460093"} Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.598647 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kw88q" event={"ID":"1ed5fea0-a39d-4416-8391-612dd5149de4","Type":"ContainerStarted","Data":"fd9914ada07905128b09b536fcb365ddc1b8246407a8541dc63d6bff6388735d"} Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.599564 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rb8dx" 
event={"ID":"a0840280-c534-4e58-9095-a87e9acb799a","Type":"ContainerStarted","Data":"549986f4f7d5e6ec575d9029738bafdf46c6240142a7b583616be53a626a215e"} Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.599595 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rb8dx" event={"ID":"a0840280-c534-4e58-9095-a87e9acb799a","Type":"ContainerStarted","Data":"0eaf83b6f596a22cdfd71b45d86a5e45e2da4a6868d94cddeec46b6b2222a39d"} Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.600661 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4mkhn" event={"ID":"ff6946a6-cb0c-4be1-ba1d-319a64b6eba3","Type":"ContainerStarted","Data":"642571a942e9859b12eba6af8ad907ca55ed666dd31362d02d4d243cac8c43de"} Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.600697 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4mkhn" event={"ID":"ff6946a6-cb0c-4be1-ba1d-319a64b6eba3","Type":"ContainerStarted","Data":"e5e7f332d24e08fd421a579a7f0d4001db82db08f7a3f4151207054d1413816a"} Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.601936 4739 generic.go:334] "Generic (PLEG): container finished" podID="ce03f742-2910-4d6c-af9b-97abf28c6fbc" containerID="4957185e152e4a2c5176b295c1081af4fa5b5a6756eca2fbeec4fa563361dd22" exitCode=0 Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.602536 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" event={"ID":"ce03f742-2910-4d6c-af9b-97abf28c6fbc","Type":"ContainerDied","Data":"4957185e152e4a2c5176b295c1081af4fa5b5a6756eca2fbeec4fa563361dd22"} Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.602631 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" 
event={"ID":"ce03f742-2910-4d6c-af9b-97abf28c6fbc","Type":"ContainerStarted","Data":"a660e2120b29900c036a9e13c2039d7107f4e7f30b567ddb3173227dd13a3a73"} Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.606519 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dq28h" event={"ID":"bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05","Type":"ContainerStarted","Data":"5f0da66abcc67138ba3c7c34f75b8cd8cf8ea2170040f3f5d1607cc06db8c22a"} Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.606559 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dq28h" event={"ID":"bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05","Type":"ContainerStarted","Data":"c0d5e19729c154f130360d4cbb454eed8134e0db9e0803b11afa08ac5670f2ec"} Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.610377 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-dq28h" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.612561 4739 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-dq28h container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.612620 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-dq28h" podUID="bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.618390 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mgssn"] 
Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.627329 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-zrm4p" event={"ID":"44d375d6-9c45-466a-a423-b87b33bc63dd","Type":"ContainerStarted","Data":"2fe2e352ba3ff9b5ddda68eaf9519897d01a6400d1d0c1d5c57e0ab3e3b1703a"} Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.627379 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-zrm4p" event={"ID":"44d375d6-9c45-466a-a423-b87b33bc63dd","Type":"ContainerStarted","Data":"584fd099a33943dcbe7c859b9134563264c71658e9bad8ca07f857779c2e99b5"} Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.627500 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:53 crc kubenswrapper[4739]: E1008 21:50:53.627789 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:54.127777661 +0000 UTC m=+153.953163411 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.632632 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-76btc" event={"ID":"0c03aab2-cb1d-463f-9a2b-2673df3ea8c4","Type":"ContainerStarted","Data":"adbc03075c2f44e0792f281b178d5255ea9269a5e87267109d236e28b37bbc79"} Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.636178 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" event={"ID":"d5a65c5e-2929-4d8d-bebb-67af1702dbd4","Type":"ContainerStarted","Data":"ca0ffd42a74e26d0af7ab9520bda7dd6ef7d86715d37372e21702c1accfc43c9"} Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.638301 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmrdt" event={"ID":"602cffd4-e6ff-442c-80e1-2122748972d1","Type":"ContainerStarted","Data":"ea1f71afe63ef75eac83bd08fb27eed3b674a0608ebd5ba4ba74c7d180d7ea9e"} Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.642599 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qpcfx" event={"ID":"34b38b7a-4e93-49f1-907e-24fc371f31e3","Type":"ContainerStarted","Data":"a9b65154ff43aee50ad40a1d63adf2dd0af8e79c608e28337502eae70d346afc"} Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.646158 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hc74g" event={"ID":"fac034e3-015b-4ac2-a698-9af0ac978f30","Type":"ContainerStarted","Data":"dbcf731200cb78575f50873d975e110915f4b39f40e476a0816cedb262297c5d"} Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.656840 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pc6pd" event={"ID":"5205eeb0-f3c3-460a-8512-0af719bbf18a","Type":"ContainerStarted","Data":"7577889e1696f223f2e8aa2e1a9ea9b4a8eeb6bd05977b5759389e5d916dc569"} Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.663015 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v296d" event={"ID":"a454f893-33d3-4b64-af77-016e2bef05f2","Type":"ContainerStarted","Data":"e17b903e38265342c357210d7fdd411faacb105e9c159144848da664b06b05f3"} Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.663068 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v296d" event={"ID":"a454f893-33d3-4b64-af77-016e2bef05f2","Type":"ContainerStarted","Data":"76636d6cf4a7d81150bc28b8fb1cd424deae16b4be9a9d348cd4e2a90ec53588"} Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.665966 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b6gdx" event={"ID":"b07e93ff-e5b0-4dba-b705-3b50c5f977f0","Type":"ContainerStarted","Data":"5f1e10384b628190a5f272fb7f2e122ab46e637f685807eee64477c3426c270c"} Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.667384 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-8679r" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.669761 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-hm4g4" event={"ID":"ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc","Type":"ContainerStarted","Data":"f7df0bd3497dce5a409594a0cc2530df1a46c3b0c9acf84990154af994f1ef44"} Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.669823 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-hm4g4" event={"ID":"ccd3217c-fe1b-4b65-9f5f-bc1e717c06fc","Type":"ContainerStarted","Data":"e0804afcb81be336631648f21468d7ad37924107d553bf025dbd7ba050ccc7e2"} Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.671677 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8fsdr" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.680729 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-fv8l4" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.680989 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8smnx" event={"ID":"a2b52465-5f11-4296-87b3-9254f036358f","Type":"ContainerStarted","Data":"85f487ef36ac6de67dc94a7fa50b74c42a000a4d6e318c4d233116d7c7c795b5"} Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.681028 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8smnx" event={"ID":"a2b52465-5f11-4296-87b3-9254f036358f","Type":"ContainerStarted","Data":"5244171ab73ceca288d5da2f86389cf9bac32400a26ced68b725acc69ec01129"} Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.681401 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8smnx" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.684348 4739 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-8smnx container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.684405 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8smnx" podUID="a2b52465-5f11-4296-87b3-9254f036358f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.694483 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7lngv" 
event={"ID":"5a29832f-5f50-4993-824c-50afa0213fb8","Type":"ContainerStarted","Data":"fc19dfe59292a895d889c60edd9886b9ae39296dfc81ef81052d7bd4169b0a72"} Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.694507 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-bdt5r" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.700854 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9pqtj" event={"ID":"7a3ae274-744f-43f4-a579-a11a36b3eee7","Type":"ContainerStarted","Data":"057184b7fbad14b71b51c7ccf85176c316968c656df90515ad84f4c457cbfac4"} Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.700912 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9pqtj" event={"ID":"7a3ae274-744f-43f4-a579-a11a36b3eee7","Type":"ContainerStarted","Data":"778182be57e5a0ff2441fb7741575fb2c4fd19c54b87921008ab653eb59fb979"} Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.710918 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-bdrcp" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.728853 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:53 crc kubenswrapper[4739]: E1008 21:50:53.730335 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-08 21:50:54.230090871 +0000 UTC m=+154.055476801 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.830631 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.832768 4739 patch_prober.go:28] interesting pod/router-default-5444994796-zrm4p container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.832916 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zrm4p" podUID="44d375d6-9c45-466a-a423-b87b33bc63dd" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Oct 08 21:50:53 crc kubenswrapper[4739]: E1008 21:50:53.832833 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-08 21:50:54.332818343 +0000 UTC m=+154.158204093 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.846776 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-zrm4p" Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.922992 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tqmgf"] Oct 08 21:50:53 crc kubenswrapper[4739]: I1008 21:50:53.934423 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:53 crc kubenswrapper[4739]: E1008 21:50:53.935065 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:54.435046871 +0000 UTC m=+154.260432621 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:54 crc kubenswrapper[4739]: I1008 21:50:54.040176 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:54 crc kubenswrapper[4739]: E1008 21:50:54.041333 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:54.541316871 +0000 UTC m=+154.366702621 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:54 crc kubenswrapper[4739]: I1008 21:50:54.079358 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-hmj4f"] Oct 08 21:50:54 crc kubenswrapper[4739]: I1008 21:50:54.143245 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:54 crc kubenswrapper[4739]: E1008 21:50:54.143637 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:54.64358557 +0000 UTC m=+154.468971320 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:54 crc kubenswrapper[4739]: I1008 21:50:54.143789 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:54 crc kubenswrapper[4739]: E1008 21:50:54.144058 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:54.644045861 +0000 UTC m=+154.469431611 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:54 crc kubenswrapper[4739]: I1008 21:50:54.164864 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dcmnq"] Oct 08 21:50:54 crc kubenswrapper[4739]: I1008 21:50:54.172929 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bqplh"] Oct 08 21:50:54 crc kubenswrapper[4739]: I1008 21:50:54.235915 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-m47gp"] Oct 08 21:50:54 crc kubenswrapper[4739]: I1008 21:50:54.245865 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:54 crc kubenswrapper[4739]: E1008 21:50:54.246167 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:54.746134216 +0000 UTC m=+154.571519956 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:54 crc kubenswrapper[4739]: I1008 21:50:54.258524 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-rkgxg"] Oct 08 21:50:54 crc kubenswrapper[4739]: I1008 21:50:54.260088 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-884f6"] Oct 08 21:50:54 crc kubenswrapper[4739]: W1008 21:50:54.344039 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26f8967c_81df_4899_b6d5_825e39644e01.slice/crio-e0c838c2d338af7cf698b068c21f0bab5472d3d41e70cc875d22775c5650bd43 WatchSource:0}: Error finding container e0c838c2d338af7cf698b068c21f0bab5472d3d41e70cc875d22775c5650bd43: Status 404 returned error can't find the container with id e0c838c2d338af7cf698b068c21f0bab5472d3d41e70cc875d22775c5650bd43 Oct 08 21:50:54 crc kubenswrapper[4739]: I1008 21:50:54.347168 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:54 crc kubenswrapper[4739]: E1008 21:50:54.347506 4739 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:54.847488932 +0000 UTC m=+154.672874682 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:54 crc kubenswrapper[4739]: I1008 21:50:54.449125 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:54 crc kubenswrapper[4739]: E1008 21:50:54.449707 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:54.94968868 +0000 UTC m=+154.775074430 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:54 crc kubenswrapper[4739]: I1008 21:50:54.552117 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:54 crc kubenswrapper[4739]: E1008 21:50:54.552437 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:55.052422532 +0000 UTC m=+154.877808282 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:54 crc kubenswrapper[4739]: I1008 21:50:54.567375 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-kv4v6"] Oct 08 21:50:54 crc kubenswrapper[4739]: I1008 21:50:54.657416 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:54 crc kubenswrapper[4739]: E1008 21:50:54.657876 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:55.157834389 +0000 UTC m=+154.983220139 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:54 crc kubenswrapper[4739]: W1008 21:50:54.726852 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e25dea3_95bd_4316_baef_f9bf7726b8e7.slice/crio-8bcf6d1c5ae85571e80ce7471aadada724a97f9823ee98ecbf8a826e7ce23e83 WatchSource:0}: Error finding container 8bcf6d1c5ae85571e80ce7471aadada724a97f9823ee98ecbf8a826e7ce23e83: Status 404 returned error can't find the container with id 8bcf6d1c5ae85571e80ce7471aadada724a97f9823ee98ecbf8a826e7ce23e83 Oct 08 21:50:54 crc kubenswrapper[4739]: I1008 21:50:54.776457 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:54 crc kubenswrapper[4739]: E1008 21:50:54.788092 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:55.288078079 +0000 UTC m=+155.113463829 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:54 crc kubenswrapper[4739]: I1008 21:50:54.808527 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qpcfx" event={"ID":"34b38b7a-4e93-49f1-907e-24fc371f31e3","Type":"ContainerStarted","Data":"f814ffedd05fab2b0a3f0c07cf59108608700a3fabc88a1dd68f3f7e1d644c6c"} Oct 08 21:50:54 crc kubenswrapper[4739]: I1008 21:50:54.809118 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qpcfx" Oct 08 21:50:54 crc kubenswrapper[4739]: I1008 21:50:54.819195 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dcmnq" event={"ID":"222ec19b-ce6c-4fe1-af9b-b5e297296171","Type":"ContainerStarted","Data":"85c6b32d1ce99da30e42213baebb1985ea0e20d1a61c8e5bec925c7b8c78c042"} Oct 08 21:50:54 crc kubenswrapper[4739]: I1008 21:50:54.849215 4739 patch_prober.go:28] interesting pod/router-default-5444994796-zrm4p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 21:50:54 crc kubenswrapper[4739]: [-]has-synced failed: reason withheld Oct 08 21:50:54 crc kubenswrapper[4739]: [+]process-running ok Oct 08 21:50:54 crc kubenswrapper[4739]: healthz check failed Oct 08 21:50:54 crc kubenswrapper[4739]: I1008 21:50:54.849262 4739 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-zrm4p" podUID="44d375d6-9c45-466a-a423-b87b33bc63dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 21:50:54 crc kubenswrapper[4739]: I1008 21:50:54.849465 4739 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qpcfx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Oct 08 21:50:54 crc kubenswrapper[4739]: I1008 21:50:54.849480 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qpcfx" podUID="34b38b7a-4e93-49f1-907e-24fc371f31e3" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" Oct 08 21:50:54 crc kubenswrapper[4739]: I1008 21:50:54.885255 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:54 crc kubenswrapper[4739]: E1008 21:50:54.886629 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:55.386610574 +0000 UTC m=+155.211996324 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:54 crc kubenswrapper[4739]: I1008 21:50:54.968980 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-76btc" event={"ID":"0c03aab2-cb1d-463f-9a2b-2673df3ea8c4","Type":"ContainerStarted","Data":"61a4e28ae14a1c5383e31283183c6f46b97667e6680f85e5d7faa1c13594ee6e"} Oct 08 21:50:54 crc kubenswrapper[4739]: I1008 21:50:54.986475 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:54 crc kubenswrapper[4739]: E1008 21:50:54.986763 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:55.486750561 +0000 UTC m=+155.312136311 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:54 crc kubenswrapper[4739]: I1008 21:50:54.990040 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m47gp" event={"ID":"6eab7118-dbb8-4fae-a999-f50fe9cafd3d","Type":"ContainerStarted","Data":"87a7b007bc2436ac1a05079ba28d072cf7de60914f702cacdb7a6a2a4ed7e5dd"} Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.003860 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8c2dn"] Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.007848 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mgssn" event={"ID":"41b4cb00-53d9-4481-9d7c-3368a9d8d9f3","Type":"ContainerStarted","Data":"8dab44a0cf6c707e1947f07e19a405e0021ed1ace60a9f7f7fd1ac0c501e74b7"} Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.056577 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pc6pd" event={"ID":"5205eeb0-f3c3-460a-8512-0af719bbf18a","Type":"ContainerStarted","Data":"4ce9792906ed435106996f738bca664dc2306318bc7d6a86df0bced909502ce3"} Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.067730 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hmj4f" 
event={"ID":"015086a5-5aff-4732-a198-26d0b29d1253","Type":"ContainerStarted","Data":"f8b205418078d2c39f69c18c3eaf2c3c83ac243af4fb3fc382a49ff895cd5ef7"} Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.087128 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.087976 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b6gdx" event={"ID":"b07e93ff-e5b0-4dba-b705-3b50c5f977f0","Type":"ContainerStarted","Data":"11a865d3ae28ec9da398a9f6cdbc63aae9427179dbf331a79a99e15858dee875"} Oct 08 21:50:55 crc kubenswrapper[4739]: E1008 21:50:55.088292 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:55.588276901 +0000 UTC m=+155.413662651 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.088789 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-dq28h" podStartSLOduration=128.088780655 podStartE2EDuration="2m8.088780655s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:55.042133917 +0000 UTC m=+154.867519667" watchObservedRunningTime="2025-10-08 21:50:55.088780655 +0000 UTC m=+154.914166405" Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.088977 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-hm4g4" podStartSLOduration=129.088974019 podStartE2EDuration="2m9.088974019s" podCreationTimestamp="2025-10-08 21:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:55.088370814 +0000 UTC m=+154.913756564" watchObservedRunningTime="2025-10-08 21:50:55.088974019 +0000 UTC m=+154.914359769" Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.098253 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7lngv" event={"ID":"5a29832f-5f50-4993-824c-50afa0213fb8","Type":"ContainerStarted","Data":"5faa0a717d667288182f45e93f23d611c1b434817ca43063d03667dd473a6e19"} Oct 08 21:50:55 crc 
kubenswrapper[4739]: I1008 21:50:55.106171 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tcvn"] Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.111692 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kw88q" event={"ID":"1ed5fea0-a39d-4416-8391-612dd5149de4","Type":"ContainerStarted","Data":"ff62e0848193810fd10e50cceb426b0ab51c2ada475a073d7b7c2b0d5a708b79"} Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.112783 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9pqtj" podStartSLOduration=128.112766425 podStartE2EDuration="2m8.112766425s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:55.111936134 +0000 UTC m=+154.937321874" watchObservedRunningTime="2025-10-08 21:50:55.112766425 +0000 UTC m=+154.938152175" Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.130012 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-klt8b"] Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.136184 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-lt4w6" event={"ID":"1d2e6b08-794d-4c6e-8dd5-1197ce2175a2","Type":"ContainerStarted","Data":"e99228e9de53f9333dec515b670263952319d96c03ceb48742c83c3787b9d79e"} Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.138540 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-n8lrt"] Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.172348 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ingress/router-default-5444994796-zrm4p" podStartSLOduration=128.172331055 podStartE2EDuration="2m8.172331055s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:55.171004752 +0000 UTC m=+154.996390502" watchObservedRunningTime="2025-10-08 21:50:55.172331055 +0000 UTC m=+154.997716805" Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.223828 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:55 crc kubenswrapper[4739]: E1008 21:50:55.224100 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:55.72408458 +0000 UTC m=+155.549470330 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.227320 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8679r"] Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.236424 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qdvnh"] Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.271421 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332665-fv8l4"] Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.286792 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8fsdr"] Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.295406 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-qpcfx" podStartSLOduration=128.295363234 podStartE2EDuration="2m8.295363234s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:55.262752778 +0000 UTC m=+155.088138528" watchObservedRunningTime="2025-10-08 21:50:55.295363234 +0000 UTC m=+155.120748994" Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.329210 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tqmgf" 
event={"ID":"9cc05cba-c864-4438-9740-d9be599131d7","Type":"ContainerStarted","Data":"c27fbe1f54bad8064145cf34da4cd5e4e7a95d6700ed55d5454feac2a06b5b89"} Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.332454 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7fzvv"] Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.337081 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:55 crc kubenswrapper[4739]: E1008 21:50:55.339067 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:55.839046197 +0000 UTC m=+155.664431947 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:55 crc kubenswrapper[4739]: W1008 21:50:55.341634 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6fdf42e_8623_4102_8d38_c22c6c3d6978.slice/crio-55c8e7e1e7b63cacbe4fe63f2fae768f9bd9a7c5587fbc2caaac5f779e8be48a WatchSource:0}: Error finding container 55c8e7e1e7b63cacbe4fe63f2fae768f9bd9a7c5587fbc2caaac5f779e8be48a: Status 404 returned error can't find the container with id 55c8e7e1e7b63cacbe4fe63f2fae768f9bd9a7c5587fbc2caaac5f779e8be48a Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.354311 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nlxfq" podStartSLOduration=128.354290349 podStartE2EDuration="2m8.354290349s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:55.329641662 +0000 UTC m=+155.155027412" watchObservedRunningTime="2025-10-08 21:50:55.354290349 +0000 UTC m=+155.179676099" Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.389486 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v296d" event={"ID":"a454f893-33d3-4b64-af77-016e2bef05f2","Type":"ContainerStarted","Data":"85a437e9fde7d6961bacd00a2f5200aba905e65b16e729b295c8081736b77989"} Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 
21:50:55.399138 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-884f6" event={"ID":"295df770-129a-4aca-a189-3f4d77e97fc9","Type":"ContainerStarted","Data":"a47e62f78a8ecff55bd53a38c98ebbaa0789c8b62a57a4f9164af8a182b5452a"} Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.419661 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bqplh" event={"ID":"26f8967c-81df-4899-b6d5-825e39644e01","Type":"ContainerStarted","Data":"e0c838c2d338af7cf698b068c21f0bab5472d3d41e70cc875d22775c5650bd43"} Oct 08 21:50:55 crc kubenswrapper[4739]: W1008 21:50:55.419809 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbfbbfca_3095_4a7b_869e_70b1a86046c4.slice/crio-a429c2275757cf6d07836ae24590fa69492c483420634fdfc6515b2cc7ef64d8 WatchSource:0}: Error finding container a429c2275757cf6d07836ae24590fa69492c483420634fdfc6515b2cc7ef64d8: Status 404 returned error can't find the container with id a429c2275757cf6d07836ae24590fa69492c483420634fdfc6515b2cc7ef64d8 Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.428722 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8smnx" podStartSLOduration=128.428703881 podStartE2EDuration="2m8.428703881s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:55.420791534 +0000 UTC m=+155.246177284" watchObservedRunningTime="2025-10-08 21:50:55.428703881 +0000 UTC m=+155.254089631" Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.429029 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bdrcp"] Oct 08 21:50:55 crc 
kubenswrapper[4739]: I1008 21:50:55.438953 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:55 crc kubenswrapper[4739]: E1008 21:50:55.439284 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:55.939272716 +0000 UTC m=+155.764658466 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.452115 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-rkgxg" event={"ID":"c88feee0-f87d-4234-a9f7-15bd3e2c8d0f","Type":"ContainerStarted","Data":"d3e087c27a0ddc4215ec8be5fa718cfccd2df18bec041d3b0ece33e7dee78326"} Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.496180 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hc74g" event={"ID":"fac034e3-015b-4ac2-a698-9af0ac978f30","Type":"ContainerStarted","Data":"28196b7f3c82a158331984ed63b73af29602c28a64c9e9d77edcf695c40c58d8"} Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.498832 4739 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hc74g" Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.510893 4739 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-hc74g container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.510969 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hc74g" podUID="fac034e3-015b-4ac2-a698-9af0ac978f30" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.512852 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rb8dx" podStartSLOduration=128.512835907 podStartE2EDuration="2m8.512835907s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:55.461461791 +0000 UTC m=+155.286847541" watchObservedRunningTime="2025-10-08 21:50:55.512835907 +0000 UTC m=+155.338221657" Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.539456 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:55 crc kubenswrapper[4739]: E1008 21:50:55.539678 4739 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:56.039653098 +0000 UTC m=+155.865038848 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.539817 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:55 crc kubenswrapper[4739]: E1008 21:50:55.540458 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:56.040445377 +0000 UTC m=+155.865831127 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.550470 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-bdt5r" event={"ID":"6752e659-e823-46c0-bf41-a65c51d20b88","Type":"ContainerStarted","Data":"006f5141877ec834aa967a05013238e9ab44592841abcd6081a9952beadc9d98"} Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.562116 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8smnx" Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.562189 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-dq28h" Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.605792 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v296d" podStartSLOduration=129.605751922 podStartE2EDuration="2m9.605751922s" podCreationTimestamp="2025-10-08 21:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:55.595164587 +0000 UTC m=+155.420550347" watchObservedRunningTime="2025-10-08 21:50:55.605751922 +0000 UTC m=+155.431137662" Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.629076 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-b6gdx" podStartSLOduration=128.629058026 podStartE2EDuration="2m8.629058026s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:55.628371638 +0000 UTC m=+155.453757388" watchObservedRunningTime="2025-10-08 21:50:55.629058026 +0000 UTC m=+155.454443776" Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.645824 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:55 crc kubenswrapper[4739]: E1008 21:50:55.647639 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:56.14762034 +0000 UTC m=+155.973006080 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.663336 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-7lngv" podStartSLOduration=128.663313172 podStartE2EDuration="2m8.663313172s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:55.660546473 +0000 UTC m=+155.485932223" watchObservedRunningTime="2025-10-08 21:50:55.663313172 +0000 UTC m=+155.488698922" Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.754651 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.756323 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4mkhn" podStartSLOduration=128.756306609 podStartE2EDuration="2m8.756306609s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:55.753735496 +0000 UTC m=+155.579121246" 
watchObservedRunningTime="2025-10-08 21:50:55.756306609 +0000 UTC m=+155.581692359" Oct 08 21:50:55 crc kubenswrapper[4739]: E1008 21:50:55.757597 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:56.257585371 +0000 UTC m=+156.082971121 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.827637 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-76btc" podStartSLOduration=128.827623834 podStartE2EDuration="2m8.827623834s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:55.826521077 +0000 UTC m=+155.651906827" watchObservedRunningTime="2025-10-08 21:50:55.827623834 +0000 UTC m=+155.653009584" Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.858648 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:55 crc kubenswrapper[4739]: E1008 21:50:55.859006 4739 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:56.35898916 +0000 UTC m=+156.184374910 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.885634 4739 patch_prober.go:28] interesting pod/router-default-5444994796-zrm4p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 21:50:55 crc kubenswrapper[4739]: [-]has-synced failed: reason withheld Oct 08 21:50:55 crc kubenswrapper[4739]: [+]process-running ok Oct 08 21:50:55 crc kubenswrapper[4739]: healthz check failed Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.886242 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zrm4p" podUID="44d375d6-9c45-466a-a423-b87b33bc63dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.904336 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hc74g" podStartSLOduration=128.904306263 podStartE2EDuration="2m8.904306263s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:55.866995649 +0000 UTC m=+155.692381399" watchObservedRunningTime="2025-10-08 21:50:55.904306263 +0000 UTC m=+155.729692013" Oct 08 21:50:55 crc kubenswrapper[4739]: I1008 21:50:55.960554 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:55 crc kubenswrapper[4739]: E1008 21:50:55.960958 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:56.460939801 +0000 UTC m=+156.286325551 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:56 crc kubenswrapper[4739]: I1008 21:50:56.068516 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:56 crc kubenswrapper[4739]: E1008 21:50:56.068848 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:56.568832721 +0000 UTC m=+156.394218471 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:56 crc kubenswrapper[4739]: I1008 21:50:56.170233 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:56 crc kubenswrapper[4739]: E1008 21:50:56.170655 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:56.670639298 +0000 UTC m=+156.496025038 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:56 crc kubenswrapper[4739]: I1008 21:50:56.272379 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:56 crc kubenswrapper[4739]: E1008 21:50:56.272661 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:56.772646831 +0000 UTC m=+156.598032581 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:56 crc kubenswrapper[4739]: I1008 21:50:56.373234 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:56 crc kubenswrapper[4739]: E1008 21:50:56.373713 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:56.87370141 +0000 UTC m=+156.699087160 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:56 crc kubenswrapper[4739]: I1008 21:50:56.474739 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:56 crc kubenswrapper[4739]: E1008 21:50:56.475004 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:56.974988856 +0000 UTC m=+156.800374606 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:56 crc kubenswrapper[4739]: I1008 21:50:56.577479 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:56 crc kubenswrapper[4739]: E1008 21:50:56.577863 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:57.077844199 +0000 UTC m=+156.903229949 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:56 crc kubenswrapper[4739]: I1008 21:50:56.593185 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dcmnq" event={"ID":"222ec19b-ce6c-4fe1-af9b-b5e297296171","Type":"ContainerStarted","Data":"d24122beac0627b71ac063d11b5a62ad77bfed8b12fbb9777c05bb58df3b7fba"} Oct 08 21:50:56 crc kubenswrapper[4739]: I1008 21:50:56.597346 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tcvn" event={"ID":"99d4be11-0ed9-432b-8300-51e96e354634","Type":"ContainerStarted","Data":"3123ee919f8d7d9f5714f0c64ad351d5ff70f3e31d92f5950d4f9c06ea4bd84c"} Oct 08 21:50:56 crc kubenswrapper[4739]: I1008 21:50:56.617942 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dcmnq" podStartSLOduration=6.617926193 podStartE2EDuration="6.617926193s" podCreationTimestamp="2025-10-08 21:50:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:56.617623875 +0000 UTC m=+156.443009625" watchObservedRunningTime="2025-10-08 21:50:56.617926193 +0000 UTC m=+156.443311943" Oct 08 21:50:56 crc kubenswrapper[4739]: I1008 21:50:56.620747 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kw88q" 
event={"ID":"1ed5fea0-a39d-4416-8391-612dd5149de4","Type":"ContainerStarted","Data":"b92ef334d636adc9e7dae45e06ddc61f0bd1be5a56af6a0f6a5c1078dc542732"} Oct 08 21:50:56 crc kubenswrapper[4739]: I1008 21:50:56.627995 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-rkgxg" event={"ID":"c88feee0-f87d-4234-a9f7-15bd3e2c8d0f","Type":"ContainerStarted","Data":"11d2534bd90b8c1e9ea4452639e764118c7d1bb2e4cc990e73d3f9e0627f77c9"} Oct 08 21:50:56 crc kubenswrapper[4739]: I1008 21:50:56.640599 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-lt4w6" event={"ID":"1d2e6b08-794d-4c6e-8dd5-1197ce2175a2","Type":"ContainerStarted","Data":"c2980586c43cd452607f5994beffc328c304211265250ab7fe76e46adc4d95f9"} Oct 08 21:50:56 crc kubenswrapper[4739]: I1008 21:50:56.655225 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-klt8b" event={"ID":"c8c70167-b995-419b-afbe-a58dd78ebe48","Type":"ContainerStarted","Data":"c6ee61206ab28b87a75075f0062976c8c91b5b4f6cf03bdacacc20def98b7127"} Oct 08 21:50:56 crc kubenswrapper[4739]: I1008 21:50:56.655545 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-klt8b" event={"ID":"c8c70167-b995-419b-afbe-a58dd78ebe48","Type":"ContainerStarted","Data":"3f520511bb465f799ffb7303ff449776672504f991f7f8a429567ee79c45a90a"} Oct 08 21:50:56 crc kubenswrapper[4739]: I1008 21:50:56.669536 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pc6pd" event={"ID":"5205eeb0-f3c3-460a-8512-0af719bbf18a","Type":"ContainerStarted","Data":"4278469fd33de092e41d1df322fc99e7d7324d5b9d46f0a52f2a5196af611ee3"} Oct 08 21:50:56 crc kubenswrapper[4739]: I1008 21:50:56.670273 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-pc6pd" Oct 08 21:50:56 crc kubenswrapper[4739]: 
I1008 21:50:56.678390 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:56 crc kubenswrapper[4739]: E1008 21:50:56.678793 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:57.178775805 +0000 UTC m=+157.004161555 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:56 crc kubenswrapper[4739]: I1008 21:50:56.679253 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:56 crc kubenswrapper[4739]: E1008 21:50:56.680773 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-08 21:50:57.180764505 +0000 UTC m=+157.006150255 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:56 crc kubenswrapper[4739]: I1008 21:50:56.727413 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-pc6pd" podStartSLOduration=6.7273954719999995 podStartE2EDuration="6.727395472s" podCreationTimestamp="2025-10-08 21:50:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:56.726624272 +0000 UTC m=+156.552010032" watchObservedRunningTime="2025-10-08 21:50:56.727395472 +0000 UTC m=+156.552781222" Oct 08 21:50:56 crc kubenswrapper[4739]: I1008 21:50:56.729455 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kw88q" podStartSLOduration=129.729447813 podStartE2EDuration="2m9.729447813s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:56.688369235 +0000 UTC m=+156.513754985" watchObservedRunningTime="2025-10-08 21:50:56.729447813 +0000 UTC m=+156.554833563" Oct 08 21:50:56 crc kubenswrapper[4739]: I1008 21:50:56.774937 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m47gp" 
event={"ID":"6eab7118-dbb8-4fae-a999-f50fe9cafd3d","Type":"ContainerStarted","Data":"e03221ad961974a61a14aa00351015879c003e34010d894277185d630f59f37e"} Oct 08 21:50:56 crc kubenswrapper[4739]: I1008 21:50:56.779998 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:56 crc kubenswrapper[4739]: E1008 21:50:56.780259 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:57.280243275 +0000 UTC m=+157.105629025 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:56 crc kubenswrapper[4739]: I1008 21:50:56.785717 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-klt8b" podStartSLOduration=129.785702472 podStartE2EDuration="2m9.785702472s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:56.771379312 +0000 UTC m=+156.596765062" watchObservedRunningTime="2025-10-08 21:50:56.785702472 +0000 UTC m=+156.611088222" Oct 08 21:50:56 
crc kubenswrapper[4739]: I1008 21:50:56.805039 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-bdt5r" event={"ID":"6752e659-e823-46c0-bf41-a65c51d20b88","Type":"ContainerStarted","Data":"5aae8a030822fff031bdc0d132f43b3b74ffb52f7a755a62b8585e1d8605e821"} Oct 08 21:50:56 crc kubenswrapper[4739]: I1008 21:50:56.835604 4739 patch_prober.go:28] interesting pod/router-default-5444994796-zrm4p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 21:50:56 crc kubenswrapper[4739]: [-]has-synced failed: reason withheld Oct 08 21:50:56 crc kubenswrapper[4739]: [+]process-running ok Oct 08 21:50:56 crc kubenswrapper[4739]: healthz check failed Oct 08 21:50:56 crc kubenswrapper[4739]: I1008 21:50:56.835662 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zrm4p" podUID="44d375d6-9c45-466a-a423-b87b33bc63dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 21:50:56 crc kubenswrapper[4739]: I1008 21:50:56.860075 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5zgjg" event={"ID":"c3738927-508b-431d-b0a9-8762d261f7f5","Type":"ContainerStarted","Data":"e377bf01e56c8c6fb42fd2892c58aec9b7533102327b6488ff880a88b4d7977f"} Oct 08 21:50:56 crc kubenswrapper[4739]: I1008 21:50:56.892166 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:56 
crc kubenswrapper[4739]: E1008 21:50:56.893667 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:57.393649903 +0000 UTC m=+157.219035653 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:56 crc kubenswrapper[4739]: I1008 21:50:56.949103 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmrdt" event={"ID":"602cffd4-e6ff-442c-80e1-2122748972d1","Type":"ContainerStarted","Data":"e5da904e2aaf7306a52f26bb182d32492877eb9d8436230ee58d76e38b48d30f"} Oct 08 21:50:56 crc kubenswrapper[4739]: I1008 21:50:56.949415 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmrdt" Oct 08 21:50:56 crc kubenswrapper[4739]: I1008 21:50:56.967544 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bdrcp" event={"ID":"998da9f7-3775-47d2-a5ab-6848f9ecc779","Type":"ContainerStarted","Data":"eba83c2071604181306171e76d58c8c40edfd31c53456a943e1665f295bdc774"} Oct 08 21:50:56 crc kubenswrapper[4739]: I1008 21:50:56.994733 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:56 crc kubenswrapper[4739]: E1008 21:50:56.996075 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:57.496053286 +0000 UTC m=+157.321439036 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.008899 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-bdt5r" podStartSLOduration=7.008880377 podStartE2EDuration="7.008880377s" podCreationTimestamp="2025-10-08 21:50:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:56.916278159 +0000 UTC m=+156.741663909" watchObservedRunningTime="2025-10-08 21:50:57.008880377 +0000 UTC m=+156.834266127" Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.009342 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5zgjg" podStartSLOduration=130.009338388 podStartE2EDuration="2m10.009338388s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-10-08 21:50:56.977564383 +0000 UTC m=+156.802950133" watchObservedRunningTime="2025-10-08 21:50:57.009338388 +0000 UTC m=+156.834724138" Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.013033 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmrdt" Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.030121 4739 generic.go:334] "Generic (PLEG): container finished" podID="d5a65c5e-2929-4d8d-bebb-67af1702dbd4" containerID="3ab5558af89017756bcbc32cac3ff399efd75d13e1ec09030b876ce3f346f282" exitCode=0 Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.030230 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" event={"ID":"d5a65c5e-2929-4d8d-bebb-67af1702dbd4","Type":"ContainerDied","Data":"3ab5558af89017756bcbc32cac3ff399efd75d13e1ec09030b876ce3f346f282"} Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.036401 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmrdt" podStartSLOduration=130.036382635 podStartE2EDuration="2m10.036382635s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:57.035761129 +0000 UTC m=+156.861146879" watchObservedRunningTime="2025-10-08 21:50:57.036382635 +0000 UTC m=+156.861768385" Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.057701 4739 generic.go:334] "Generic (PLEG): container finished" podID="0e25dea3-95bd-4316-baef-f9bf7726b8e7" containerID="0697754f8a4a8387b37ba72d6021f394d859abe4641b421d9111336c083af3e9" exitCode=0 Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.057760 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-kv4v6" event={"ID":"0e25dea3-95bd-4316-baef-f9bf7726b8e7","Type":"ContainerDied","Data":"0697754f8a4a8387b37ba72d6021f394d859abe4641b421d9111336c083af3e9"} Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.057784 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kv4v6" event={"ID":"0e25dea3-95bd-4316-baef-f9bf7726b8e7","Type":"ContainerStarted","Data":"8bcf6d1c5ae85571e80ce7471aadada724a97f9823ee98ecbf8a826e7ce23e83"} Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.066695 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mgssn" event={"ID":"41b4cb00-53d9-4481-9d7c-3368a9d8d9f3","Type":"ContainerStarted","Data":"eaebabe100546a5433e54bf0244a2a7ad7ba28be9217992d9d4a8c3c1f4c366d"} Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.081508 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4mkhn" event={"ID":"ff6946a6-cb0c-4be1-ba1d-319a64b6eba3","Type":"ContainerStarted","Data":"bfc84da2cbafea7674dc5aae046648a87228156a1999e69892bc8e934ed78e1b"} Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.097971 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:57 crc kubenswrapper[4739]: E1008 21:50:57.099236 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-08 21:50:57.599224218 +0000 UTC m=+157.424609958 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.101130 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8fsdr" event={"ID":"9bab544b-7e6a-4781-a7ce-73a6a40fe752","Type":"ContainerStarted","Data":"54501adc1ec49f9818573542682172ead80ea9b927fa8095013c8e3d509b7478"} Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.200069 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:57 crc kubenswrapper[4739]: E1008 21:50:57.201092 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:57.701072896 +0000 UTC m=+157.526458646 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.224018 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" podStartSLOduration=130.22399827 podStartE2EDuration="2m10.22399827s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:57.162362328 +0000 UTC m=+156.987748078" watchObservedRunningTime="2025-10-08 21:50:57.22399827 +0000 UTC m=+157.049384020" Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.236466 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qdvnh" event={"ID":"f6fdf42e-8623-4102-8d38-c22c6c3d6978","Type":"ContainerStarted","Data":"bb654af0e28fad31721df732c99f9c7cef47ab5cbcb1bd559acaa09ca060cf99"} Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.236503 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qdvnh" event={"ID":"f6fdf42e-8623-4102-8d38-c22c6c3d6978","Type":"ContainerStarted","Data":"55c8e7e1e7b63cacbe4fe63f2fae768f9bd9a7c5587fbc2caaac5f779e8be48a"} Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.236516 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-qdvnh" Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.247301 4739 patch_prober.go:28] interesting pod/downloads-7954f5f757-qdvnh 
container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.247358 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qdvnh" podUID="f6fdf42e-8623-4102-8d38-c22c6c3d6978" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.298454 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-n8lrt" event={"ID":"e3d665b0-ab57-47b7-9a58-9c6c150d6105","Type":"ContainerStarted","Data":"9944988e55a1de20cf7da1af5838a1cb5af1f738fbd17aa34eec263e0958729f"} Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.298504 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-n8lrt" event={"ID":"e3d665b0-ab57-47b7-9a58-9c6c150d6105","Type":"ContainerStarted","Data":"34780b843b9b80baf695a65c9e62cfa5c19dfd149662c3f9b85021f5a17d2d8c"} Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.300748 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mgssn" podStartSLOduration=130.3007355 podStartE2EDuration="2m10.3007355s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:57.286548855 +0000 UTC m=+157.111934605" watchObservedRunningTime="2025-10-08 21:50:57.3007355 +0000 UTC m=+157.126121250" Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.301891 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:57 crc kubenswrapper[4739]: E1008 21:50:57.317441 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:57.817427418 +0000 UTC m=+157.642813168 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.325258 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-qdvnh" podStartSLOduration=130.325240364 podStartE2EDuration="2m10.325240364s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:57.322875434 +0000 UTC m=+157.148261204" watchObservedRunningTime="2025-10-08 21:50:57.325240364 +0000 UTC m=+157.150626114" Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.339792 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hmj4f" 
event={"ID":"015086a5-5aff-4732-a198-26d0b29d1253","Type":"ContainerStarted","Data":"f7902bf36ff811798f4f857fe848eec0552cf9ecc7d5fd197c7b052bcfa6ef57"} Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.386223 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-fv8l4" event={"ID":"dbfbbfca-3095-4a7b-869e-70b1a86046c4","Type":"ContainerStarted","Data":"a429c2275757cf6d07836ae24590fa69492c483420634fdfc6515b2cc7ef64d8"} Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.390940 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-n8lrt" podStartSLOduration=131.390918348 podStartE2EDuration="2m11.390918348s" podCreationTimestamp="2025-10-08 21:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:57.387636176 +0000 UTC m=+157.213021926" watchObservedRunningTime="2025-10-08 21:50:57.390918348 +0000 UTC m=+157.216304098" Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.398865 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bqplh" event={"ID":"26f8967c-81df-4899-b6d5-825e39644e01","Type":"ContainerStarted","Data":"1350cc4a0f3fb1c7c3520cf283911a925de43c53d34a892c324634ef56368563"} Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.399686 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bqplh" Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.400990 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8c2dn" event={"ID":"9d8b872b-6b5c-4091-8a35-8f2bd3257243","Type":"ContainerStarted","Data":"e44aa7ade74ae6be6ff01b8fead68f287eefc54feee476c7b1d87778b96ad1d8"} Oct 08 
21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.401013 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8c2dn" event={"ID":"9d8b872b-6b5c-4091-8a35-8f2bd3257243","Type":"ContainerStarted","Data":"60db5a72ba5bdfe6fb6ffcb8098b7b36f7704ca3aeee4d5e3f937530cf3de43c"} Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.402601 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:57 crc kubenswrapper[4739]: E1008 21:50:57.402908 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:57.902895037 +0000 UTC m=+157.728280787 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.402987 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.403119 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" event={"ID":"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b","Type":"ContainerStarted","Data":"a9a48406ac5526bc9d6bbabe675fbaa0ac6f953a6d7b594e9711eb376ddd6fd1"} Oct 08 21:50:57 crc kubenswrapper[4739]: E1008 21:50:57.405049 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:57.905035311 +0000 UTC m=+157.730421061 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.419649 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" event={"ID":"ce03f742-2910-4d6c-af9b-97abf28c6fbc","Type":"ContainerStarted","Data":"e77de28f39851e6eda2ff68d3d8d242a3a4cea92444e93d22f7dc1d641c038e1"} Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.432283 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-884f6" event={"ID":"295df770-129a-4aca-a189-3f4d77e97fc9","Type":"ContainerStarted","Data":"7afb392944dd268db9bb07d172c4d51aad14aefa5fe4c5c1db7cf1c880501879"} Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.433118 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-884f6" Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.442289 4739 patch_prober.go:28] interesting pod/console-operator-58897d9998-884f6 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.442341 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-884f6" podUID="295df770-129a-4aca-a189-3f4d77e97fc9" containerName="console-operator" probeResult="failure" output="Get 
\"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.454515 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tqmgf" event={"ID":"9cc05cba-c864-4438-9740-d9be599131d7","Type":"ContainerStarted","Data":"cafc9c3434146772a318fe677fee3348f19bce19556190f39a8fad289c1d1f82"} Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.454902 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tqmgf" event={"ID":"9cc05cba-c864-4438-9740-d9be599131d7","Type":"ContainerStarted","Data":"9a8a359392cde53b29d4cfe8dd785ba22907808cd65b6842a4e7241d7bd0b933"} Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.481951 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8679r" event={"ID":"abd1c1de-f12b-48d3-9687-54025a7daa56","Type":"ContainerStarted","Data":"648fbe96bc6bdba6d99085354cf1182a51aea5b6523e6b3bf4c376d127e62f2c"} Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.484005 4739 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qpcfx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.484049 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qpcfx" podUID="34b38b7a-4e93-49f1-907e-24fc371f31e3" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.503582 4739 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:57 crc kubenswrapper[4739]: E1008 21:50:57.503928 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:58.003914026 +0000 UTC m=+157.829299776 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.511493 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hmj4f" podStartSLOduration=130.511475455 podStartE2EDuration="2m10.511475455s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:57.504139971 +0000 UTC m=+157.329525721" watchObservedRunningTime="2025-10-08 21:50:57.511475455 +0000 UTC m=+157.336861205" Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.512059 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-fv8l4" podStartSLOduration=130.512053469 
podStartE2EDuration="2m10.512053469s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:57.444739264 +0000 UTC m=+157.270125014" watchObservedRunningTime="2025-10-08 21:50:57.512053469 +0000 UTC m=+157.337439219" Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.530113 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hc74g" Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.552383 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-884f6" podStartSLOduration=131.552368478 podStartE2EDuration="2m11.552368478s" podCreationTimestamp="2025-10-08 21:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:57.551588489 +0000 UTC m=+157.376974239" watchObservedRunningTime="2025-10-08 21:50:57.552368478 +0000 UTC m=+157.377754228" Oct 08 21:50:57 crc kubenswrapper[4739]: E1008 21:50:57.609660 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:58.109647201 +0000 UTC m=+157.935032951 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.609137 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.631353 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tqmgf" podStartSLOduration=131.631336505 podStartE2EDuration="2m11.631336505s" podCreationTimestamp="2025-10-08 21:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:57.631035996 +0000 UTC m=+157.456421746" watchObservedRunningTime="2025-10-08 21:50:57.631336505 +0000 UTC m=+157.456722255" Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.631904 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bqplh" podStartSLOduration=130.631899378 podStartE2EDuration="2m10.631899378s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 
21:50:57.604035311 +0000 UTC m=+157.429421061" watchObservedRunningTime="2025-10-08 21:50:57.631899378 +0000 UTC m=+157.457285128" Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.654403 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.658776 4739 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-hsz75 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.658813 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" podUID="d5a65c5e-2929-4d8d-bebb-67af1702dbd4" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused" Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.658886 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.697938 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8c2dn" podStartSLOduration=131.69792289 podStartE2EDuration="2m11.69792289s" podCreationTimestamp="2025-10-08 21:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:57.663296344 +0000 UTC m=+157.488682094" watchObservedRunningTime="2025-10-08 21:50:57.69792289 +0000 UTC m=+157.523308640" Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.725005 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:57 crc kubenswrapper[4739]: E1008 21:50:57.725396 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:58.225379018 +0000 UTC m=+158.050764768 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.827402 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:57 crc kubenswrapper[4739]: E1008 21:50:57.827818 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:58.327802051 +0000 UTC m=+158.153187801 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.834397 4739 patch_prober.go:28] interesting pod/router-default-5444994796-zrm4p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 21:50:57 crc kubenswrapper[4739]: [-]has-synced failed: reason withheld Oct 08 21:50:57 crc kubenswrapper[4739]: [+]process-running ok Oct 08 21:50:57 crc kubenswrapper[4739]: healthz check failed Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.834460 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zrm4p" podUID="44d375d6-9c45-466a-a423-b87b33bc63dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.929344 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:57 crc kubenswrapper[4739]: E1008 21:50:57.929532 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-08 21:50:58.429506117 +0000 UTC m=+158.254891867 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:57 crc kubenswrapper[4739]: I1008 21:50:57.929665 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:57 crc kubenswrapper[4739]: E1008 21:50:57.929980 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:58.429971208 +0000 UTC m=+158.255356958 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.031051 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:58 crc kubenswrapper[4739]: E1008 21:50:58.031321 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:58.531295513 +0000 UTC m=+158.356681263 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.031377 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:58 crc kubenswrapper[4739]: E1008 21:50:58.031719 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:58.531711345 +0000 UTC m=+158.357097085 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.132606 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:58 crc kubenswrapper[4739]: E1008 21:50:58.132814 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:58.632785964 +0000 UTC m=+158.458171714 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.132881 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:58 crc kubenswrapper[4739]: E1008 21:50:58.133176 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:58.633161683 +0000 UTC m=+158.458547433 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.234210 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:58 crc kubenswrapper[4739]: E1008 21:50:58.234410 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:58.734383776 +0000 UTC m=+158.559769526 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.234512 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:58 crc kubenswrapper[4739]: E1008 21:50:58.234770 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:58.734763356 +0000 UTC m=+158.560149106 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.335852 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:58 crc kubenswrapper[4739]: E1008 21:50:58.336081 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:58.836047901 +0000 UTC m=+158.661433661 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.336315 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:58 crc kubenswrapper[4739]: E1008 21:50:58.336588 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:58.836578513 +0000 UTC m=+158.661964263 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.437338 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:58 crc kubenswrapper[4739]: E1008 21:50:58.437552 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:58.93752582 +0000 UTC m=+158.762911570 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.437608 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:58 crc kubenswrapper[4739]: E1008 21:50:58.437888 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:58.937876549 +0000 UTC m=+158.763262299 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.506329 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kv4v6" event={"ID":"0e25dea3-95bd-4316-baef-f9bf7726b8e7","Type":"ContainerStarted","Data":"3bf84866faec855e2a1edce07d0cf2cc7485ffb516caa367f141b2b820f841dc"} Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.506668 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kv4v6" Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.509444 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-lt4w6" event={"ID":"1d2e6b08-794d-4c6e-8dd5-1197ce2175a2","Type":"ContainerStarted","Data":"618b1ca4dd2e9c93569440289b35d9d45e5eea9337d58f1e3e1cca6ae2313ef6"} Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.524400 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-fv8l4" event={"ID":"dbfbbfca-3095-4a7b-869e-70b1a86046c4","Type":"ContainerStarted","Data":"6f67e28d93ba7dcaa2e33ca721836c5d3d40506f1f2686e98b7538e9908c1374"} Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.527251 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8fsdr" 
event={"ID":"9bab544b-7e6a-4781-a7ce-73a6a40fe752","Type":"ContainerStarted","Data":"e1aa59ec9492f4628670bfa0fe335768bfa567ca016db89ce415c46ae034489d"} Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.528178 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8fsdr" Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.529832 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" event={"ID":"ce03f742-2910-4d6c-af9b-97abf28c6fbc","Type":"ContainerStarted","Data":"bad6c7897f04f4de73485849258cf80195ef7d00238451c2d3bfd3770152e7cd"} Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.535784 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-rkgxg" event={"ID":"c88feee0-f87d-4234-a9f7-15bd3e2c8d0f","Type":"ContainerStarted","Data":"925130d169962d8d333cce92446bf021b5a73cae5ba169eb0118bd7f22b1a4ef"} Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.539510 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:58 crc kubenswrapper[4739]: E1008 21:50:58.540021 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:59.039991455 +0000 UTC m=+158.865377205 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.542025 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kv4v6" podStartSLOduration=132.542001705 podStartE2EDuration="2m12.542001705s" podCreationTimestamp="2025-10-08 21:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:58.534649951 +0000 UTC m=+158.360035721" watchObservedRunningTime="2025-10-08 21:50:58.542001705 +0000 UTC m=+158.367387455" Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.542507 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m47gp" event={"ID":"6eab7118-dbb8-4fae-a999-f50fe9cafd3d","Type":"ContainerStarted","Data":"8266dd4f462e33cd8762d010ba1ec5bd74f1e7b8c8b688866f6c6c26adb5d9d4"} Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.549941 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" event={"ID":"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b","Type":"ContainerStarted","Data":"6c6bb0b07e755ed4f54382b9ae9c8a591b61a033a4206ab3ffd2585c23338705"} Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.550549 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.552590 
4739 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-7fzvv container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.18:6443/healthz\": dial tcp 10.217.0.18:6443: connect: connection refused" start-of-body=
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.552637 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" podUID="7fc5c94d-b325-4b06-bbce-ed6f0792fc7b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.18:6443/healthz\": dial tcp 10.217.0.18:6443: connect: connection refused"
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.559345 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tcvn" event={"ID":"99d4be11-0ed9-432b-8300-51e96e354634","Type":"ContainerStarted","Data":"1b83d8ec1eba117d09cd96309a14a14880da21664433d53d1c3deff04ad7477f"}
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.572657 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bqplh" event={"ID":"26f8967c-81df-4899-b6d5-825e39644e01","Type":"ContainerStarted","Data":"64e1e7b560803d672f68063270c731ff8e6be0b511cbb0e31702ea13b5d86c75"}
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.574393 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" podStartSLOduration=132.574379415 podStartE2EDuration="2m12.574379415s" podCreationTimestamp="2025-10-08 21:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:58.566001275 +0000 UTC m=+158.391387025" watchObservedRunningTime="2025-10-08 21:50:58.574379415 +0000 UTC m=+158.399765165"
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.578508 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bdrcp" event={"ID":"998da9f7-3775-47d2-a5ab-6848f9ecc779","Type":"ContainerStarted","Data":"5cebbca99873dad8160d7439b9b941b7c89995024ad0130798d5347647f6d795"}
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.589284 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" event={"ID":"d5a65c5e-2929-4d8d-bebb-67af1702dbd4","Type":"ContainerStarted","Data":"ed8dd0ab344296021990217a808715b7f2ca8721dc46e98e1a32a6994dca6086"}
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.593569 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8fsdr" podStartSLOduration=131.593550125 podStartE2EDuration="2m11.593550125s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:58.59217356 +0000 UTC m=+158.417559320" watchObservedRunningTime="2025-10-08 21:50:58.593550125 +0000 UTC m=+158.418935875"
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.600609 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8679r" event={"ID":"abd1c1de-f12b-48d3-9687-54025a7daa56","Type":"ContainerStarted","Data":"72ca18c0a045e45ce654e20f2e46c5d62e054697962cc06626cb6d1d07fb8df2"}
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.600654 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8679r" event={"ID":"abd1c1de-f12b-48d3-9687-54025a7daa56","Type":"ContainerStarted","Data":"27107a089ee5dd2c93499c8668f7bb8c71e150489f2e75822adae354bdfd6ec9"}
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.606106 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hmj4f" event={"ID":"015086a5-5aff-4732-a198-26d0b29d1253","Type":"ContainerStarted","Data":"7d2fd0817988e906077fdf813137ccd1f07a21d7bfb1eca81198be4946941779"}
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.608279 4739 patch_prober.go:28] interesting pod/downloads-7954f5f757-qdvnh container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body=
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.608355 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qdvnh" podUID="f6fdf42e-8623-4102-8d38-c22c6c3d6978" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused"
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.613577 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-lt4w6" podStartSLOduration=131.613555335 podStartE2EDuration="2m11.613555335s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:58.611200096 +0000 UTC m=+158.436585836" watchObservedRunningTime="2025-10-08 21:50:58.613555335 +0000 UTC m=+158.438941105"
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.636058 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tcvn" podStartSLOduration=131.636043638 podStartE2EDuration="2m11.636043638s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:58.633980127 +0000 UTC m=+158.459365877" watchObservedRunningTime="2025-10-08 21:50:58.636043638 +0000 UTC m=+158.461429388"
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.642316 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2"
Oct 08 21:50:58 crc kubenswrapper[4739]: E1008 21:50:58.647022 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:59.147007653 +0000 UTC m=+158.972393403 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.720199 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" podStartSLOduration=132.720183494 podStartE2EDuration="2m12.720183494s" podCreationTimestamp="2025-10-08 21:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:58.69963362 +0000 UTC m=+158.525019370" watchObservedRunningTime="2025-10-08 21:50:58.720183494 +0000 UTC m=+158.545569244"
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.721212 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m47gp" podStartSLOduration=131.721207269 podStartE2EDuration="2m11.721207269s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:58.719602779 +0000 UTC m=+158.544988529" watchObservedRunningTime="2025-10-08 21:50:58.721207269 +0000 UTC m=+158.546593019"
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.744705 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 21:50:58 crc kubenswrapper[4739]: E1008 21:50:58.745004 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:59.244988775 +0000 UTC m=+159.070374525 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.756572 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-rkgxg" podStartSLOduration=131.756555514 podStartE2EDuration="2m11.756555514s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:58.755262401 +0000 UTC m=+158.580648151" watchObservedRunningTime="2025-10-08 21:50:58.756555514 +0000 UTC m=+158.581941274"
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.829714 4739 patch_prober.go:28] interesting pod/router-default-5444994796-zrm4p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 08 21:50:58 crc kubenswrapper[4739]: [-]has-synced failed: reason withheld
Oct 08 21:50:58 crc kubenswrapper[4739]: [+]process-running ok
Oct 08 21:50:58 crc kubenswrapper[4739]: healthz check failed
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.829969 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zrm4p" podUID="44d375d6-9c45-466a-a423-b87b33bc63dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.847103 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2"
Oct 08 21:50:58 crc kubenswrapper[4739]: E1008 21:50:58.847454 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:59.347442759 +0000 UTC m=+159.172828509 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.906252 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-8679r" podStartSLOduration=131.90623695 podStartE2EDuration="2m11.90623695s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:50:58.793309553 +0000 UTC m=+158.618695303" watchObservedRunningTime="2025-10-08 21:50:58.90623695 +0000 UTC m=+158.731622700"
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.906834 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-867zz"]
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.909026 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-867zz"
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.911659 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.937224 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-867zz"]
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.953566 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.953842 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6twt4\" (UniqueName: \"kubernetes.io/projected/e75f5e41-357c-47c7-b000-1103fd2b9756-kube-api-access-6twt4\") pod \"certified-operators-867zz\" (UID: \"e75f5e41-357c-47c7-b000-1103fd2b9756\") " pod="openshift-marketplace/certified-operators-867zz"
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.953871 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e75f5e41-357c-47c7-b000-1103fd2b9756-utilities\") pod \"certified-operators-867zz\" (UID: \"e75f5e41-357c-47c7-b000-1103fd2b9756\") " pod="openshift-marketplace/certified-operators-867zz"
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.953931 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e75f5e41-357c-47c7-b000-1103fd2b9756-catalog-content\") pod \"certified-operators-867zz\" (UID: \"e75f5e41-357c-47c7-b000-1103fd2b9756\") " pod="openshift-marketplace/certified-operators-867zz"
Oct 08 21:50:58 crc kubenswrapper[4739]: E1008 21:50:58.954084 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:59.454070657 +0000 UTC m=+159.279456407 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 21:50:58 crc kubenswrapper[4739]: I1008 21:50:58.960919 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-884f6"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.056896 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e75f5e41-357c-47c7-b000-1103fd2b9756-catalog-content\") pod \"certified-operators-867zz\" (UID: \"e75f5e41-357c-47c7-b000-1103fd2b9756\") " pod="openshift-marketplace/certified-operators-867zz"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.056948 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.057016 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6twt4\" (UniqueName: \"kubernetes.io/projected/e75f5e41-357c-47c7-b000-1103fd2b9756-kube-api-access-6twt4\") pod \"certified-operators-867zz\" (UID: \"e75f5e41-357c-47c7-b000-1103fd2b9756\") " pod="openshift-marketplace/certified-operators-867zz"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.057043 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e75f5e41-357c-47c7-b000-1103fd2b9756-utilities\") pod \"certified-operators-867zz\" (UID: \"e75f5e41-357c-47c7-b000-1103fd2b9756\") " pod="openshift-marketplace/certified-operators-867zz"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.057497 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e75f5e41-357c-47c7-b000-1103fd2b9756-utilities\") pod \"certified-operators-867zz\" (UID: \"e75f5e41-357c-47c7-b000-1103fd2b9756\") " pod="openshift-marketplace/certified-operators-867zz"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.057738 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e75f5e41-357c-47c7-b000-1103fd2b9756-catalog-content\") pod \"certified-operators-867zz\" (UID: \"e75f5e41-357c-47c7-b000-1103fd2b9756\") " pod="openshift-marketplace/certified-operators-867zz"
Oct 08 21:50:59 crc kubenswrapper[4739]: E1008 21:50:59.058000 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:59.557987148 +0000 UTC m=+159.383372898 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.095056 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6twt4\" (UniqueName: \"kubernetes.io/projected/e75f5e41-357c-47c7-b000-1103fd2b9756-kube-api-access-6twt4\") pod \"certified-operators-867zz\" (UID: \"e75f5e41-357c-47c7-b000-1103fd2b9756\") " pod="openshift-marketplace/certified-operators-867zz"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.123343 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lnwlb"]
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.126550 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lnwlb"]
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.126869 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lnwlb"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.138379 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.158038 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.158260 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ac894a1-e4ba-4a6b-8ac2-3a24ef886834-catalog-content\") pod \"community-operators-lnwlb\" (UID: \"5ac894a1-e4ba-4a6b-8ac2-3a24ef886834\") " pod="openshift-marketplace/community-operators-lnwlb"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.158286 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bndgf\" (UniqueName: \"kubernetes.io/projected/5ac894a1-e4ba-4a6b-8ac2-3a24ef886834-kube-api-access-bndgf\") pod \"community-operators-lnwlb\" (UID: \"5ac894a1-e4ba-4a6b-8ac2-3a24ef886834\") " pod="openshift-marketplace/community-operators-lnwlb"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.158323 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ac894a1-e4ba-4a6b-8ac2-3a24ef886834-utilities\") pod \"community-operators-lnwlb\" (UID: \"5ac894a1-e4ba-4a6b-8ac2-3a24ef886834\") " pod="openshift-marketplace/community-operators-lnwlb"
Oct 08 21:50:59 crc kubenswrapper[4739]: E1008 21:50:59.158455 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:59.658439721 +0000 UTC m=+159.483825471 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.241433 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-867zz"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.259580 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ac894a1-e4ba-4a6b-8ac2-3a24ef886834-catalog-content\") pod \"community-operators-lnwlb\" (UID: \"5ac894a1-e4ba-4a6b-8ac2-3a24ef886834\") " pod="openshift-marketplace/community-operators-lnwlb"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.259620 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bndgf\" (UniqueName: \"kubernetes.io/projected/5ac894a1-e4ba-4a6b-8ac2-3a24ef886834-kube-api-access-bndgf\") pod \"community-operators-lnwlb\" (UID: \"5ac894a1-e4ba-4a6b-8ac2-3a24ef886834\") " pod="openshift-marketplace/community-operators-lnwlb"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.259644 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.259666 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ac894a1-e4ba-4a6b-8ac2-3a24ef886834-utilities\") pod \"community-operators-lnwlb\" (UID: \"5ac894a1-e4ba-4a6b-8ac2-3a24ef886834\") " pod="openshift-marketplace/community-operators-lnwlb"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.260043 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ac894a1-e4ba-4a6b-8ac2-3a24ef886834-utilities\") pod \"community-operators-lnwlb\" (UID: \"5ac894a1-e4ba-4a6b-8ac2-3a24ef886834\") " pod="openshift-marketplace/community-operators-lnwlb"
Oct 08 21:50:59 crc kubenswrapper[4739]: E1008 21:50:59.260313 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:59.760298321 +0000 UTC m=+159.585684071 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.260522 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ac894a1-e4ba-4a6b-8ac2-3a24ef886834-catalog-content\") pod \"community-operators-lnwlb\" (UID: \"5ac894a1-e4ba-4a6b-8ac2-3a24ef886834\") " pod="openshift-marketplace/community-operators-lnwlb"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.286177 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7wh9v"]
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.286957 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bndgf\" (UniqueName: \"kubernetes.io/projected/5ac894a1-e4ba-4a6b-8ac2-3a24ef886834-kube-api-access-bndgf\") pod \"community-operators-lnwlb\" (UID: \"5ac894a1-e4ba-4a6b-8ac2-3a24ef886834\") " pod="openshift-marketplace/community-operators-lnwlb"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.287548 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7wh9v"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.299577 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7wh9v"]
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.364634 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.364836 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e59564a7-a703-4b18-a5cc-f231f8a8232b-catalog-content\") pod \"certified-operators-7wh9v\" (UID: \"e59564a7-a703-4b18-a5cc-f231f8a8232b\") " pod="openshift-marketplace/certified-operators-7wh9v"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.364902 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e59564a7-a703-4b18-a5cc-f231f8a8232b-utilities\") pod \"certified-operators-7wh9v\" (UID: \"e59564a7-a703-4b18-a5cc-f231f8a8232b\") " pod="openshift-marketplace/certified-operators-7wh9v"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.364922 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98g5v\" (UniqueName: \"kubernetes.io/projected/e59564a7-a703-4b18-a5cc-f231f8a8232b-kube-api-access-98g5v\") pod \"certified-operators-7wh9v\" (UID: \"e59564a7-a703-4b18-a5cc-f231f8a8232b\") " pod="openshift-marketplace/certified-operators-7wh9v"
Oct 08 21:50:59 crc kubenswrapper[4739]: E1008 21:50:59.365072 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:50:59.865028982 +0000 UTC m=+159.690414732 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.460706 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lnwlb"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.465990 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.466034 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e59564a7-a703-4b18-a5cc-f231f8a8232b-catalog-content\") pod \"certified-operators-7wh9v\" (UID: \"e59564a7-a703-4b18-a5cc-f231f8a8232b\") " pod="openshift-marketplace/certified-operators-7wh9v"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.466118 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e59564a7-a703-4b18-a5cc-f231f8a8232b-utilities\") pod \"certified-operators-7wh9v\" (UID: \"e59564a7-a703-4b18-a5cc-f231f8a8232b\") " pod="openshift-marketplace/certified-operators-7wh9v"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.466161 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98g5v\" (UniqueName: \"kubernetes.io/projected/e59564a7-a703-4b18-a5cc-f231f8a8232b-kube-api-access-98g5v\") pod \"certified-operators-7wh9v\" (UID: \"e59564a7-a703-4b18-a5cc-f231f8a8232b\") " pod="openshift-marketplace/certified-operators-7wh9v"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.466816 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e59564a7-a703-4b18-a5cc-f231f8a8232b-catalog-content\") pod \"certified-operators-7wh9v\" (UID: \"e59564a7-a703-4b18-a5cc-f231f8a8232b\") " pod="openshift-marketplace/certified-operators-7wh9v"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.466844 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e59564a7-a703-4b18-a5cc-f231f8a8232b-utilities\") pod \"certified-operators-7wh9v\" (UID: \"e59564a7-a703-4b18-a5cc-f231f8a8232b\") " pod="openshift-marketplace/certified-operators-7wh9v"
Oct 08 21:50:59 crc kubenswrapper[4739]: E1008 21:50:59.468632 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:50:59.968233505 +0000 UTC m=+159.793619255 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.496499 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5vx9b"]
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.497454 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5vx9b"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.503503 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98g5v\" (UniqueName: \"kubernetes.io/projected/e59564a7-a703-4b18-a5cc-f231f8a8232b-kube-api-access-98g5v\") pod \"certified-operators-7wh9v\" (UID: \"e59564a7-a703-4b18-a5cc-f231f8a8232b\") " pod="openshift-marketplace/certified-operators-7wh9v"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.509636 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5vx9b"]
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.530770 4739 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-8fsdr container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.530826 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8fsdr" podUID="9bab544b-7e6a-4781-a7ce-73a6a40fe752" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.34:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.563287 4739 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.567529 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.567742 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m68qp\" (UniqueName: \"kubernetes.io/projected/c3158bce-349e-4143-93e7-42fd8f486e65-kube-api-access-m68qp\") pod \"community-operators-5vx9b\" (UID: \"c3158bce-349e-4143-93e7-42fd8f486e65\") " pod="openshift-marketplace/community-operators-5vx9b"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.567774 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3158bce-349e-4143-93e7-42fd8f486e65-catalog-content\") pod \"community-operators-5vx9b\" (UID: \"c3158bce-349e-4143-93e7-42fd8f486e65\") " pod="openshift-marketplace/community-operators-5vx9b"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.567815 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3158bce-349e-4143-93e7-42fd8f486e65-utilities\") pod \"community-operators-5vx9b\" (UID: \"c3158bce-349e-4143-93e7-42fd8f486e65\") " pod="openshift-marketplace/community-operators-5vx9b"
Oct 08 21:50:59 crc kubenswrapper[4739]: E1008 21:50:59.567954 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:51:00.06793892 +0000 UTC m=+159.893324670 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.648682 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bdrcp" event={"ID":"998da9f7-3775-47d2-a5ab-6848f9ecc779","Type":"ContainerStarted","Data":"96aa5417680d6a164247b6f1f383f0dcd262d57c7ccf9fb229e3d93b298c904f"}
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.650553 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bdrcp" event={"ID":"998da9f7-3775-47d2-a5ab-6848f9ecc779","Type":"ContainerStarted","Data":"62f9fe90a49d2eec5a114a66fa1492b3652f7275fcb1e055387406b0e6c80539"}
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.651464 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7wh9v"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.658516 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-867zz"]
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.671061 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3158bce-349e-4143-93e7-42fd8f486e65-catalog-content\") pod \"community-operators-5vx9b\" (UID: \"c3158bce-349e-4143-93e7-42fd8f486e65\") " pod="openshift-marketplace/community-operators-5vx9b"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.671107 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.671185 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3158bce-349e-4143-93e7-42fd8f486e65-utilities\") pod \"community-operators-5vx9b\" (UID: \"c3158bce-349e-4143-93e7-42fd8f486e65\") " pod="openshift-marketplace/community-operators-5vx9b"
Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.671936 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m68qp\" (UniqueName: \"kubernetes.io/projected/c3158bce-349e-4143-93e7-42fd8f486e65-kube-api-access-m68qp\") pod \"community-operators-5vx9b\" (UID: \"c3158bce-349e-4143-93e7-42fd8f486e65\") " pod="openshift-marketplace/community-operators-5vx9b"
Oct 08 21:50:59 crc kubenswrapper[4739]: E1008 21:50:59.673435 4739 nestedpendingoperations.go:348]
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:51:00.17342009 +0000 UTC m=+159.998805940 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.677294 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3158bce-349e-4143-93e7-42fd8f486e65-catalog-content\") pod \"community-operators-5vx9b\" (UID: \"c3158bce-349e-4143-93e7-42fd8f486e65\") " pod="openshift-marketplace/community-operators-5vx9b" Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.685895 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3158bce-349e-4143-93e7-42fd8f486e65-utilities\") pod \"community-operators-5vx9b\" (UID: \"c3158bce-349e-4143-93e7-42fd8f486e65\") " pod="openshift-marketplace/community-operators-5vx9b" Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.717203 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m68qp\" (UniqueName: \"kubernetes.io/projected/c3158bce-349e-4143-93e7-42fd8f486e65-kube-api-access-m68qp\") pod \"community-operators-5vx9b\" (UID: \"c3158bce-349e-4143-93e7-42fd8f486e65\") " pod="openshift-marketplace/community-operators-5vx9b" Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.765002 4739 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8fsdr" Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.772701 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:59 crc kubenswrapper[4739]: E1008 21:50:59.773004 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:51:00.272984601 +0000 UTC m=+160.098370351 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.832753 4739 patch_prober.go:28] interesting pod/router-default-5444994796-zrm4p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 21:50:59 crc kubenswrapper[4739]: [-]has-synced failed: reason withheld Oct 08 21:50:59 crc kubenswrapper[4739]: [+]process-running ok Oct 08 21:50:59 crc kubenswrapper[4739]: healthz check failed Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.832810 4739 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-zrm4p" podUID="44d375d6-9c45-466a-a423-b87b33bc63dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.868056 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5vx9b" Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.874965 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:50:59 crc kubenswrapper[4739]: E1008 21:50:59.875319 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:51:00.375304632 +0000 UTC m=+160.200690382 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.904247 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lnwlb"] Oct 08 21:50:59 crc kubenswrapper[4739]: W1008 21:50:59.908255 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ac894a1_e4ba_4a6b_8ac2_3a24ef886834.slice/crio-f18b27585b4699f1ea0a9d2bf803715b4bb5ba4fa40f8feb9c5034c8bd045a79 WatchSource:0}: Error finding container f18b27585b4699f1ea0a9d2bf803715b4bb5ba4fa40f8feb9c5034c8bd045a79: Status 404 returned error can't find the container with id f18b27585b4699f1ea0a9d2bf803715b4bb5ba4fa40f8feb9c5034c8bd045a79 Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.961272 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:50:59 crc kubenswrapper[4739]: I1008 21:50:59.981247 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:50:59 crc kubenswrapper[4739]: E1008 21:50:59.981561 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 21:51:00.481544971 +0000 UTC m=+160.306930711 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.033771 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.046516 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.048665 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.056723 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.056940 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.082456 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a935283-96a7-4954-85be-1b1aecb2d14c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6a935283-96a7-4954-85be-1b1aecb2d14c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.082499 4739 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a935283-96a7-4954-85be-1b1aecb2d14c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6a935283-96a7-4954-85be-1b1aecb2d14c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.082533 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:51:00 crc kubenswrapper[4739]: E1008 21:51:00.082842 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 21:51:00.582830395 +0000 UTC m=+160.408216145 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl6f2" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.131219 4739 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-08T21:50:59.563312004Z","Handler":null,"Name":""} Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.138094 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7wh9v"] Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.149327 4739 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.149366 4739 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.184704 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.184972 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/6a935283-96a7-4954-85be-1b1aecb2d14c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6a935283-96a7-4954-85be-1b1aecb2d14c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.185018 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a935283-96a7-4954-85be-1b1aecb2d14c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6a935283-96a7-4954-85be-1b1aecb2d14c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.185592 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a935283-96a7-4954-85be-1b1aecb2d14c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6a935283-96a7-4954-85be-1b1aecb2d14c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.207517 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.225876 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a935283-96a7-4954-85be-1b1aecb2d14c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6a935283-96a7-4954-85be-1b1aecb2d14c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.286304 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.290877 4739 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.290917 4739 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.307966 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5vx9b"] Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.411112 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.444997 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl6f2\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.609636 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.661090 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bdrcp" event={"ID":"998da9f7-3775-47d2-a5ab-6848f9ecc779","Type":"ContainerStarted","Data":"45e37e0072f92710c04fcc9d0ef2a37b35c45b1830eb7b829805dcd6898413b8"} Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.662831 4739 generic.go:334] "Generic (PLEG): container finished" podID="5ac894a1-e4ba-4a6b-8ac2-3a24ef886834" containerID="d163334af05a06761b41087d0b32e78d8fb3b8d49e562eeb43792ec03897efd5" exitCode=0 Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.663127 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnwlb" event={"ID":"5ac894a1-e4ba-4a6b-8ac2-3a24ef886834","Type":"ContainerDied","Data":"d163334af05a06761b41087d0b32e78d8fb3b8d49e562eeb43792ec03897efd5"} Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.663200 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnwlb" event={"ID":"5ac894a1-e4ba-4a6b-8ac2-3a24ef886834","Type":"ContainerStarted","Data":"f18b27585b4699f1ea0a9d2bf803715b4bb5ba4fa40f8feb9c5034c8bd045a79"} Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 
21:51:00.664566 4739 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.664569 4739 generic.go:334] "Generic (PLEG): container finished" podID="c3158bce-349e-4143-93e7-42fd8f486e65" containerID="3f3e80b9e2930d025839974d836568c1a2303b56ad0826e0ab438b445b125346" exitCode=0 Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.664627 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vx9b" event={"ID":"c3158bce-349e-4143-93e7-42fd8f486e65","Type":"ContainerDied","Data":"3f3e80b9e2930d025839974d836568c1a2303b56ad0826e0ab438b445b125346"} Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.664651 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vx9b" event={"ID":"c3158bce-349e-4143-93e7-42fd8f486e65","Type":"ContainerStarted","Data":"0fbf18683eed9e130a1f14e64919d637470401b5622195a15400f31fe64e7a33"} Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.666349 4739 generic.go:334] "Generic (PLEG): container finished" podID="e59564a7-a703-4b18-a5cc-f231f8a8232b" containerID="4563c57ffc125df447a465af3152b0cbb9e55a30787afaa200a55cd84d711d1b" exitCode=0 Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.666393 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wh9v" event={"ID":"e59564a7-a703-4b18-a5cc-f231f8a8232b","Type":"ContainerDied","Data":"4563c57ffc125df447a465af3152b0cbb9e55a30787afaa200a55cd84d711d1b"} Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.666424 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wh9v" event={"ID":"e59564a7-a703-4b18-a5cc-f231f8a8232b","Type":"ContainerStarted","Data":"45b9ca19918cf5164fe82ffd628a0516b2fd79cd47daf70197bc247cfc181fbd"} Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.669338 4739 
generic.go:334] "Generic (PLEG): container finished" podID="e75f5e41-357c-47c7-b000-1103fd2b9756" containerID="5b23ffdeed6fd2d1472b5e8ce2ba74a283f94caf2ecbe03ccdb72d816dceb75c" exitCode=0 Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.669910 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-867zz" event={"ID":"e75f5e41-357c-47c7-b000-1103fd2b9756","Type":"ContainerDied","Data":"5b23ffdeed6fd2d1472b5e8ce2ba74a283f94caf2ecbe03ccdb72d816dceb75c"} Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.669949 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-867zz" event={"ID":"e75f5e41-357c-47c7-b000-1103fd2b9756","Type":"ContainerStarted","Data":"6cf51f85c9f25f4c7acdb032e5ba92562efb8dcb697c40c2ea0225146a0900fc"} Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.687350 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-bdrcp" podStartSLOduration=10.687329504000001 podStartE2EDuration="10.687329504s" podCreationTimestamp="2025-10-08 21:50:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:51:00.683475527 +0000 UTC m=+160.508861287" watchObservedRunningTime="2025-10-08 21:51:00.687329504 +0000 UTC m=+160.512715254" Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.796556 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.828156 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fl6f2"] Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.833944 4739 patch_prober.go:28] interesting pod/router-default-5444994796-zrm4p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe 
failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 21:51:00 crc kubenswrapper[4739]: [-]has-synced failed: reason withheld Oct 08 21:51:00 crc kubenswrapper[4739]: [+]process-running ok Oct 08 21:51:00 crc kubenswrapper[4739]: healthz check failed Oct 08 21:51:00 crc kubenswrapper[4739]: I1008 21:51:00.834011 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zrm4p" podUID="44d375d6-9c45-466a-a423-b87b33bc63dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.074648 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sjq82"] Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.075731 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sjq82" Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.077952 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.085821 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sjq82"] Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.142779 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5gb9\" (UniqueName: \"kubernetes.io/projected/deed15fa-28ce-4057-a1d7-5f920fa4751b-kube-api-access-k5gb9\") pod \"redhat-marketplace-sjq82\" (UID: \"deed15fa-28ce-4057-a1d7-5f920fa4751b\") " pod="openshift-marketplace/redhat-marketplace-sjq82" Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.142828 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/deed15fa-28ce-4057-a1d7-5f920fa4751b-catalog-content\") pod \"redhat-marketplace-sjq82\" (UID: \"deed15fa-28ce-4057-a1d7-5f920fa4751b\") " pod="openshift-marketplace/redhat-marketplace-sjq82" Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.142858 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deed15fa-28ce-4057-a1d7-5f920fa4751b-utilities\") pod \"redhat-marketplace-sjq82\" (UID: \"deed15fa-28ce-4057-a1d7-5f920fa4751b\") " pod="openshift-marketplace/redhat-marketplace-sjq82" Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.244299 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5gb9\" (UniqueName: \"kubernetes.io/projected/deed15fa-28ce-4057-a1d7-5f920fa4751b-kube-api-access-k5gb9\") pod \"redhat-marketplace-sjq82\" (UID: \"deed15fa-28ce-4057-a1d7-5f920fa4751b\") " pod="openshift-marketplace/redhat-marketplace-sjq82" Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.244357 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deed15fa-28ce-4057-a1d7-5f920fa4751b-catalog-content\") pod \"redhat-marketplace-sjq82\" (UID: \"deed15fa-28ce-4057-a1d7-5f920fa4751b\") " pod="openshift-marketplace/redhat-marketplace-sjq82" Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.244377 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deed15fa-28ce-4057-a1d7-5f920fa4751b-utilities\") pod \"redhat-marketplace-sjq82\" (UID: \"deed15fa-28ce-4057-a1d7-5f920fa4751b\") " pod="openshift-marketplace/redhat-marketplace-sjq82" Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.244939 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/deed15fa-28ce-4057-a1d7-5f920fa4751b-utilities\") pod \"redhat-marketplace-sjq82\" (UID: \"deed15fa-28ce-4057-a1d7-5f920fa4751b\") " pod="openshift-marketplace/redhat-marketplace-sjq82" Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.245430 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deed15fa-28ce-4057-a1d7-5f920fa4751b-catalog-content\") pod \"redhat-marketplace-sjq82\" (UID: \"deed15fa-28ce-4057-a1d7-5f920fa4751b\") " pod="openshift-marketplace/redhat-marketplace-sjq82" Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.263020 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5gb9\" (UniqueName: \"kubernetes.io/projected/deed15fa-28ce-4057-a1d7-5f920fa4751b-kube-api-access-k5gb9\") pod \"redhat-marketplace-sjq82\" (UID: \"deed15fa-28ce-4057-a1d7-5f920fa4751b\") " pod="openshift-marketplace/redhat-marketplace-sjq82" Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.387770 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sjq82" Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.521388 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-67xr2"] Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.522563 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67xr2" Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.524702 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-67xr2"] Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.548784 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04f484a1-1bcd-4168-bad7-3901c03a49e0-catalog-content\") pod \"redhat-marketplace-67xr2\" (UID: \"04f484a1-1bcd-4168-bad7-3901c03a49e0\") " pod="openshift-marketplace/redhat-marketplace-67xr2" Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.548878 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvzgz\" (UniqueName: \"kubernetes.io/projected/04f484a1-1bcd-4168-bad7-3901c03a49e0-kube-api-access-kvzgz\") pod \"redhat-marketplace-67xr2\" (UID: \"04f484a1-1bcd-4168-bad7-3901c03a49e0\") " pod="openshift-marketplace/redhat-marketplace-67xr2" Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.549112 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04f484a1-1bcd-4168-bad7-3901c03a49e0-utilities\") pod \"redhat-marketplace-67xr2\" (UID: \"04f484a1-1bcd-4168-bad7-3901c03a49e0\") " pod="openshift-marketplace/redhat-marketplace-67xr2" Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.651096 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04f484a1-1bcd-4168-bad7-3901c03a49e0-catalog-content\") pod \"redhat-marketplace-67xr2\" (UID: \"04f484a1-1bcd-4168-bad7-3901c03a49e0\") " pod="openshift-marketplace/redhat-marketplace-67xr2" Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.651168 4739 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kvzgz\" (UniqueName: \"kubernetes.io/projected/04f484a1-1bcd-4168-bad7-3901c03a49e0-kube-api-access-kvzgz\") pod \"redhat-marketplace-67xr2\" (UID: \"04f484a1-1bcd-4168-bad7-3901c03a49e0\") " pod="openshift-marketplace/redhat-marketplace-67xr2" Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.651263 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04f484a1-1bcd-4168-bad7-3901c03a49e0-utilities\") pod \"redhat-marketplace-67xr2\" (UID: \"04f484a1-1bcd-4168-bad7-3901c03a49e0\") " pod="openshift-marketplace/redhat-marketplace-67xr2" Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.651393 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sjq82"] Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.651809 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04f484a1-1bcd-4168-bad7-3901c03a49e0-utilities\") pod \"redhat-marketplace-67xr2\" (UID: \"04f484a1-1bcd-4168-bad7-3901c03a49e0\") " pod="openshift-marketplace/redhat-marketplace-67xr2" Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.651853 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04f484a1-1bcd-4168-bad7-3901c03a49e0-catalog-content\") pod \"redhat-marketplace-67xr2\" (UID: \"04f484a1-1bcd-4168-bad7-3901c03a49e0\") " pod="openshift-marketplace/redhat-marketplace-67xr2" Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.671207 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvzgz\" (UniqueName: \"kubernetes.io/projected/04f484a1-1bcd-4168-bad7-3901c03a49e0-kube-api-access-kvzgz\") pod \"redhat-marketplace-67xr2\" (UID: \"04f484a1-1bcd-4168-bad7-3901c03a49e0\") " 
pod="openshift-marketplace/redhat-marketplace-67xr2" Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.678649 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sjq82" event={"ID":"deed15fa-28ce-4057-a1d7-5f920fa4751b","Type":"ContainerStarted","Data":"226ea1519865dc1be152e53037f42584ce1d3a2a236a8d147c4abcaa957d091c"} Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.679848 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" event={"ID":"595d6e92-80ce-40bf-8409-b50226a672ab","Type":"ContainerStarted","Data":"22641bbc692ff46e49c8eeb8d9ce8dadb60c200019458f63dbc0f675bd5b7f49"} Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.679867 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" event={"ID":"595d6e92-80ce-40bf-8409-b50226a672ab","Type":"ContainerStarted","Data":"475bf8bc6db1943d4d20b389ca2b348d343a490ccedc9b81d9bc29860e3a2bdf"} Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.680591 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.681932 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6a935283-96a7-4954-85be-1b1aecb2d14c","Type":"ContainerStarted","Data":"aa4dd33ae6821e5f1da5e3f8f86c2f2545e21638e5711cf436171d25c2c2892b"} Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.682019 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6a935283-96a7-4954-85be-1b1aecb2d14c","Type":"ContainerStarted","Data":"7703ff9670be35c572d0b65c8bd96c80119a023ebdf7f32c2f2ed8404dfd4dbd"} Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.701891 4739 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" podStartSLOduration=134.701872374 podStartE2EDuration="2m14.701872374s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:51:01.701762562 +0000 UTC m=+161.527148322" watchObservedRunningTime="2025-10-08 21:51:01.701872374 +0000 UTC m=+161.527258124" Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.825709 4739 patch_prober.go:28] interesting pod/router-default-5444994796-zrm4p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 21:51:01 crc kubenswrapper[4739]: [-]has-synced failed: reason withheld Oct 08 21:51:01 crc kubenswrapper[4739]: [+]process-running ok Oct 08 21:51:01 crc kubenswrapper[4739]: healthz check failed Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.826018 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zrm4p" podUID="44d375d6-9c45-466a-a423-b87b33bc63dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.827488 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.837821 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67xr2" Oct 08 21:51:01 crc kubenswrapper[4739]: I1008 21:51:01.851336 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=1.851316744 podStartE2EDuration="1.851316744s" podCreationTimestamp="2025-10-08 21:51:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:51:01.716594002 +0000 UTC m=+161.541979752" watchObservedRunningTime="2025-10-08 21:51:01.851316744 +0000 UTC m=+161.676702494" Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.009114 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-67xr2"] Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.075922 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xcwbd"] Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.077069 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xcwbd" Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.079754 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.084967 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xcwbd"] Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.156579 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13bcd021-6f45-4843-9255-92ee1ad4e031-catalog-content\") pod \"redhat-operators-xcwbd\" (UID: \"13bcd021-6f45-4843-9255-92ee1ad4e031\") " pod="openshift-marketplace/redhat-operators-xcwbd" Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.156702 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13bcd021-6f45-4843-9255-92ee1ad4e031-utilities\") pod \"redhat-operators-xcwbd\" (UID: \"13bcd021-6f45-4843-9255-92ee1ad4e031\") " pod="openshift-marketplace/redhat-operators-xcwbd" Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.156723 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwhsj\" (UniqueName: \"kubernetes.io/projected/13bcd021-6f45-4843-9255-92ee1ad4e031-kube-api-access-bwhsj\") pod \"redhat-operators-xcwbd\" (UID: \"13bcd021-6f45-4843-9255-92ee1ad4e031\") " pod="openshift-marketplace/redhat-operators-xcwbd" Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.262907 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13bcd021-6f45-4843-9255-92ee1ad4e031-catalog-content\") pod \"redhat-operators-xcwbd\" (UID: 
\"13bcd021-6f45-4843-9255-92ee1ad4e031\") " pod="openshift-marketplace/redhat-operators-xcwbd" Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.263084 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13bcd021-6f45-4843-9255-92ee1ad4e031-utilities\") pod \"redhat-operators-xcwbd\" (UID: \"13bcd021-6f45-4843-9255-92ee1ad4e031\") " pod="openshift-marketplace/redhat-operators-xcwbd" Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.263119 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwhsj\" (UniqueName: \"kubernetes.io/projected/13bcd021-6f45-4843-9255-92ee1ad4e031-kube-api-access-bwhsj\") pod \"redhat-operators-xcwbd\" (UID: \"13bcd021-6f45-4843-9255-92ee1ad4e031\") " pod="openshift-marketplace/redhat-operators-xcwbd" Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.264507 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13bcd021-6f45-4843-9255-92ee1ad4e031-catalog-content\") pod \"redhat-operators-xcwbd\" (UID: \"13bcd021-6f45-4843-9255-92ee1ad4e031\") " pod="openshift-marketplace/redhat-operators-xcwbd" Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.264814 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13bcd021-6f45-4843-9255-92ee1ad4e031-utilities\") pod \"redhat-operators-xcwbd\" (UID: \"13bcd021-6f45-4843-9255-92ee1ad4e031\") " pod="openshift-marketplace/redhat-operators-xcwbd" Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.286907 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwhsj\" (UniqueName: \"kubernetes.io/projected/13bcd021-6f45-4843-9255-92ee1ad4e031-kube-api-access-bwhsj\") pod \"redhat-operators-xcwbd\" (UID: \"13bcd021-6f45-4843-9255-92ee1ad4e031\") " 
pod="openshift-marketplace/redhat-operators-xcwbd" Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.345457 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.345645 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.353529 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.407596 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xcwbd" Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.479804 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pqt7t"] Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.482251 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kv4v6" Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.482436 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pqt7t" Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.486521 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pqt7t"] Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.569437 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5d5effe-9f0b-4823-9365-e07862452e39-catalog-content\") pod \"redhat-operators-pqt7t\" (UID: \"d5d5effe-9f0b-4823-9365-e07862452e39\") " pod="openshift-marketplace/redhat-operators-pqt7t" Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.569521 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr47m\" (UniqueName: \"kubernetes.io/projected/d5d5effe-9f0b-4823-9365-e07862452e39-kube-api-access-jr47m\") pod \"redhat-operators-pqt7t\" (UID: \"d5d5effe-9f0b-4823-9365-e07862452e39\") " pod="openshift-marketplace/redhat-operators-pqt7t" Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.569793 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5d5effe-9f0b-4823-9365-e07862452e39-utilities\") pod \"redhat-operators-pqt7t\" (UID: \"d5d5effe-9f0b-4823-9365-e07862452e39\") " pod="openshift-marketplace/redhat-operators-pqt7t" Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.643355 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xcwbd"] Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.659304 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.668676 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hsz75" Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.671607 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5d5effe-9f0b-4823-9365-e07862452e39-utilities\") pod \"redhat-operators-pqt7t\" (UID: \"d5d5effe-9f0b-4823-9365-e07862452e39\") " pod="openshift-marketplace/redhat-operators-pqt7t" Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.671724 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5d5effe-9f0b-4823-9365-e07862452e39-catalog-content\") pod \"redhat-operators-pqt7t\" (UID: \"d5d5effe-9f0b-4823-9365-e07862452e39\") " pod="openshift-marketplace/redhat-operators-pqt7t" Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.671789 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr47m\" (UniqueName: \"kubernetes.io/projected/d5d5effe-9f0b-4823-9365-e07862452e39-kube-api-access-jr47m\") pod \"redhat-operators-pqt7t\" (UID: \"d5d5effe-9f0b-4823-9365-e07862452e39\") " pod="openshift-marketplace/redhat-operators-pqt7t" Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.672687 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5d5effe-9f0b-4823-9365-e07862452e39-utilities\") pod \"redhat-operators-pqt7t\" (UID: \"d5d5effe-9f0b-4823-9365-e07862452e39\") " pod="openshift-marketplace/redhat-operators-pqt7t" Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.672805 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5d5effe-9f0b-4823-9365-e07862452e39-catalog-content\") pod \"redhat-operators-pqt7t\" (UID: \"d5d5effe-9f0b-4823-9365-e07862452e39\") " pod="openshift-marketplace/redhat-operators-pqt7t" Oct 08 
21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.692281 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr47m\" (UniqueName: \"kubernetes.io/projected/d5d5effe-9f0b-4823-9365-e07862452e39-kube-api-access-jr47m\") pod \"redhat-operators-pqt7t\" (UID: \"d5d5effe-9f0b-4823-9365-e07862452e39\") " pod="openshift-marketplace/redhat-operators-pqt7t" Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.698660 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qpcfx" Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.699289 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67xr2" event={"ID":"04f484a1-1bcd-4168-bad7-3901c03a49e0","Type":"ContainerStarted","Data":"bccb0ff354a8b3234fa417e9a1e267d410a4c1b3a58400880278c7f7ef4564e2"} Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.704913 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xcwbd" event={"ID":"13bcd021-6f45-4843-9255-92ee1ad4e031","Type":"ContainerStarted","Data":"eec4a15ad77f8ec425bac82a39a64217fc2b248b3c773ec02d0431d1ed0d8e76"} Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.726584 4739 generic.go:334] "Generic (PLEG): container finished" podID="6a935283-96a7-4954-85be-1b1aecb2d14c" containerID="aa4dd33ae6821e5f1da5e3f8f86c2f2545e21638e5711cf436171d25c2c2892b" exitCode=0 Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.726781 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6a935283-96a7-4954-85be-1b1aecb2d14c","Type":"ContainerDied","Data":"aa4dd33ae6821e5f1da5e3f8f86c2f2545e21638e5711cf436171d25c2c2892b"} Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.730981 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-dv7wz" 
Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.806458 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pqt7t" Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.822680 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-zrm4p" Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.828821 4739 patch_prober.go:28] interesting pod/router-default-5444994796-zrm4p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 21:51:02 crc kubenswrapper[4739]: [-]has-synced failed: reason withheld Oct 08 21:51:02 crc kubenswrapper[4739]: [+]process-running ok Oct 08 21:51:02 crc kubenswrapper[4739]: healthz check failed Oct 08 21:51:02 crc kubenswrapper[4739]: I1008 21:51:02.828867 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zrm4p" podUID="44d375d6-9c45-466a-a423-b87b33bc63dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 21:51:03 crc kubenswrapper[4739]: I1008 21:51:03.078487 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pqt7t"] Oct 08 21:51:03 crc kubenswrapper[4739]: I1008 21:51:03.081843 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:51:03 crc kubenswrapper[4739]: E1008 21:51:03.243879 4739 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13bcd021_6f45_4843_9255_92ee1ad4e031.slice/crio-conmon-bb781cf719e79b4dda0d742b91c634111e7fd1cbd74e9b8a07281b637c18b937.scope\": RecentStats: unable to find data in memory cache]" Oct 08 
21:51:03 crc kubenswrapper[4739]: I1008 21:51:03.446663 4739 patch_prober.go:28] interesting pod/downloads-7954f5f757-qdvnh container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Oct 08 21:51:03 crc kubenswrapper[4739]: I1008 21:51:03.447126 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qdvnh" podUID="f6fdf42e-8623-4102-8d38-c22c6c3d6978" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Oct 08 21:51:03 crc kubenswrapper[4739]: I1008 21:51:03.446950 4739 patch_prober.go:28] interesting pod/downloads-7954f5f757-qdvnh container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Oct 08 21:51:03 crc kubenswrapper[4739]: I1008 21:51:03.447234 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qdvnh" podUID="f6fdf42e-8623-4102-8d38-c22c6c3d6978" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Oct 08 21:51:03 crc kubenswrapper[4739]: I1008 21:51:03.507616 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-n8lrt" Oct 08 21:51:03 crc kubenswrapper[4739]: I1008 21:51:03.509748 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-n8lrt" Oct 08 21:51:03 crc kubenswrapper[4739]: I1008 21:51:03.514298 4739 patch_prober.go:28] interesting pod/console-f9d7485db-n8lrt container/console namespace/openshift-console: Startup probe status=failure output="Get 
\"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Oct 08 21:51:03 crc kubenswrapper[4739]: I1008 21:51:03.514376 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-n8lrt" podUID="e3d665b0-ab57-47b7-9a58-9c6c150d6105" containerName="console" probeResult="failure" output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" Oct 08 21:51:03 crc kubenswrapper[4739]: I1008 21:51:03.744887 4739 generic.go:334] "Generic (PLEG): container finished" podID="04f484a1-1bcd-4168-bad7-3901c03a49e0" containerID="73c1a39610c4e1e203b2cb69f9282fdc4af1c67867d639630705b636d0786232" exitCode=0 Oct 08 21:51:03 crc kubenswrapper[4739]: I1008 21:51:03.744989 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67xr2" event={"ID":"04f484a1-1bcd-4168-bad7-3901c03a49e0","Type":"ContainerDied","Data":"73c1a39610c4e1e203b2cb69f9282fdc4af1c67867d639630705b636d0786232"} Oct 08 21:51:03 crc kubenswrapper[4739]: I1008 21:51:03.753118 4739 generic.go:334] "Generic (PLEG): container finished" podID="deed15fa-28ce-4057-a1d7-5f920fa4751b" containerID="4471660f8cca02b31a670d596bd2e3275eb5dbdac0e806d97548ec7fd6755e55" exitCode=0 Oct 08 21:51:03 crc kubenswrapper[4739]: I1008 21:51:03.753529 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sjq82" event={"ID":"deed15fa-28ce-4057-a1d7-5f920fa4751b","Type":"ContainerDied","Data":"4471660f8cca02b31a670d596bd2e3275eb5dbdac0e806d97548ec7fd6755e55"} Oct 08 21:51:03 crc kubenswrapper[4739]: I1008 21:51:03.756685 4739 generic.go:334] "Generic (PLEG): container finished" podID="13bcd021-6f45-4843-9255-92ee1ad4e031" containerID="bb781cf719e79b4dda0d742b91c634111e7fd1cbd74e9b8a07281b637c18b937" exitCode=0 Oct 08 21:51:03 crc kubenswrapper[4739]: I1008 21:51:03.757063 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-xcwbd" event={"ID":"13bcd021-6f45-4843-9255-92ee1ad4e031","Type":"ContainerDied","Data":"bb781cf719e79b4dda0d742b91c634111e7fd1cbd74e9b8a07281b637c18b937"} Oct 08 21:51:03 crc kubenswrapper[4739]: I1008 21:51:03.765984 4739 generic.go:334] "Generic (PLEG): container finished" podID="d5d5effe-9f0b-4823-9365-e07862452e39" containerID="c0494c224f5afc809be5ec9f40bea9117b69195ea882ab16c451a09059e1f613" exitCode=0 Oct 08 21:51:03 crc kubenswrapper[4739]: I1008 21:51:03.766249 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqt7t" event={"ID":"d5d5effe-9f0b-4823-9365-e07862452e39","Type":"ContainerDied","Data":"c0494c224f5afc809be5ec9f40bea9117b69195ea882ab16c451a09059e1f613"} Oct 08 21:51:03 crc kubenswrapper[4739]: I1008 21:51:03.766280 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqt7t" event={"ID":"d5d5effe-9f0b-4823-9365-e07862452e39","Type":"ContainerStarted","Data":"60e04a440919c51952f726aea48cbbc06ed52356c6ce4efc83e441c97bf6e6cc"} Oct 08 21:51:03 crc kubenswrapper[4739]: I1008 21:51:03.786644 4739 generic.go:334] "Generic (PLEG): container finished" podID="dbfbbfca-3095-4a7b-869e-70b1a86046c4" containerID="6f67e28d93ba7dcaa2e33ca721836c5d3d40506f1f2686e98b7538e9908c1374" exitCode=0 Oct 08 21:51:03 crc kubenswrapper[4739]: I1008 21:51:03.786685 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-fv8l4" event={"ID":"dbfbbfca-3095-4a7b-869e-70b1a86046c4","Type":"ContainerDied","Data":"6f67e28d93ba7dcaa2e33ca721836c5d3d40506f1f2686e98b7538e9908c1374"} Oct 08 21:51:03 crc kubenswrapper[4739]: I1008 21:51:03.795276 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 08 21:51:03 crc kubenswrapper[4739]: I1008 21:51:03.796173 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 21:51:03 crc kubenswrapper[4739]: I1008 21:51:03.801714 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 08 21:51:03 crc kubenswrapper[4739]: I1008 21:51:03.802104 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 08 21:51:03 crc kubenswrapper[4739]: I1008 21:51:03.812939 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 08 21:51:03 crc kubenswrapper[4739]: I1008 21:51:03.840158 4739 patch_prober.go:28] interesting pod/router-default-5444994796-zrm4p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 21:51:03 crc kubenswrapper[4739]: [-]has-synced failed: reason withheld Oct 08 21:51:03 crc kubenswrapper[4739]: [+]process-running ok Oct 08 21:51:03 crc kubenswrapper[4739]: healthz check failed Oct 08 21:51:03 crc kubenswrapper[4739]: I1008 21:51:03.840206 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zrm4p" podUID="44d375d6-9c45-466a-a423-b87b33bc63dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 21:51:03 crc kubenswrapper[4739]: I1008 21:51:03.907675 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5f6bb85b-452f-4504-8859-efd8cd222178-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5f6bb85b-452f-4504-8859-efd8cd222178\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 21:51:03 crc kubenswrapper[4739]: I1008 21:51:03.907986 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5f6bb85b-452f-4504-8859-efd8cd222178-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5f6bb85b-452f-4504-8859-efd8cd222178\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 21:51:04 crc kubenswrapper[4739]: I1008 21:51:04.008988 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5f6bb85b-452f-4504-8859-efd8cd222178-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5f6bb85b-452f-4504-8859-efd8cd222178\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 21:51:04 crc kubenswrapper[4739]: I1008 21:51:04.009085 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5f6bb85b-452f-4504-8859-efd8cd222178-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5f6bb85b-452f-4504-8859-efd8cd222178\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 21:51:04 crc kubenswrapper[4739]: I1008 21:51:04.009431 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5f6bb85b-452f-4504-8859-efd8cd222178-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5f6bb85b-452f-4504-8859-efd8cd222178\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 21:51:04 crc kubenswrapper[4739]: I1008 21:51:04.059368 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5f6bb85b-452f-4504-8859-efd8cd222178-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5f6bb85b-452f-4504-8859-efd8cd222178\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 21:51:04 crc kubenswrapper[4739]: I1008 21:51:04.092785 4739 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 21:51:04 crc kubenswrapper[4739]: I1008 21:51:04.109980 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a935283-96a7-4954-85be-1b1aecb2d14c-kube-api-access\") pod \"6a935283-96a7-4954-85be-1b1aecb2d14c\" (UID: \"6a935283-96a7-4954-85be-1b1aecb2d14c\") " Oct 08 21:51:04 crc kubenswrapper[4739]: I1008 21:51:04.110105 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a935283-96a7-4954-85be-1b1aecb2d14c-kubelet-dir\") pod \"6a935283-96a7-4954-85be-1b1aecb2d14c\" (UID: \"6a935283-96a7-4954-85be-1b1aecb2d14c\") " Oct 08 21:51:04 crc kubenswrapper[4739]: I1008 21:51:04.110219 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a935283-96a7-4954-85be-1b1aecb2d14c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6a935283-96a7-4954-85be-1b1aecb2d14c" (UID: "6a935283-96a7-4954-85be-1b1aecb2d14c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:51:04 crc kubenswrapper[4739]: I1008 21:51:04.110535 4739 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a935283-96a7-4954-85be-1b1aecb2d14c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 08 21:51:04 crc kubenswrapper[4739]: I1008 21:51:04.123437 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 21:51:04 crc kubenswrapper[4739]: I1008 21:51:04.126974 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a935283-96a7-4954-85be-1b1aecb2d14c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6a935283-96a7-4954-85be-1b1aecb2d14c" (UID: "6a935283-96a7-4954-85be-1b1aecb2d14c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:51:04 crc kubenswrapper[4739]: I1008 21:51:04.211183 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a935283-96a7-4954-85be-1b1aecb2d14c-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 08 21:51:04 crc kubenswrapper[4739]: I1008 21:51:04.737442 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-pc6pd" Oct 08 21:51:04 crc kubenswrapper[4739]: I1008 21:51:04.805102 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 21:51:04 crc kubenswrapper[4739]: I1008 21:51:04.805460 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6a935283-96a7-4954-85be-1b1aecb2d14c","Type":"ContainerDied","Data":"7703ff9670be35c572d0b65c8bd96c80119a023ebdf7f32c2f2ed8404dfd4dbd"} Oct 08 21:51:04 crc kubenswrapper[4739]: I1008 21:51:04.805601 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7703ff9670be35c572d0b65c8bd96c80119a023ebdf7f32c2f2ed8404dfd4dbd" Oct 08 21:51:04 crc kubenswrapper[4739]: I1008 21:51:04.836981 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 08 21:51:04 crc kubenswrapper[4739]: I1008 21:51:04.842217 4739 patch_prober.go:28] interesting pod/router-default-5444994796-zrm4p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 21:51:04 crc kubenswrapper[4739]: [-]has-synced failed: reason withheld Oct 08 21:51:04 crc kubenswrapper[4739]: [+]process-running ok Oct 08 21:51:04 crc kubenswrapper[4739]: healthz check failed Oct 08 21:51:04 crc kubenswrapper[4739]: I1008 21:51:04.842287 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zrm4p" podUID="44d375d6-9c45-466a-a423-b87b33bc63dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 21:51:05 crc kubenswrapper[4739]: I1008 21:51:05.134114 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-fv8l4" Oct 08 21:51:05 crc kubenswrapper[4739]: I1008 21:51:05.241261 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnnbf\" (UniqueName: \"kubernetes.io/projected/dbfbbfca-3095-4a7b-869e-70b1a86046c4-kube-api-access-pnnbf\") pod \"dbfbbfca-3095-4a7b-869e-70b1a86046c4\" (UID: \"dbfbbfca-3095-4a7b-869e-70b1a86046c4\") " Oct 08 21:51:05 crc kubenswrapper[4739]: I1008 21:51:05.241648 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dbfbbfca-3095-4a7b-869e-70b1a86046c4-config-volume\") pod \"dbfbbfca-3095-4a7b-869e-70b1a86046c4\" (UID: \"dbfbbfca-3095-4a7b-869e-70b1a86046c4\") " Oct 08 21:51:05 crc kubenswrapper[4739]: I1008 21:51:05.241781 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dbfbbfca-3095-4a7b-869e-70b1a86046c4-secret-volume\") pod \"dbfbbfca-3095-4a7b-869e-70b1a86046c4\" (UID: \"dbfbbfca-3095-4a7b-869e-70b1a86046c4\") " Oct 08 21:51:05 crc kubenswrapper[4739]: I1008 21:51:05.242761 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbfbbfca-3095-4a7b-869e-70b1a86046c4-config-volume" (OuterVolumeSpecName: "config-volume") pod "dbfbbfca-3095-4a7b-869e-70b1a86046c4" (UID: "dbfbbfca-3095-4a7b-869e-70b1a86046c4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:51:05 crc kubenswrapper[4739]: I1008 21:51:05.251442 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbfbbfca-3095-4a7b-869e-70b1a86046c4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "dbfbbfca-3095-4a7b-869e-70b1a86046c4" (UID: "dbfbbfca-3095-4a7b-869e-70b1a86046c4"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:51:05 crc kubenswrapper[4739]: I1008 21:51:05.251637 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbfbbfca-3095-4a7b-869e-70b1a86046c4-kube-api-access-pnnbf" (OuterVolumeSpecName: "kube-api-access-pnnbf") pod "dbfbbfca-3095-4a7b-869e-70b1a86046c4" (UID: "dbfbbfca-3095-4a7b-869e-70b1a86046c4"). InnerVolumeSpecName "kube-api-access-pnnbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:51:05 crc kubenswrapper[4739]: I1008 21:51:05.343438 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnnbf\" (UniqueName: \"kubernetes.io/projected/dbfbbfca-3095-4a7b-869e-70b1a86046c4-kube-api-access-pnnbf\") on node \"crc\" DevicePath \"\"" Oct 08 21:51:05 crc kubenswrapper[4739]: I1008 21:51:05.343475 4739 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dbfbbfca-3095-4a7b-869e-70b1a86046c4-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 21:51:05 crc kubenswrapper[4739]: I1008 21:51:05.343488 4739 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dbfbbfca-3095-4a7b-869e-70b1a86046c4-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 21:51:05 crc kubenswrapper[4739]: I1008 21:51:05.836194 4739 patch_prober.go:28] interesting pod/router-default-5444994796-zrm4p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 21:51:05 crc kubenswrapper[4739]: [-]has-synced failed: reason withheld Oct 08 21:51:05 crc kubenswrapper[4739]: [+]process-running ok Oct 08 21:51:05 crc kubenswrapper[4739]: healthz check failed Oct 08 21:51:05 crc kubenswrapper[4739]: I1008 21:51:05.836248 4739 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-zrm4p" podUID="44d375d6-9c45-466a-a423-b87b33bc63dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 21:51:05 crc kubenswrapper[4739]: I1008 21:51:05.849088 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5f6bb85b-452f-4504-8859-efd8cd222178","Type":"ContainerStarted","Data":"cc18fa815947337304c2f18a8a1d325195da714452c1ea8ef237c3588ac706be"} Oct 08 21:51:05 crc kubenswrapper[4739]: I1008 21:51:05.849136 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5f6bb85b-452f-4504-8859-efd8cd222178","Type":"ContainerStarted","Data":"7641ac84445cff0f57bea676fb8a483edb80d6e9ee626c848a2b691f77fc2945"} Oct 08 21:51:05 crc kubenswrapper[4739]: I1008 21:51:05.859136 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-fv8l4" event={"ID":"dbfbbfca-3095-4a7b-869e-70b1a86046c4","Type":"ContainerDied","Data":"a429c2275757cf6d07836ae24590fa69492c483420634fdfc6515b2cc7ef64d8"} Oct 08 21:51:05 crc kubenswrapper[4739]: I1008 21:51:05.859211 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332665-fv8l4" Oct 08 21:51:05 crc kubenswrapper[4739]: I1008 21:51:05.859216 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a429c2275757cf6d07836ae24590fa69492c483420634fdfc6515b2cc7ef64d8" Oct 08 21:51:05 crc kubenswrapper[4739]: I1008 21:51:05.874419 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.8743940759999997 podStartE2EDuration="2.874394076s" podCreationTimestamp="2025-10-08 21:51:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:51:05.868773096 +0000 UTC m=+165.694158846" watchObservedRunningTime="2025-10-08 21:51:05.874394076 +0000 UTC m=+165.699779826" Oct 08 21:51:06 crc kubenswrapper[4739]: I1008 21:51:06.824863 4739 patch_prober.go:28] interesting pod/router-default-5444994796-zrm4p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 21:51:06 crc kubenswrapper[4739]: [-]has-synced failed: reason withheld Oct 08 21:51:06 crc kubenswrapper[4739]: [+]process-running ok Oct 08 21:51:06 crc kubenswrapper[4739]: healthz check failed Oct 08 21:51:06 crc kubenswrapper[4739]: I1008 21:51:06.825157 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zrm4p" podUID="44d375d6-9c45-466a-a423-b87b33bc63dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 21:51:06 crc kubenswrapper[4739]: I1008 21:51:06.870445 4739 generic.go:334] "Generic (PLEG): container finished" podID="5f6bb85b-452f-4504-8859-efd8cd222178" containerID="cc18fa815947337304c2f18a8a1d325195da714452c1ea8ef237c3588ac706be" exitCode=0 Oct 
08 21:51:06 crc kubenswrapper[4739]: I1008 21:51:06.870500 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5f6bb85b-452f-4504-8859-efd8cd222178","Type":"ContainerDied","Data":"cc18fa815947337304c2f18a8a1d325195da714452c1ea8ef237c3588ac706be"} Oct 08 21:51:07 crc kubenswrapper[4739]: I1008 21:51:07.824732 4739 patch_prober.go:28] interesting pod/router-default-5444994796-zrm4p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 21:51:07 crc kubenswrapper[4739]: [-]has-synced failed: reason withheld Oct 08 21:51:07 crc kubenswrapper[4739]: [+]process-running ok Oct 08 21:51:07 crc kubenswrapper[4739]: healthz check failed Oct 08 21:51:07 crc kubenswrapper[4739]: I1008 21:51:07.824787 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zrm4p" podUID="44d375d6-9c45-466a-a423-b87b33bc63dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 21:51:08 crc kubenswrapper[4739]: I1008 21:51:08.117859 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 21:51:08 crc kubenswrapper[4739]: I1008 21:51:08.201953 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5f6bb85b-452f-4504-8859-efd8cd222178-kubelet-dir\") pod \"5f6bb85b-452f-4504-8859-efd8cd222178\" (UID: \"5f6bb85b-452f-4504-8859-efd8cd222178\") " Oct 08 21:51:08 crc kubenswrapper[4739]: I1008 21:51:08.202066 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5f6bb85b-452f-4504-8859-efd8cd222178-kube-api-access\") pod \"5f6bb85b-452f-4504-8859-efd8cd222178\" (UID: \"5f6bb85b-452f-4504-8859-efd8cd222178\") " Oct 08 21:51:08 crc kubenswrapper[4739]: I1008 21:51:08.202074 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f6bb85b-452f-4504-8859-efd8cd222178-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5f6bb85b-452f-4504-8859-efd8cd222178" (UID: "5f6bb85b-452f-4504-8859-efd8cd222178"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:51:08 crc kubenswrapper[4739]: I1008 21:51:08.202575 4739 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5f6bb85b-452f-4504-8859-efd8cd222178-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 08 21:51:08 crc kubenswrapper[4739]: I1008 21:51:08.208979 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f6bb85b-452f-4504-8859-efd8cd222178-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5f6bb85b-452f-4504-8859-efd8cd222178" (UID: "5f6bb85b-452f-4504-8859-efd8cd222178"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:51:08 crc kubenswrapper[4739]: I1008 21:51:08.303939 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5f6bb85b-452f-4504-8859-efd8cd222178-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 08 21:51:08 crc kubenswrapper[4739]: I1008 21:51:08.824657 4739 patch_prober.go:28] interesting pod/router-default-5444994796-zrm4p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 21:51:08 crc kubenswrapper[4739]: [-]has-synced failed: reason withheld Oct 08 21:51:08 crc kubenswrapper[4739]: [+]process-running ok Oct 08 21:51:08 crc kubenswrapper[4739]: healthz check failed Oct 08 21:51:08 crc kubenswrapper[4739]: I1008 21:51:08.824705 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zrm4p" podUID="44d375d6-9c45-466a-a423-b87b33bc63dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 21:51:08 crc kubenswrapper[4739]: I1008 21:51:08.885587 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5f6bb85b-452f-4504-8859-efd8cd222178","Type":"ContainerDied","Data":"7641ac84445cff0f57bea676fb8a483edb80d6e9ee626c848a2b691f77fc2945"} Oct 08 21:51:08 crc kubenswrapper[4739]: I1008 21:51:08.885626 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7641ac84445cff0f57bea676fb8a483edb80d6e9ee626c848a2b691f77fc2945" Oct 08 21:51:08 crc kubenswrapper[4739]: I1008 21:51:08.885627 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 21:51:09 crc kubenswrapper[4739]: I1008 21:51:09.732553 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8629e121-2c64-4b46-adbd-ec1433ec0835-metrics-certs\") pod \"network-metrics-daemon-kdt6j\" (UID: \"8629e121-2c64-4b46-adbd-ec1433ec0835\") " pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:51:09 crc kubenswrapper[4739]: I1008 21:51:09.750958 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8629e121-2c64-4b46-adbd-ec1433ec0835-metrics-certs\") pod \"network-metrics-daemon-kdt6j\" (UID: \"8629e121-2c64-4b46-adbd-ec1433ec0835\") " pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:51:09 crc kubenswrapper[4739]: I1008 21:51:09.826591 4739 patch_prober.go:28] interesting pod/router-default-5444994796-zrm4p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 21:51:09 crc kubenswrapper[4739]: [-]has-synced failed: reason withheld Oct 08 21:51:09 crc kubenswrapper[4739]: [+]process-running ok Oct 08 21:51:09 crc kubenswrapper[4739]: healthz check failed Oct 08 21:51:09 crc kubenswrapper[4739]: I1008 21:51:09.826883 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zrm4p" podUID="44d375d6-9c45-466a-a423-b87b33bc63dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 21:51:09 crc kubenswrapper[4739]: I1008 21:51:09.944598 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdt6j" Oct 08 21:51:10 crc kubenswrapper[4739]: I1008 21:51:10.824832 4739 patch_prober.go:28] interesting pod/router-default-5444994796-zrm4p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 21:51:10 crc kubenswrapper[4739]: [-]has-synced failed: reason withheld Oct 08 21:51:10 crc kubenswrapper[4739]: [+]process-running ok Oct 08 21:51:10 crc kubenswrapper[4739]: healthz check failed Oct 08 21:51:10 crc kubenswrapper[4739]: I1008 21:51:10.824897 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zrm4p" podUID="44d375d6-9c45-466a-a423-b87b33bc63dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 21:51:11 crc kubenswrapper[4739]: I1008 21:51:11.824473 4739 patch_prober.go:28] interesting pod/router-default-5444994796-zrm4p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 21:51:11 crc kubenswrapper[4739]: [-]has-synced failed: reason withheld Oct 08 21:51:11 crc kubenswrapper[4739]: [+]process-running ok Oct 08 21:51:11 crc kubenswrapper[4739]: healthz check failed Oct 08 21:51:11 crc kubenswrapper[4739]: I1008 21:51:11.824530 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zrm4p" podUID="44d375d6-9c45-466a-a423-b87b33bc63dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 21:51:12 crc kubenswrapper[4739]: I1008 21:51:12.824613 4739 patch_prober.go:28] interesting pod/router-default-5444994796-zrm4p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Oct 08 21:51:12 crc kubenswrapper[4739]: [-]has-synced failed: reason withheld Oct 08 21:51:12 crc kubenswrapper[4739]: [+]process-running ok Oct 08 21:51:12 crc kubenswrapper[4739]: healthz check failed Oct 08 21:51:12 crc kubenswrapper[4739]: I1008 21:51:12.824673 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zrm4p" podUID="44d375d6-9c45-466a-a423-b87b33bc63dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 21:51:13 crc kubenswrapper[4739]: I1008 21:51:13.446234 4739 patch_prober.go:28] interesting pod/downloads-7954f5f757-qdvnh container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Oct 08 21:51:13 crc kubenswrapper[4739]: I1008 21:51:13.446577 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qdvnh" podUID="f6fdf42e-8623-4102-8d38-c22c6c3d6978" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Oct 08 21:51:13 crc kubenswrapper[4739]: I1008 21:51:13.446260 4739 patch_prober.go:28] interesting pod/downloads-7954f5f757-qdvnh container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Oct 08 21:51:13 crc kubenswrapper[4739]: I1008 21:51:13.446686 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qdvnh" podUID="f6fdf42e-8623-4102-8d38-c22c6c3d6978" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Oct 08 21:51:13 crc 
kubenswrapper[4739]: I1008 21:51:13.507710 4739 patch_prober.go:28] interesting pod/console-f9d7485db-n8lrt container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Oct 08 21:51:13 crc kubenswrapper[4739]: I1008 21:51:13.507767 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-n8lrt" podUID="e3d665b0-ab57-47b7-9a58-9c6c150d6105" containerName="console" probeResult="failure" output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" Oct 08 21:51:13 crc kubenswrapper[4739]: I1008 21:51:13.829090 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-zrm4p" Oct 08 21:51:13 crc kubenswrapper[4739]: I1008 21:51:13.831449 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-zrm4p" Oct 08 21:51:16 crc kubenswrapper[4739]: I1008 21:51:16.492626 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kdt6j"] Oct 08 21:51:16 crc kubenswrapper[4739]: W1008 21:51:16.514187 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8629e121_2c64_4b46_adbd_ec1433ec0835.slice/crio-cd92da27fbd1cd12f26fb38448cc69309ab706ec4f7c012d8c6873afe9b4a483 WatchSource:0}: Error finding container cd92da27fbd1cd12f26fb38448cc69309ab706ec4f7c012d8c6873afe9b4a483: Status 404 returned error can't find the container with id cd92da27fbd1cd12f26fb38448cc69309ab706ec4f7c012d8c6873afe9b4a483 Oct 08 21:51:16 crc kubenswrapper[4739]: I1008 21:51:16.936719 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kdt6j" 
event={"ID":"8629e121-2c64-4b46-adbd-ec1433ec0835","Type":"ContainerStarted","Data":"cd92da27fbd1cd12f26fb38448cc69309ab706ec4f7c012d8c6873afe9b4a483"} Oct 08 21:51:17 crc kubenswrapper[4739]: I1008 21:51:17.950836 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kdt6j" event={"ID":"8629e121-2c64-4b46-adbd-ec1433ec0835","Type":"ContainerStarted","Data":"aa182fc31daec623dffcf59e769c096edcbf886b404a8c27f5376c0acd48483b"} Oct 08 21:51:18 crc kubenswrapper[4739]: I1008 21:51:18.959053 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kdt6j" event={"ID":"8629e121-2c64-4b46-adbd-ec1433ec0835","Type":"ContainerStarted","Data":"76f5b70efc2fdc2186d88e7dd317ddc6825e7849f6426990a90852cad5c7098c"} Oct 08 21:51:18 crc kubenswrapper[4739]: I1008 21:51:18.981811 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-kdt6j" podStartSLOduration=151.981788703 podStartE2EDuration="2m31.981788703s" podCreationTimestamp="2025-10-08 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:51:18.98082588 +0000 UTC m=+178.806211630" watchObservedRunningTime="2025-10-08 21:51:18.981788703 +0000 UTC m=+178.807174453" Oct 08 21:51:20 crc kubenswrapper[4739]: I1008 21:51:20.615455 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:51:21 crc kubenswrapper[4739]: I1008 21:51:21.765802 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:51:21 crc kubenswrapper[4739]: I1008 21:51:21.766051 4739 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:51:23 crc kubenswrapper[4739]: I1008 21:51:23.451466 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-qdvnh" Oct 08 21:51:23 crc kubenswrapper[4739]: I1008 21:51:23.516971 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-n8lrt" Oct 08 21:51:23 crc kubenswrapper[4739]: I1008 21:51:23.521197 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-n8lrt" Oct 08 21:51:29 crc kubenswrapper[4739]: I1008 21:51:29.093687 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 21:51:32 crc kubenswrapper[4739]: E1008 21:51:32.178491 4739 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 08 21:51:32 crc kubenswrapper[4739]: E1008 21:51:32.179063 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-98g5v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-7wh9v_openshift-marketplace(e59564a7-a703-4b18-a5cc-f231f8a8232b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 08 21:51:32 crc kubenswrapper[4739]: E1008 21:51:32.180293 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-7wh9v" podUID="e59564a7-a703-4b18-a5cc-f231f8a8232b" Oct 08 21:51:33 crc 
kubenswrapper[4739]: I1008 21:51:33.318377 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bqplh" Oct 08 21:51:39 crc kubenswrapper[4739]: E1008 21:51:39.491312 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-7wh9v" podUID="e59564a7-a703-4b18-a5cc-f231f8a8232b" Oct 08 21:51:41 crc kubenswrapper[4739]: E1008 21:51:41.120358 4739 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 08 21:51:41 crc kubenswrapper[4739]: E1008 21:51:41.120595 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bwhsj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-xcwbd_openshift-marketplace(13bcd021-6f45-4843-9255-92ee1ad4e031): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 08 21:51:41 crc kubenswrapper[4739]: E1008 21:51:41.122805 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-xcwbd" podUID="13bcd021-6f45-4843-9255-92ee1ad4e031" Oct 08 21:51:43 crc 
kubenswrapper[4739]: E1008 21:51:43.141521 4739 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 08 21:51:43 crc kubenswrapper[4739]: E1008 21:51:43.142084 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kvzgz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-67xr2_openshift-marketplace(04f484a1-1bcd-4168-bad7-3901c03a49e0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 08 21:51:43 crc kubenswrapper[4739]: E1008 21:51:43.143679 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-67xr2" podUID="04f484a1-1bcd-4168-bad7-3901c03a49e0" Oct 08 21:51:43 crc kubenswrapper[4739]: E1008 21:51:43.891383 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-xcwbd" podUID="13bcd021-6f45-4843-9255-92ee1ad4e031" Oct 08 21:51:44 crc kubenswrapper[4739]: E1008 21:51:44.104763 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-67xr2" podUID="04f484a1-1bcd-4168-bad7-3901c03a49e0" Oct 08 21:51:44 crc kubenswrapper[4739]: E1008 21:51:44.122315 4739 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 08 21:51:44 crc kubenswrapper[4739]: E1008 21:51:44.122569 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k5gb9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-sjq82_openshift-marketplace(deed15fa-28ce-4057-a1d7-5f920fa4751b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 08 21:51:44 crc kubenswrapper[4739]: E1008 21:51:44.123843 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code 
= Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-sjq82" podUID="deed15fa-28ce-4057-a1d7-5f920fa4751b" Oct 08 21:51:44 crc kubenswrapper[4739]: E1008 21:51:44.768123 4739 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 08 21:51:44 crc kubenswrapper[4739]: E1008 21:51:44.768335 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bndgf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSourc
e{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-lnwlb_openshift-marketplace(5ac894a1-e4ba-4a6b-8ac2-3a24ef886834): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 08 21:51:44 crc kubenswrapper[4739]: E1008 21:51:44.769653 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-lnwlb" podUID="5ac894a1-e4ba-4a6b-8ac2-3a24ef886834" Oct 08 21:51:45 crc kubenswrapper[4739]: E1008 21:51:45.109829 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sjq82" podUID="deed15fa-28ce-4057-a1d7-5f920fa4751b" Oct 08 21:51:45 crc kubenswrapper[4739]: E1008 21:51:45.110632 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-lnwlb" podUID="5ac894a1-e4ba-4a6b-8ac2-3a24ef886834" Oct 08 21:51:47 crc kubenswrapper[4739]: E1008 21:51:47.423288 4739 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 08 21:51:47 crc kubenswrapper[4739]: E1008 21:51:47.423697 4739 kuberuntime_manager.go:1274] 
"Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m68qp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-5vx9b_openshift-marketplace(c3158bce-349e-4143-93e7-42fd8f486e65): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 08 21:51:47 crc kubenswrapper[4739]: E1008 21:51:47.424876 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-5vx9b" podUID="c3158bce-349e-4143-93e7-42fd8f486e65" Oct 08 21:51:48 crc kubenswrapper[4739]: E1008 21:51:48.127392 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-5vx9b" podUID="c3158bce-349e-4143-93e7-42fd8f486e65" Oct 08 21:51:48 crc kubenswrapper[4739]: E1008 21:51:48.997501 4739 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 08 21:51:48 crc kubenswrapper[4739]: E1008 21:51:48.997883 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jr47m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-pqt7t_openshift-marketplace(d5d5effe-9f0b-4823-9365-e07862452e39): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 08 21:51:48 crc kubenswrapper[4739]: E1008 21:51:48.999068 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-pqt7t" podUID="d5d5effe-9f0b-4823-9365-e07862452e39" Oct 08 21:51:49 crc 
kubenswrapper[4739]: E1008 21:51:49.133256 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-pqt7t" podUID="d5d5effe-9f0b-4823-9365-e07862452e39" Oct 08 21:51:51 crc kubenswrapper[4739]: I1008 21:51:51.766567 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:51:51 crc kubenswrapper[4739]: I1008 21:51:51.766928 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:51:51 crc kubenswrapper[4739]: I1008 21:51:51.766974 4739 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" Oct 08 21:51:51 crc kubenswrapper[4739]: I1008 21:51:51.767661 4739 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d"} pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 21:51:51 crc kubenswrapper[4739]: I1008 21:51:51.767817 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" 
podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" containerID="cri-o://b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d" gracePeriod=600 Oct 08 21:51:53 crc kubenswrapper[4739]: E1008 21:51:53.243917 4739 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 08 21:51:53 crc kubenswrapper[4739]: E1008 21:51:53.244965 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6twt4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},Ter
minationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-867zz_openshift-marketplace(e75f5e41-357c-47c7-b000-1103fd2b9756): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 08 21:51:53 crc kubenswrapper[4739]: E1008 21:51:53.247057 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-867zz" podUID="e75f5e41-357c-47c7-b000-1103fd2b9756" Oct 08 21:51:54 crc kubenswrapper[4739]: I1008 21:51:54.157646 4739 generic.go:334] "Generic (PLEG): container finished" podID="9707b708-016c-4e06-86db-0332e2ca37db" containerID="b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d" exitCode=0 Oct 08 21:51:54 crc kubenswrapper[4739]: I1008 21:51:54.157727 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerDied","Data":"b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d"} Oct 08 21:51:54 crc kubenswrapper[4739]: E1008 21:51:54.853874 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-867zz" podUID="e75f5e41-357c-47c7-b000-1103fd2b9756" Oct 08 21:51:56 crc kubenswrapper[4739]: I1008 21:51:56.167764 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" 
event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerStarted","Data":"439f72fd59a864ed357fc655cbdfd636be4f725b2c076db55d4b40db6172c69e"} Oct 08 21:51:56 crc kubenswrapper[4739]: I1008 21:51:56.169862 4739 generic.go:334] "Generic (PLEG): container finished" podID="e59564a7-a703-4b18-a5cc-f231f8a8232b" containerID="ae2174c43171f0b7ef11c40c51ca972671db52c15df1feb49b2114104bf1a873" exitCode=0 Oct 08 21:51:56 crc kubenswrapper[4739]: I1008 21:51:56.169890 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wh9v" event={"ID":"e59564a7-a703-4b18-a5cc-f231f8a8232b","Type":"ContainerDied","Data":"ae2174c43171f0b7ef11c40c51ca972671db52c15df1feb49b2114104bf1a873"} Oct 08 21:51:57 crc kubenswrapper[4739]: I1008 21:51:57.176794 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wh9v" event={"ID":"e59564a7-a703-4b18-a5cc-f231f8a8232b","Type":"ContainerStarted","Data":"2b34652737a090fa94dd0e43b4bca27c02d4e671060d13ff4342d0ed62f7ae40"} Oct 08 21:51:57 crc kubenswrapper[4739]: I1008 21:51:57.179359 4739 generic.go:334] "Generic (PLEG): container finished" podID="04f484a1-1bcd-4168-bad7-3901c03a49e0" containerID="0e4c8176b1ce15c744b4314e73dbbb3fe4c45c7962b63b640235d1a5ebd8eb9e" exitCode=0 Oct 08 21:51:57 crc kubenswrapper[4739]: I1008 21:51:57.179425 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67xr2" event={"ID":"04f484a1-1bcd-4168-bad7-3901c03a49e0","Type":"ContainerDied","Data":"0e4c8176b1ce15c744b4314e73dbbb3fe4c45c7962b63b640235d1a5ebd8eb9e"} Oct 08 21:51:57 crc kubenswrapper[4739]: I1008 21:51:57.194870 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7wh9v" podStartSLOduration=2.122965547 podStartE2EDuration="58.19485189s" podCreationTimestamp="2025-10-08 21:50:59 +0000 UTC" firstStartedPulling="2025-10-08 21:51:00.667725273 +0000 UTC 
m=+160.493111023" lastFinishedPulling="2025-10-08 21:51:56.739611616 +0000 UTC m=+216.564997366" observedRunningTime="2025-10-08 21:51:57.193216001 +0000 UTC m=+217.018601781" watchObservedRunningTime="2025-10-08 21:51:57.19485189 +0000 UTC m=+217.020237640" Oct 08 21:51:58 crc kubenswrapper[4739]: I1008 21:51:58.184837 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67xr2" event={"ID":"04f484a1-1bcd-4168-bad7-3901c03a49e0","Type":"ContainerStarted","Data":"7837c75d4069a21f27c5a27f628702931331b2ea59648b88899dffd958dacde3"} Oct 08 21:51:59 crc kubenswrapper[4739]: I1008 21:51:59.210017 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-67xr2" podStartSLOduration=4.009362671 podStartE2EDuration="58.209996876s" podCreationTimestamp="2025-10-08 21:51:01 +0000 UTC" firstStartedPulling="2025-10-08 21:51:03.750966605 +0000 UTC m=+163.576352355" lastFinishedPulling="2025-10-08 21:51:57.95160082 +0000 UTC m=+217.776986560" observedRunningTime="2025-10-08 21:51:59.205463903 +0000 UTC m=+219.030849653" watchObservedRunningTime="2025-10-08 21:51:59.209996876 +0000 UTC m=+219.035382626" Oct 08 21:51:59 crc kubenswrapper[4739]: I1008 21:51:59.652157 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7wh9v" Oct 08 21:51:59 crc kubenswrapper[4739]: I1008 21:51:59.652245 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7wh9v" Oct 08 21:51:59 crc kubenswrapper[4739]: I1008 21:51:59.976593 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7wh9v" Oct 08 21:52:01 crc kubenswrapper[4739]: I1008 21:52:01.204492 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xcwbd" 
event={"ID":"13bcd021-6f45-4843-9255-92ee1ad4e031","Type":"ContainerStarted","Data":"7dd972ba52238fd4767461789b7850de61261dfa34f682f32103c396bd6c7c3c"} Oct 08 21:52:01 crc kubenswrapper[4739]: I1008 21:52:01.206273 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnwlb" event={"ID":"5ac894a1-e4ba-4a6b-8ac2-3a24ef886834","Type":"ContainerStarted","Data":"34bd002005991dd1b40ecf8736e89c5401d4c2fd1cca6a22d5639ac6cebc808c"} Oct 08 21:52:01 crc kubenswrapper[4739]: I1008 21:52:01.838014 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-67xr2" Oct 08 21:52:01 crc kubenswrapper[4739]: I1008 21:52:01.838385 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-67xr2" Oct 08 21:52:01 crc kubenswrapper[4739]: I1008 21:52:01.880501 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-67xr2" Oct 08 21:52:02 crc kubenswrapper[4739]: I1008 21:52:02.211516 4739 generic.go:334] "Generic (PLEG): container finished" podID="5ac894a1-e4ba-4a6b-8ac2-3a24ef886834" containerID="34bd002005991dd1b40ecf8736e89c5401d4c2fd1cca6a22d5639ac6cebc808c" exitCode=0 Oct 08 21:52:02 crc kubenswrapper[4739]: I1008 21:52:02.211622 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnwlb" event={"ID":"5ac894a1-e4ba-4a6b-8ac2-3a24ef886834","Type":"ContainerDied","Data":"34bd002005991dd1b40ecf8736e89c5401d4c2fd1cca6a22d5639ac6cebc808c"} Oct 08 21:52:02 crc kubenswrapper[4739]: I1008 21:52:02.214982 4739 generic.go:334] "Generic (PLEG): container finished" podID="deed15fa-28ce-4057-a1d7-5f920fa4751b" containerID="140d65d1baa23de5294cd0714893706b98b9e774559fe3a039860cf6b1085972" exitCode=0 Oct 08 21:52:02 crc kubenswrapper[4739]: I1008 21:52:02.215061 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-sjq82" event={"ID":"deed15fa-28ce-4057-a1d7-5f920fa4751b","Type":"ContainerDied","Data":"140d65d1baa23de5294cd0714893706b98b9e774559fe3a039860cf6b1085972"} Oct 08 21:52:02 crc kubenswrapper[4739]: I1008 21:52:02.218518 4739 generic.go:334] "Generic (PLEG): container finished" podID="13bcd021-6f45-4843-9255-92ee1ad4e031" containerID="7dd972ba52238fd4767461789b7850de61261dfa34f682f32103c396bd6c7c3c" exitCode=0 Oct 08 21:52:02 crc kubenswrapper[4739]: I1008 21:52:02.219550 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xcwbd" event={"ID":"13bcd021-6f45-4843-9255-92ee1ad4e031","Type":"ContainerDied","Data":"7dd972ba52238fd4767461789b7850de61261dfa34f682f32103c396bd6c7c3c"} Oct 08 21:52:03 crc kubenswrapper[4739]: I1008 21:52:03.224514 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnwlb" event={"ID":"5ac894a1-e4ba-4a6b-8ac2-3a24ef886834","Type":"ContainerStarted","Data":"d8987c5cbb35ecf8ea57f8e7dcb040306f4f33892e952927d56468f7cf50e8db"} Oct 08 21:52:03 crc kubenswrapper[4739]: I1008 21:52:03.227112 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sjq82" event={"ID":"deed15fa-28ce-4057-a1d7-5f920fa4751b","Type":"ContainerStarted","Data":"eb0223025c9c58156ce938ddabf3faae8bb338e002cbc2fb553e5cc3152e4165"} Oct 08 21:52:03 crc kubenswrapper[4739]: I1008 21:52:03.229228 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xcwbd" event={"ID":"13bcd021-6f45-4843-9255-92ee1ad4e031","Type":"ContainerStarted","Data":"c29b009f1ba4fd2041a35175cd50be9e52d82dd03fd723d0a2b776c7f8770d1e"} Oct 08 21:52:03 crc kubenswrapper[4739]: I1008 21:52:03.243611 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lnwlb" podStartSLOduration=2.208996559 
podStartE2EDuration="1m4.243595047s" podCreationTimestamp="2025-10-08 21:50:59 +0000 UTC" firstStartedPulling="2025-10-08 21:51:00.664231856 +0000 UTC m=+160.489617606" lastFinishedPulling="2025-10-08 21:52:02.698830344 +0000 UTC m=+222.524216094" observedRunningTime="2025-10-08 21:52:03.240495655 +0000 UTC m=+223.065881405" watchObservedRunningTime="2025-10-08 21:52:03.243595047 +0000 UTC m=+223.068980797" Oct 08 21:52:03 crc kubenswrapper[4739]: I1008 21:52:03.280824 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sjq82" podStartSLOduration=3.278526696 podStartE2EDuration="1m2.280805209s" podCreationTimestamp="2025-10-08 21:51:01 +0000 UTC" firstStartedPulling="2025-10-08 21:51:03.757053418 +0000 UTC m=+163.582439168" lastFinishedPulling="2025-10-08 21:52:02.759331931 +0000 UTC m=+222.584717681" observedRunningTime="2025-10-08 21:52:03.278081929 +0000 UTC m=+223.103467679" watchObservedRunningTime="2025-10-08 21:52:03.280805209 +0000 UTC m=+223.106190959" Oct 08 21:52:03 crc kubenswrapper[4739]: I1008 21:52:03.282758 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xcwbd" podStartSLOduration=2.178330014 podStartE2EDuration="1m1.282750107s" podCreationTimestamp="2025-10-08 21:51:02 +0000 UTC" firstStartedPulling="2025-10-08 21:51:03.759097038 +0000 UTC m=+163.584482788" lastFinishedPulling="2025-10-08 21:52:02.863517131 +0000 UTC m=+222.688902881" observedRunningTime="2025-10-08 21:52:03.265342355 +0000 UTC m=+223.090728105" watchObservedRunningTime="2025-10-08 21:52:03.282750107 +0000 UTC m=+223.108135857" Oct 08 21:52:05 crc kubenswrapper[4739]: I1008 21:52:05.240726 4739 generic.go:334] "Generic (PLEG): container finished" podID="c3158bce-349e-4143-93e7-42fd8f486e65" containerID="6700e96069dc7d4c772bcfd98783c3a913ce58b2e192acca4e920138a83dc0df" exitCode=0 Oct 08 21:52:05 crc kubenswrapper[4739]: I1008 21:52:05.240802 4739 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vx9b" event={"ID":"c3158bce-349e-4143-93e7-42fd8f486e65","Type":"ContainerDied","Data":"6700e96069dc7d4c772bcfd98783c3a913ce58b2e192acca4e920138a83dc0df"} Oct 08 21:52:06 crc kubenswrapper[4739]: I1008 21:52:06.254894 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqt7t" event={"ID":"d5d5effe-9f0b-4823-9365-e07862452e39","Type":"ContainerStarted","Data":"8520c7006ca61ec65f09811d18ed95676d04993057b8e41560df307f0686e83e"} Oct 08 21:52:06 crc kubenswrapper[4739]: I1008 21:52:06.257249 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vx9b" event={"ID":"c3158bce-349e-4143-93e7-42fd8f486e65","Type":"ContainerStarted","Data":"74ba6ed253dc06e294eb1f8b8713b3f61237a10dadabebbcaa64f7f635eff1bd"} Oct 08 21:52:06 crc kubenswrapper[4739]: I1008 21:52:06.288063 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5vx9b" podStartSLOduration=2.305214253 podStartE2EDuration="1m7.288045509s" podCreationTimestamp="2025-10-08 21:50:59 +0000 UTC" firstStartedPulling="2025-10-08 21:51:00.667112148 +0000 UTC m=+160.492497898" lastFinishedPulling="2025-10-08 21:52:05.649943404 +0000 UTC m=+225.475329154" observedRunningTime="2025-10-08 21:52:06.287006899 +0000 UTC m=+226.112392649" watchObservedRunningTime="2025-10-08 21:52:06.288045509 +0000 UTC m=+226.113431259" Oct 08 21:52:07 crc kubenswrapper[4739]: I1008 21:52:07.263160 4739 generic.go:334] "Generic (PLEG): container finished" podID="d5d5effe-9f0b-4823-9365-e07862452e39" containerID="8520c7006ca61ec65f09811d18ed95676d04993057b8e41560df307f0686e83e" exitCode=0 Oct 08 21:52:07 crc kubenswrapper[4739]: I1008 21:52:07.263205 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqt7t" 
event={"ID":"d5d5effe-9f0b-4823-9365-e07862452e39","Type":"ContainerDied","Data":"8520c7006ca61ec65f09811d18ed95676d04993057b8e41560df307f0686e83e"} Oct 08 21:52:09 crc kubenswrapper[4739]: I1008 21:52:09.274438 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqt7t" event={"ID":"d5d5effe-9f0b-4823-9365-e07862452e39","Type":"ContainerStarted","Data":"0eeb15c1b9674b7c1ddf972848d8f69b1848a017e027ac4abf55d4d48fb8542d"} Oct 08 21:52:09 crc kubenswrapper[4739]: I1008 21:52:09.294977 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pqt7t" podStartSLOduration=2.176115074 podStartE2EDuration="1m7.29496039s" podCreationTimestamp="2025-10-08 21:51:02 +0000 UTC" firstStartedPulling="2025-10-08 21:51:03.769239602 +0000 UTC m=+163.594625352" lastFinishedPulling="2025-10-08 21:52:08.888084928 +0000 UTC m=+228.713470668" observedRunningTime="2025-10-08 21:52:09.290907621 +0000 UTC m=+229.116293391" watchObservedRunningTime="2025-10-08 21:52:09.29496039 +0000 UTC m=+229.120346140" Oct 08 21:52:09 crc kubenswrapper[4739]: I1008 21:52:09.462849 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lnwlb" Oct 08 21:52:09 crc kubenswrapper[4739]: I1008 21:52:09.462913 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lnwlb" Oct 08 21:52:09 crc kubenswrapper[4739]: I1008 21:52:09.503886 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lnwlb" Oct 08 21:52:09 crc kubenswrapper[4739]: I1008 21:52:09.698271 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7wh9v" Oct 08 21:52:09 crc kubenswrapper[4739]: I1008 21:52:09.869283 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-5vx9b" Oct 08 21:52:09 crc kubenswrapper[4739]: I1008 21:52:09.869325 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5vx9b" Oct 08 21:52:09 crc kubenswrapper[4739]: I1008 21:52:09.911744 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5vx9b" Oct 08 21:52:10 crc kubenswrapper[4739]: I1008 21:52:10.316949 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5vx9b" Oct 08 21:52:10 crc kubenswrapper[4739]: I1008 21:52:10.318187 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lnwlb" Oct 08 21:52:11 crc kubenswrapper[4739]: I1008 21:52:11.286533 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-867zz" event={"ID":"e75f5e41-357c-47c7-b000-1103fd2b9756","Type":"ContainerStarted","Data":"81daf90ab308e11f2098a2170deff8de3b2e80f4020c22ad181557e36f1c0c29"} Oct 08 21:52:11 crc kubenswrapper[4739]: I1008 21:52:11.388058 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sjq82" Oct 08 21:52:11 crc kubenswrapper[4739]: I1008 21:52:11.388117 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sjq82" Oct 08 21:52:11 crc kubenswrapper[4739]: I1008 21:52:11.428585 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sjq82" Oct 08 21:52:11 crc kubenswrapper[4739]: I1008 21:52:11.874843 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-67xr2" Oct 08 21:52:12 crc kubenswrapper[4739]: I1008 21:52:12.048361 4739 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/community-operators-5vx9b"] Oct 08 21:52:12 crc kubenswrapper[4739]: I1008 21:52:12.250315 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7wh9v"] Oct 08 21:52:12 crc kubenswrapper[4739]: I1008 21:52:12.250818 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7wh9v" podUID="e59564a7-a703-4b18-a5cc-f231f8a8232b" containerName="registry-server" containerID="cri-o://2b34652737a090fa94dd0e43b4bca27c02d4e671060d13ff4342d0ed62f7ae40" gracePeriod=2 Oct 08 21:52:12 crc kubenswrapper[4739]: I1008 21:52:12.293837 4739 generic.go:334] "Generic (PLEG): container finished" podID="e75f5e41-357c-47c7-b000-1103fd2b9756" containerID="81daf90ab308e11f2098a2170deff8de3b2e80f4020c22ad181557e36f1c0c29" exitCode=0 Oct 08 21:52:12 crc kubenswrapper[4739]: I1008 21:52:12.293958 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-867zz" event={"ID":"e75f5e41-357c-47c7-b000-1103fd2b9756","Type":"ContainerDied","Data":"81daf90ab308e11f2098a2170deff8de3b2e80f4020c22ad181557e36f1c0c29"} Oct 08 21:52:12 crc kubenswrapper[4739]: I1008 21:52:12.294273 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5vx9b" podUID="c3158bce-349e-4143-93e7-42fd8f486e65" containerName="registry-server" containerID="cri-o://74ba6ed253dc06e294eb1f8b8713b3f61237a10dadabebbcaa64f7f635eff1bd" gracePeriod=2 Oct 08 21:52:12 crc kubenswrapper[4739]: I1008 21:52:12.331620 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sjq82" Oct 08 21:52:12 crc kubenswrapper[4739]: I1008 21:52:12.408203 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xcwbd" Oct 08 21:52:12 crc kubenswrapper[4739]: I1008 21:52:12.408656 
4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xcwbd" Oct 08 21:52:12 crc kubenswrapper[4739]: I1008 21:52:12.444666 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xcwbd" Oct 08 21:52:12 crc kubenswrapper[4739]: I1008 21:52:12.807161 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pqt7t" Oct 08 21:52:12 crc kubenswrapper[4739]: I1008 21:52:12.807213 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pqt7t" Oct 08 21:52:13 crc kubenswrapper[4739]: I1008 21:52:13.300408 4739 generic.go:334] "Generic (PLEG): container finished" podID="c3158bce-349e-4143-93e7-42fd8f486e65" containerID="74ba6ed253dc06e294eb1f8b8713b3f61237a10dadabebbcaa64f7f635eff1bd" exitCode=0 Oct 08 21:52:13 crc kubenswrapper[4739]: I1008 21:52:13.300485 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vx9b" event={"ID":"c3158bce-349e-4143-93e7-42fd8f486e65","Type":"ContainerDied","Data":"74ba6ed253dc06e294eb1f8b8713b3f61237a10dadabebbcaa64f7f635eff1bd"} Oct 08 21:52:13 crc kubenswrapper[4739]: I1008 21:52:13.302737 4739 generic.go:334] "Generic (PLEG): container finished" podID="e59564a7-a703-4b18-a5cc-f231f8a8232b" containerID="2b34652737a090fa94dd0e43b4bca27c02d4e671060d13ff4342d0ed62f7ae40" exitCode=0 Oct 08 21:52:13 crc kubenswrapper[4739]: I1008 21:52:13.302823 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wh9v" event={"ID":"e59564a7-a703-4b18-a5cc-f231f8a8232b","Type":"ContainerDied","Data":"2b34652737a090fa94dd0e43b4bca27c02d4e671060d13ff4342d0ed62f7ae40"} Oct 08 21:52:13 crc kubenswrapper[4739]: I1008 21:52:13.339570 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-xcwbd" Oct 08 21:52:13 crc kubenswrapper[4739]: I1008 21:52:13.846023 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pqt7t" podUID="d5d5effe-9f0b-4823-9365-e07862452e39" containerName="registry-server" probeResult="failure" output=< Oct 08 21:52:13 crc kubenswrapper[4739]: timeout: failed to connect service ":50051" within 1s Oct 08 21:52:13 crc kubenswrapper[4739]: > Oct 08 21:52:14 crc kubenswrapper[4739]: I1008 21:52:14.001596 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5vx9b" Oct 08 21:52:14 crc kubenswrapper[4739]: I1008 21:52:14.006853 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7wh9v" Oct 08 21:52:14 crc kubenswrapper[4739]: I1008 21:52:14.097747 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3158bce-349e-4143-93e7-42fd8f486e65-catalog-content\") pod \"c3158bce-349e-4143-93e7-42fd8f486e65\" (UID: \"c3158bce-349e-4143-93e7-42fd8f486e65\") " Oct 08 21:52:14 crc kubenswrapper[4739]: I1008 21:52:14.098321 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m68qp\" (UniqueName: \"kubernetes.io/projected/c3158bce-349e-4143-93e7-42fd8f486e65-kube-api-access-m68qp\") pod \"c3158bce-349e-4143-93e7-42fd8f486e65\" (UID: \"c3158bce-349e-4143-93e7-42fd8f486e65\") " Oct 08 21:52:14 crc kubenswrapper[4739]: I1008 21:52:14.098472 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98g5v\" (UniqueName: \"kubernetes.io/projected/e59564a7-a703-4b18-a5cc-f231f8a8232b-kube-api-access-98g5v\") pod \"e59564a7-a703-4b18-a5cc-f231f8a8232b\" (UID: \"e59564a7-a703-4b18-a5cc-f231f8a8232b\") " Oct 08 21:52:14 crc kubenswrapper[4739]: 
I1008 21:52:14.098581 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e59564a7-a703-4b18-a5cc-f231f8a8232b-utilities\") pod \"e59564a7-a703-4b18-a5cc-f231f8a8232b\" (UID: \"e59564a7-a703-4b18-a5cc-f231f8a8232b\") " Oct 08 21:52:14 crc kubenswrapper[4739]: I1008 21:52:14.098657 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3158bce-349e-4143-93e7-42fd8f486e65-utilities\") pod \"c3158bce-349e-4143-93e7-42fd8f486e65\" (UID: \"c3158bce-349e-4143-93e7-42fd8f486e65\") " Oct 08 21:52:14 crc kubenswrapper[4739]: I1008 21:52:14.098783 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e59564a7-a703-4b18-a5cc-f231f8a8232b-catalog-content\") pod \"e59564a7-a703-4b18-a5cc-f231f8a8232b\" (UID: \"e59564a7-a703-4b18-a5cc-f231f8a8232b\") " Oct 08 21:52:14 crc kubenswrapper[4739]: I1008 21:52:14.100026 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3158bce-349e-4143-93e7-42fd8f486e65-utilities" (OuterVolumeSpecName: "utilities") pod "c3158bce-349e-4143-93e7-42fd8f486e65" (UID: "c3158bce-349e-4143-93e7-42fd8f486e65"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:52:14 crc kubenswrapper[4739]: I1008 21:52:14.100173 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e59564a7-a703-4b18-a5cc-f231f8a8232b-utilities" (OuterVolumeSpecName: "utilities") pod "e59564a7-a703-4b18-a5cc-f231f8a8232b" (UID: "e59564a7-a703-4b18-a5cc-f231f8a8232b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:52:14 crc kubenswrapper[4739]: I1008 21:52:14.104835 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e59564a7-a703-4b18-a5cc-f231f8a8232b-kube-api-access-98g5v" (OuterVolumeSpecName: "kube-api-access-98g5v") pod "e59564a7-a703-4b18-a5cc-f231f8a8232b" (UID: "e59564a7-a703-4b18-a5cc-f231f8a8232b"). InnerVolumeSpecName "kube-api-access-98g5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:52:14 crc kubenswrapper[4739]: I1008 21:52:14.105414 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3158bce-349e-4143-93e7-42fd8f486e65-kube-api-access-m68qp" (OuterVolumeSpecName: "kube-api-access-m68qp") pod "c3158bce-349e-4143-93e7-42fd8f486e65" (UID: "c3158bce-349e-4143-93e7-42fd8f486e65"). InnerVolumeSpecName "kube-api-access-m68qp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:52:14 crc kubenswrapper[4739]: I1008 21:52:14.146355 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e59564a7-a703-4b18-a5cc-f231f8a8232b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e59564a7-a703-4b18-a5cc-f231f8a8232b" (UID: "e59564a7-a703-4b18-a5cc-f231f8a8232b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:52:14 crc kubenswrapper[4739]: I1008 21:52:14.156569 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3158bce-349e-4143-93e7-42fd8f486e65-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3158bce-349e-4143-93e7-42fd8f486e65" (UID: "c3158bce-349e-4143-93e7-42fd8f486e65"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:52:14 crc kubenswrapper[4739]: I1008 21:52:14.199692 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m68qp\" (UniqueName: \"kubernetes.io/projected/c3158bce-349e-4143-93e7-42fd8f486e65-kube-api-access-m68qp\") on node \"crc\" DevicePath \"\"" Oct 08 21:52:14 crc kubenswrapper[4739]: I1008 21:52:14.199728 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98g5v\" (UniqueName: \"kubernetes.io/projected/e59564a7-a703-4b18-a5cc-f231f8a8232b-kube-api-access-98g5v\") on node \"crc\" DevicePath \"\"" Oct 08 21:52:14 crc kubenswrapper[4739]: I1008 21:52:14.199738 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e59564a7-a703-4b18-a5cc-f231f8a8232b-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 21:52:14 crc kubenswrapper[4739]: I1008 21:52:14.199747 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3158bce-349e-4143-93e7-42fd8f486e65-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 21:52:14 crc kubenswrapper[4739]: I1008 21:52:14.199755 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e59564a7-a703-4b18-a5cc-f231f8a8232b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 21:52:14 crc kubenswrapper[4739]: I1008 21:52:14.199763 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3158bce-349e-4143-93e7-42fd8f486e65-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 21:52:14 crc kubenswrapper[4739]: I1008 21:52:14.308696 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wh9v" event={"ID":"e59564a7-a703-4b18-a5cc-f231f8a8232b","Type":"ContainerDied","Data":"45b9ca19918cf5164fe82ffd628a0516b2fd79cd47daf70197bc247cfc181fbd"} 
Oct 08 21:52:14 crc kubenswrapper[4739]: I1008 21:52:14.308785 4739 scope.go:117] "RemoveContainer" containerID="2b34652737a090fa94dd0e43b4bca27c02d4e671060d13ff4342d0ed62f7ae40" Oct 08 21:52:14 crc kubenswrapper[4739]: I1008 21:52:14.308895 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7wh9v" Oct 08 21:52:14 crc kubenswrapper[4739]: I1008 21:52:14.312796 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vx9b" event={"ID":"c3158bce-349e-4143-93e7-42fd8f486e65","Type":"ContainerDied","Data":"0fbf18683eed9e130a1f14e64919d637470401b5622195a15400f31fe64e7a33"} Oct 08 21:52:14 crc kubenswrapper[4739]: I1008 21:52:14.312828 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5vx9b" Oct 08 21:52:14 crc kubenswrapper[4739]: I1008 21:52:14.342401 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7wh9v"] Oct 08 21:52:14 crc kubenswrapper[4739]: I1008 21:52:14.345229 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7wh9v"] Oct 08 21:52:14 crc kubenswrapper[4739]: I1008 21:52:14.351571 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5vx9b"] Oct 08 21:52:14 crc kubenswrapper[4739]: I1008 21:52:14.353899 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5vx9b"] Oct 08 21:52:14 crc kubenswrapper[4739]: I1008 21:52:14.648585 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-67xr2"] Oct 08 21:52:14 crc kubenswrapper[4739]: I1008 21:52:14.648847 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-67xr2" podUID="04f484a1-1bcd-4168-bad7-3901c03a49e0" 
containerName="registry-server" containerID="cri-o://7837c75d4069a21f27c5a27f628702931331b2ea59648b88899dffd958dacde3" gracePeriod=2 Oct 08 21:52:14 crc kubenswrapper[4739]: I1008 21:52:14.986814 4739 scope.go:117] "RemoveContainer" containerID="ae2174c43171f0b7ef11c40c51ca972671db52c15df1feb49b2114104bf1a873" Oct 08 21:52:15 crc kubenswrapper[4739]: I1008 21:52:15.114378 4739 scope.go:117] "RemoveContainer" containerID="4563c57ffc125df447a465af3152b0cbb9e55a30787afaa200a55cd84d711d1b" Oct 08 21:52:15 crc kubenswrapper[4739]: I1008 21:52:15.320431 4739 generic.go:334] "Generic (PLEG): container finished" podID="04f484a1-1bcd-4168-bad7-3901c03a49e0" containerID="7837c75d4069a21f27c5a27f628702931331b2ea59648b88899dffd958dacde3" exitCode=0 Oct 08 21:52:15 crc kubenswrapper[4739]: I1008 21:52:15.320553 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67xr2" event={"ID":"04f484a1-1bcd-4168-bad7-3901c03a49e0","Type":"ContainerDied","Data":"7837c75d4069a21f27c5a27f628702931331b2ea59648b88899dffd958dacde3"} Oct 08 21:52:15 crc kubenswrapper[4739]: I1008 21:52:15.803062 4739 scope.go:117] "RemoveContainer" containerID="74ba6ed253dc06e294eb1f8b8713b3f61237a10dadabebbcaa64f7f635eff1bd" Oct 08 21:52:15 crc kubenswrapper[4739]: I1008 21:52:15.827951 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3158bce-349e-4143-93e7-42fd8f486e65" path="/var/lib/kubelet/pods/c3158bce-349e-4143-93e7-42fd8f486e65/volumes" Oct 08 21:52:15 crc kubenswrapper[4739]: I1008 21:52:15.828764 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e59564a7-a703-4b18-a5cc-f231f8a8232b" path="/var/lib/kubelet/pods/e59564a7-a703-4b18-a5cc-f231f8a8232b/volumes" Oct 08 21:52:15 crc kubenswrapper[4739]: I1008 21:52:15.851409 4739 scope.go:117] "RemoveContainer" containerID="6700e96069dc7d4c772bcfd98783c3a913ce58b2e192acca4e920138a83dc0df" Oct 08 21:52:15 crc kubenswrapper[4739]: I1008 21:52:15.853581 4739 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67xr2" Oct 08 21:52:15 crc kubenswrapper[4739]: I1008 21:52:15.898932 4739 scope.go:117] "RemoveContainer" containerID="3f3e80b9e2930d025839974d836568c1a2303b56ad0826e0ab438b445b125346" Oct 08 21:52:15 crc kubenswrapper[4739]: I1008 21:52:15.917650 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04f484a1-1bcd-4168-bad7-3901c03a49e0-utilities\") pod \"04f484a1-1bcd-4168-bad7-3901c03a49e0\" (UID: \"04f484a1-1bcd-4168-bad7-3901c03a49e0\") " Oct 08 21:52:15 crc kubenswrapper[4739]: I1008 21:52:15.918010 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04f484a1-1bcd-4168-bad7-3901c03a49e0-catalog-content\") pod \"04f484a1-1bcd-4168-bad7-3901c03a49e0\" (UID: \"04f484a1-1bcd-4168-bad7-3901c03a49e0\") " Oct 08 21:52:15 crc kubenswrapper[4739]: I1008 21:52:15.918046 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvzgz\" (UniqueName: \"kubernetes.io/projected/04f484a1-1bcd-4168-bad7-3901c03a49e0-kube-api-access-kvzgz\") pod \"04f484a1-1bcd-4168-bad7-3901c03a49e0\" (UID: \"04f484a1-1bcd-4168-bad7-3901c03a49e0\") " Oct 08 21:52:15 crc kubenswrapper[4739]: I1008 21:52:15.918668 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04f484a1-1bcd-4168-bad7-3901c03a49e0-utilities" (OuterVolumeSpecName: "utilities") pod "04f484a1-1bcd-4168-bad7-3901c03a49e0" (UID: "04f484a1-1bcd-4168-bad7-3901c03a49e0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:52:15 crc kubenswrapper[4739]: I1008 21:52:15.925277 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04f484a1-1bcd-4168-bad7-3901c03a49e0-kube-api-access-kvzgz" (OuterVolumeSpecName: "kube-api-access-kvzgz") pod "04f484a1-1bcd-4168-bad7-3901c03a49e0" (UID: "04f484a1-1bcd-4168-bad7-3901c03a49e0"). InnerVolumeSpecName "kube-api-access-kvzgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:52:15 crc kubenswrapper[4739]: I1008 21:52:15.938525 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04f484a1-1bcd-4168-bad7-3901c03a49e0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04f484a1-1bcd-4168-bad7-3901c03a49e0" (UID: "04f484a1-1bcd-4168-bad7-3901c03a49e0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:52:16 crc kubenswrapper[4739]: I1008 21:52:16.019921 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04f484a1-1bcd-4168-bad7-3901c03a49e0-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 21:52:16 crc kubenswrapper[4739]: I1008 21:52:16.019960 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04f484a1-1bcd-4168-bad7-3901c03a49e0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 21:52:16 crc kubenswrapper[4739]: I1008 21:52:16.019971 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvzgz\" (UniqueName: \"kubernetes.io/projected/04f484a1-1bcd-4168-bad7-3901c03a49e0-kube-api-access-kvzgz\") on node \"crc\" DevicePath \"\"" Oct 08 21:52:16 crc kubenswrapper[4739]: I1008 21:52:16.329886 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67xr2" 
event={"ID":"04f484a1-1bcd-4168-bad7-3901c03a49e0","Type":"ContainerDied","Data":"bccb0ff354a8b3234fa417e9a1e267d410a4c1b3a58400880278c7f7ef4564e2"} Oct 08 21:52:16 crc kubenswrapper[4739]: I1008 21:52:16.329914 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67xr2" Oct 08 21:52:16 crc kubenswrapper[4739]: I1008 21:52:16.329943 4739 scope.go:117] "RemoveContainer" containerID="7837c75d4069a21f27c5a27f628702931331b2ea59648b88899dffd958dacde3" Oct 08 21:52:16 crc kubenswrapper[4739]: I1008 21:52:16.344264 4739 scope.go:117] "RemoveContainer" containerID="0e4c8176b1ce15c744b4314e73dbbb3fe4c45c7962b63b640235d1a5ebd8eb9e" Oct 08 21:52:16 crc kubenswrapper[4739]: I1008 21:52:16.355211 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-67xr2"] Oct 08 21:52:16 crc kubenswrapper[4739]: I1008 21:52:16.358346 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-67xr2"] Oct 08 21:52:16 crc kubenswrapper[4739]: I1008 21:52:16.374745 4739 scope.go:117] "RemoveContainer" containerID="73c1a39610c4e1e203b2cb69f9282fdc4af1c67867d639630705b636d0786232" Oct 08 21:52:17 crc kubenswrapper[4739]: I1008 21:52:17.337995 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-867zz" event={"ID":"e75f5e41-357c-47c7-b000-1103fd2b9756","Type":"ContainerStarted","Data":"58e0915cc217848b9a6d3f5c0f841a7edf4c0f822ad4c3f14867b0d5251c2fea"} Oct 08 21:52:17 crc kubenswrapper[4739]: I1008 21:52:17.354825 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-867zz" podStartSLOduration=4.219901275 podStartE2EDuration="1m19.354809233s" podCreationTimestamp="2025-10-08 21:50:58 +0000 UTC" firstStartedPulling="2025-10-08 21:51:00.671058587 +0000 UTC m=+160.496444337" lastFinishedPulling="2025-10-08 21:52:15.805966545 
+0000 UTC m=+235.631352295" observedRunningTime="2025-10-08 21:52:17.353393171 +0000 UTC m=+237.178778921" watchObservedRunningTime="2025-10-08 21:52:17.354809233 +0000 UTC m=+237.180194983" Oct 08 21:52:17 crc kubenswrapper[4739]: I1008 21:52:17.828497 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04f484a1-1bcd-4168-bad7-3901c03a49e0" path="/var/lib/kubelet/pods/04f484a1-1bcd-4168-bad7-3901c03a49e0/volumes" Oct 08 21:52:19 crc kubenswrapper[4739]: I1008 21:52:19.242704 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-867zz" Oct 08 21:52:19 crc kubenswrapper[4739]: I1008 21:52:19.242769 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-867zz" Oct 08 21:52:19 crc kubenswrapper[4739]: I1008 21:52:19.291044 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-867zz" Oct 08 21:52:22 crc kubenswrapper[4739]: I1008 21:52:22.846055 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pqt7t" Oct 08 21:52:22 crc kubenswrapper[4739]: I1008 21:52:22.885349 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pqt7t" Oct 08 21:52:27 crc kubenswrapper[4739]: I1008 21:52:27.050631 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pqt7t"] Oct 08 21:52:27 crc kubenswrapper[4739]: I1008 21:52:27.051433 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pqt7t" podUID="d5d5effe-9f0b-4823-9365-e07862452e39" containerName="registry-server" containerID="cri-o://0eeb15c1b9674b7c1ddf972848d8f69b1848a017e027ac4abf55d4d48fb8542d" gracePeriod=2 Oct 08 21:52:27 crc kubenswrapper[4739]: I1008 21:52:27.393969 4739 
generic.go:334] "Generic (PLEG): container finished" podID="d5d5effe-9f0b-4823-9365-e07862452e39" containerID="0eeb15c1b9674b7c1ddf972848d8f69b1848a017e027ac4abf55d4d48fb8542d" exitCode=0 Oct 08 21:52:27 crc kubenswrapper[4739]: I1008 21:52:27.394008 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqt7t" event={"ID":"d5d5effe-9f0b-4823-9365-e07862452e39","Type":"ContainerDied","Data":"0eeb15c1b9674b7c1ddf972848d8f69b1848a017e027ac4abf55d4d48fb8542d"} Oct 08 21:52:27 crc kubenswrapper[4739]: I1008 21:52:27.394033 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqt7t" event={"ID":"d5d5effe-9f0b-4823-9365-e07862452e39","Type":"ContainerDied","Data":"60e04a440919c51952f726aea48cbbc06ed52356c6ce4efc83e441c97bf6e6cc"} Oct 08 21:52:27 crc kubenswrapper[4739]: I1008 21:52:27.394044 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60e04a440919c51952f726aea48cbbc06ed52356c6ce4efc83e441c97bf6e6cc" Oct 08 21:52:27 crc kubenswrapper[4739]: I1008 21:52:27.419471 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pqt7t" Oct 08 21:52:27 crc kubenswrapper[4739]: I1008 21:52:27.603093 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jr47m\" (UniqueName: \"kubernetes.io/projected/d5d5effe-9f0b-4823-9365-e07862452e39-kube-api-access-jr47m\") pod \"d5d5effe-9f0b-4823-9365-e07862452e39\" (UID: \"d5d5effe-9f0b-4823-9365-e07862452e39\") " Oct 08 21:52:27 crc kubenswrapper[4739]: I1008 21:52:27.603173 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5d5effe-9f0b-4823-9365-e07862452e39-utilities\") pod \"d5d5effe-9f0b-4823-9365-e07862452e39\" (UID: \"d5d5effe-9f0b-4823-9365-e07862452e39\") " Oct 08 21:52:27 crc kubenswrapper[4739]: I1008 21:52:27.603206 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5d5effe-9f0b-4823-9365-e07862452e39-catalog-content\") pod \"d5d5effe-9f0b-4823-9365-e07862452e39\" (UID: \"d5d5effe-9f0b-4823-9365-e07862452e39\") " Oct 08 21:52:27 crc kubenswrapper[4739]: I1008 21:52:27.604677 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5d5effe-9f0b-4823-9365-e07862452e39-utilities" (OuterVolumeSpecName: "utilities") pod "d5d5effe-9f0b-4823-9365-e07862452e39" (UID: "d5d5effe-9f0b-4823-9365-e07862452e39"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:52:27 crc kubenswrapper[4739]: I1008 21:52:27.611269 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5d5effe-9f0b-4823-9365-e07862452e39-kube-api-access-jr47m" (OuterVolumeSpecName: "kube-api-access-jr47m") pod "d5d5effe-9f0b-4823-9365-e07862452e39" (UID: "d5d5effe-9f0b-4823-9365-e07862452e39"). InnerVolumeSpecName "kube-api-access-jr47m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:52:27 crc kubenswrapper[4739]: I1008 21:52:27.705339 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jr47m\" (UniqueName: \"kubernetes.io/projected/d5d5effe-9f0b-4823-9365-e07862452e39-kube-api-access-jr47m\") on node \"crc\" DevicePath \"\"" Oct 08 21:52:27 crc kubenswrapper[4739]: I1008 21:52:27.705420 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5d5effe-9f0b-4823-9365-e07862452e39-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 21:52:27 crc kubenswrapper[4739]: I1008 21:52:27.707622 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5d5effe-9f0b-4823-9365-e07862452e39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5d5effe-9f0b-4823-9365-e07862452e39" (UID: "d5d5effe-9f0b-4823-9365-e07862452e39"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:52:27 crc kubenswrapper[4739]: I1008 21:52:27.806993 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5d5effe-9f0b-4823-9365-e07862452e39-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 21:52:28 crc kubenswrapper[4739]: I1008 21:52:28.398188 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pqt7t" Oct 08 21:52:28 crc kubenswrapper[4739]: I1008 21:52:28.415064 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pqt7t"] Oct 08 21:52:28 crc kubenswrapper[4739]: I1008 21:52:28.418285 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pqt7t"] Oct 08 21:52:29 crc kubenswrapper[4739]: I1008 21:52:29.301317 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-867zz" Oct 08 21:52:29 crc kubenswrapper[4739]: I1008 21:52:29.827317 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5d5effe-9f0b-4823-9365-e07862452e39" path="/var/lib/kubelet/pods/d5d5effe-9f0b-4823-9365-e07862452e39/volumes" Oct 08 21:52:42 crc kubenswrapper[4739]: I1008 21:52:42.378495 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7fzvv"] Oct 08 21:53:07 crc kubenswrapper[4739]: I1008 21:53:07.415224 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" podUID="7fc5c94d-b325-4b06-bbce-ed6f0792fc7b" containerName="oauth-openshift" containerID="cri-o://6c6bb0b07e755ed4f54382b9ae9c8a591b61a033a4206ab3ffd2585c23338705" gracePeriod=15 Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.342729 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.386630 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z"] Oct 08 21:53:08 crc kubenswrapper[4739]: E1008 21:53:08.386817 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e59564a7-a703-4b18-a5cc-f231f8a8232b" containerName="registry-server" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.386829 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="e59564a7-a703-4b18-a5cc-f231f8a8232b" containerName="registry-server" Oct 08 21:53:08 crc kubenswrapper[4739]: E1008 21:53:08.386839 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3158bce-349e-4143-93e7-42fd8f486e65" containerName="extract-utilities" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.386845 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3158bce-349e-4143-93e7-42fd8f486e65" containerName="extract-utilities" Oct 08 21:53:08 crc kubenswrapper[4739]: E1008 21:53:08.386857 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f6bb85b-452f-4504-8859-efd8cd222178" containerName="pruner" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.386864 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f6bb85b-452f-4504-8859-efd8cd222178" containerName="pruner" Oct 08 21:53:08 crc kubenswrapper[4739]: E1008 21:53:08.386870 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5d5effe-9f0b-4823-9365-e07862452e39" containerName="registry-server" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.386875 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5d5effe-9f0b-4823-9365-e07862452e39" containerName="registry-server" Oct 08 21:53:08 crc kubenswrapper[4739]: E1008 21:53:08.386883 4739 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="04f484a1-1bcd-4168-bad7-3901c03a49e0" containerName="registry-server" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.386890 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="04f484a1-1bcd-4168-bad7-3901c03a49e0" containerName="registry-server" Oct 08 21:53:08 crc kubenswrapper[4739]: E1008 21:53:08.386900 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3158bce-349e-4143-93e7-42fd8f486e65" containerName="extract-content" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.386906 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3158bce-349e-4143-93e7-42fd8f486e65" containerName="extract-content" Oct 08 21:53:08 crc kubenswrapper[4739]: E1008 21:53:08.386915 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04f484a1-1bcd-4168-bad7-3901c03a49e0" containerName="extract-utilities" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.386920 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="04f484a1-1bcd-4168-bad7-3901c03a49e0" containerName="extract-utilities" Oct 08 21:53:08 crc kubenswrapper[4739]: E1008 21:53:08.386928 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbfbbfca-3095-4a7b-869e-70b1a86046c4" containerName="collect-profiles" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.386933 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbfbbfca-3095-4a7b-869e-70b1a86046c4" containerName="collect-profiles" Oct 08 21:53:08 crc kubenswrapper[4739]: E1008 21:53:08.386941 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3158bce-349e-4143-93e7-42fd8f486e65" containerName="registry-server" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.386948 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3158bce-349e-4143-93e7-42fd8f486e65" containerName="registry-server" Oct 08 21:53:08 crc kubenswrapper[4739]: E1008 21:53:08.386959 4739 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e59564a7-a703-4b18-a5cc-f231f8a8232b" containerName="extract-utilities" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.386965 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="e59564a7-a703-4b18-a5cc-f231f8a8232b" containerName="extract-utilities" Oct 08 21:53:08 crc kubenswrapper[4739]: E1008 21:53:08.386972 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a935283-96a7-4954-85be-1b1aecb2d14c" containerName="pruner" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.386977 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a935283-96a7-4954-85be-1b1aecb2d14c" containerName="pruner" Oct 08 21:53:08 crc kubenswrapper[4739]: E1008 21:53:08.386984 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04f484a1-1bcd-4168-bad7-3901c03a49e0" containerName="extract-content" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.386990 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="04f484a1-1bcd-4168-bad7-3901c03a49e0" containerName="extract-content" Oct 08 21:53:08 crc kubenswrapper[4739]: E1008 21:53:08.386997 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fc5c94d-b325-4b06-bbce-ed6f0792fc7b" containerName="oauth-openshift" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.387002 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fc5c94d-b325-4b06-bbce-ed6f0792fc7b" containerName="oauth-openshift" Oct 08 21:53:08 crc kubenswrapper[4739]: E1008 21:53:08.387009 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5d5effe-9f0b-4823-9365-e07862452e39" containerName="extract-content" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.387014 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5d5effe-9f0b-4823-9365-e07862452e39" containerName="extract-content" Oct 08 21:53:08 crc kubenswrapper[4739]: E1008 21:53:08.387022 4739 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d5d5effe-9f0b-4823-9365-e07862452e39" containerName="extract-utilities" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.387028 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5d5effe-9f0b-4823-9365-e07862452e39" containerName="extract-utilities" Oct 08 21:53:08 crc kubenswrapper[4739]: E1008 21:53:08.387037 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e59564a7-a703-4b18-a5cc-f231f8a8232b" containerName="extract-content" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.387042 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="e59564a7-a703-4b18-a5cc-f231f8a8232b" containerName="extract-content" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.387122 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbfbbfca-3095-4a7b-869e-70b1a86046c4" containerName="collect-profiles" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.387130 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="e59564a7-a703-4b18-a5cc-f231f8a8232b" containerName="registry-server" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.387157 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f6bb85b-452f-4504-8859-efd8cd222178" containerName="pruner" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.387165 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="04f484a1-1bcd-4168-bad7-3901c03a49e0" containerName="registry-server" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.387172 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5d5effe-9f0b-4823-9365-e07862452e39" containerName="registry-server" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.387181 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fc5c94d-b325-4b06-bbce-ed6f0792fc7b" containerName="oauth-openshift" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.387190 4739 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="c3158bce-349e-4143-93e7-42fd8f486e65" containerName="registry-server" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.387200 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a935283-96a7-4954-85be-1b1aecb2d14c" containerName="pruner" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.387520 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.391409 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z"] Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.443279 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-user-template-error\") pod \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.443329 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-serving-cert\") pod \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.443346 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-service-ca\") pod \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.443382 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-trusted-ca-bundle\") pod \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.443404 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltj62\" (UniqueName: \"kubernetes.io/projected/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-kube-api-access-ltj62\") pod \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.443424 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-user-template-login\") pod \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.443443 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-ocp-branding-template\") pod \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.443460 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-session\") pod \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.443480 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-cliconfig\") pod \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.443500 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-user-idp-0-file-data\") pod \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.443517 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-router-certs\") pod \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.443547 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-audit-dir\") pod \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.443585 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-audit-policies\") pod \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.443608 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-user-template-provider-selection\") pod 
\"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\" (UID: \"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b\") " Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.443717 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.444511 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "7fc5c94d-b325-4b06-bbce-ed6f0792fc7b" (UID: "7fc5c94d-b325-4b06-bbce-ed6f0792fc7b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.444557 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "7fc5c94d-b325-4b06-bbce-ed6f0792fc7b" (UID: "7fc5c94d-b325-4b06-bbce-ed6f0792fc7b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.444538 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.444604 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "7fc5c94d-b325-4b06-bbce-ed6f0792fc7b" (UID: "7fc5c94d-b325-4b06-bbce-ed6f0792fc7b"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.444602 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "7fc5c94d-b325-4b06-bbce-ed6f0792fc7b" (UID: "7fc5c94d-b325-4b06-bbce-ed6f0792fc7b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.444644 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/833bc792-970c-4def-90ca-3182752ae64e-audit-dir\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.444704 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-system-service-ca\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.444755 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-user-template-login\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.444823 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.444971 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-system-router-certs\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.445021 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-system-session\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.445053 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.445114 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "7fc5c94d-b325-4b06-bbce-ed6f0792fc7b" (UID: "7fc5c94d-b325-4b06-bbce-ed6f0792fc7b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.445231 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-user-template-error\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.445297 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.445358 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/833bc792-970c-4def-90ca-3182752ae64e-audit-policies\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.445420 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29vwc\" (UniqueName: \"kubernetes.io/projected/833bc792-970c-4def-90ca-3182752ae64e-kube-api-access-29vwc\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.445450 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.445574 4739 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.445606 4739 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.445622 4739 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.445636 4739 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.445648 4739 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.449094 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-user-template-error" 
(OuterVolumeSpecName: "v4-0-config-user-template-error") pod "7fc5c94d-b325-4b06-bbce-ed6f0792fc7b" (UID: "7fc5c94d-b325-4b06-bbce-ed6f0792fc7b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.449413 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "7fc5c94d-b325-4b06-bbce-ed6f0792fc7b" (UID: "7fc5c94d-b325-4b06-bbce-ed6f0792fc7b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.449942 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "7fc5c94d-b325-4b06-bbce-ed6f0792fc7b" (UID: "7fc5c94d-b325-4b06-bbce-ed6f0792fc7b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.450417 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-kube-api-access-ltj62" (OuterVolumeSpecName: "kube-api-access-ltj62") pod "7fc5c94d-b325-4b06-bbce-ed6f0792fc7b" (UID: "7fc5c94d-b325-4b06-bbce-ed6f0792fc7b"). InnerVolumeSpecName "kube-api-access-ltj62". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.450427 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "7fc5c94d-b325-4b06-bbce-ed6f0792fc7b" (UID: "7fc5c94d-b325-4b06-bbce-ed6f0792fc7b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.452577 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "7fc5c94d-b325-4b06-bbce-ed6f0792fc7b" (UID: "7fc5c94d-b325-4b06-bbce-ed6f0792fc7b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.452832 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "7fc5c94d-b325-4b06-bbce-ed6f0792fc7b" (UID: "7fc5c94d-b325-4b06-bbce-ed6f0792fc7b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.453303 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "7fc5c94d-b325-4b06-bbce-ed6f0792fc7b" (UID: "7fc5c94d-b325-4b06-bbce-ed6f0792fc7b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.460441 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "7fc5c94d-b325-4b06-bbce-ed6f0792fc7b" (UID: "7fc5c94d-b325-4b06-bbce-ed6f0792fc7b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.547505 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.547666 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/833bc792-970c-4def-90ca-3182752ae64e-audit-policies\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.547758 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29vwc\" (UniqueName: \"kubernetes.io/projected/833bc792-970c-4def-90ca-3182752ae64e-kube-api-access-29vwc\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.547801 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.547864 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.547901 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.547936 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/833bc792-970c-4def-90ca-3182752ae64e-audit-dir\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.547975 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-system-service-ca\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: 
\"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.548017 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-user-template-login\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.548024 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/833bc792-970c-4def-90ca-3182752ae64e-audit-dir\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.548053 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.548097 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-system-router-certs\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.548130 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-system-session\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.548194 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.548341 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-user-template-error\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.548416 4739 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.548440 4739 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.548445 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.548460 4739 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.548489 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/833bc792-970c-4def-90ca-3182752ae64e-audit-policies\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.548839 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-system-service-ca\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.548498 4739 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.548882 4739 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-user-template-error\") on node \"crc\" 
DevicePath \"\"" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.548897 4739 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.548911 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltj62\" (UniqueName: \"kubernetes.io/projected/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-kube-api-access-ltj62\") on node \"crc\" DevicePath \"\"" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.548923 4739 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.548937 4739 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.549451 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.552693 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-user-template-provider-selection\") pod 
\"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.552742 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.552738 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-system-router-certs\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.552911 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.553115 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-system-session\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.554443 4739 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.559444 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-user-template-login\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.559695 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/833bc792-970c-4def-90ca-3182752ae64e-v4-0-config-user-template-error\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.569100 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29vwc\" (UniqueName: \"kubernetes.io/projected/833bc792-970c-4def-90ca-3182752ae64e-kube-api-access-29vwc\") pod \"oauth-openshift-764f9b7cd5-gmc2z\" (UID: \"833bc792-970c-4def-90ca-3182752ae64e\") " pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.602272 4739 generic.go:334] "Generic (PLEG): container finished" podID="7fc5c94d-b325-4b06-bbce-ed6f0792fc7b" containerID="6c6bb0b07e755ed4f54382b9ae9c8a591b61a033a4206ab3ffd2585c23338705" exitCode=0 Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.602335 4739 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" event={"ID":"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b","Type":"ContainerDied","Data":"6c6bb0b07e755ed4f54382b9ae9c8a591b61a033a4206ab3ffd2585c23338705"} Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.602370 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.602383 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7fzvv" event={"ID":"7fc5c94d-b325-4b06-bbce-ed6f0792fc7b","Type":"ContainerDied","Data":"a9a48406ac5526bc9d6bbabe675fbaa0ac6f953a6d7b594e9711eb376ddd6fd1"} Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.602411 4739 scope.go:117] "RemoveContainer" containerID="6c6bb0b07e755ed4f54382b9ae9c8a591b61a033a4206ab3ffd2585c23338705" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.619597 4739 scope.go:117] "RemoveContainer" containerID="6c6bb0b07e755ed4f54382b9ae9c8a591b61a033a4206ab3ffd2585c23338705" Oct 08 21:53:08 crc kubenswrapper[4739]: E1008 21:53:08.620156 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c6bb0b07e755ed4f54382b9ae9c8a591b61a033a4206ab3ffd2585c23338705\": container with ID starting with 6c6bb0b07e755ed4f54382b9ae9c8a591b61a033a4206ab3ffd2585c23338705 not found: ID does not exist" containerID="6c6bb0b07e755ed4f54382b9ae9c8a591b61a033a4206ab3ffd2585c23338705" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.620269 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c6bb0b07e755ed4f54382b9ae9c8a591b61a033a4206ab3ffd2585c23338705"} err="failed to get container status \"6c6bb0b07e755ed4f54382b9ae9c8a591b61a033a4206ab3ffd2585c23338705\": rpc error: code = NotFound desc = could not find container 
\"6c6bb0b07e755ed4f54382b9ae9c8a591b61a033a4206ab3ffd2585c23338705\": container with ID starting with 6c6bb0b07e755ed4f54382b9ae9c8a591b61a033a4206ab3ffd2585c23338705 not found: ID does not exist" Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.641242 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7fzvv"] Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.645937 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7fzvv"] Oct 08 21:53:08 crc kubenswrapper[4739]: I1008 21:53:08.705007 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:09 crc kubenswrapper[4739]: I1008 21:53:09.179273 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z"] Oct 08 21:53:09 crc kubenswrapper[4739]: I1008 21:53:09.608475 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" event={"ID":"833bc792-970c-4def-90ca-3182752ae64e","Type":"ContainerStarted","Data":"c1ea229f47a0888e0aeac4369b4a847f9d9e4b7c73ce54233c68d9b9c00eee44"} Oct 08 21:53:09 crc kubenswrapper[4739]: I1008 21:53:09.608739 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:09 crc kubenswrapper[4739]: I1008 21:53:09.608748 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" event={"ID":"833bc792-970c-4def-90ca-3182752ae64e","Type":"ContainerStarted","Data":"7e6752779c791a63c4f617c26d8d30992076f55fb413d2b7839e64d72ee244bd"} Oct 08 21:53:09 crc kubenswrapper[4739]: I1008 21:53:09.634959 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" podStartSLOduration=27.634939546 podStartE2EDuration="27.634939546s" podCreationTimestamp="2025-10-08 21:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:53:09.632050672 +0000 UTC m=+289.457436462" watchObservedRunningTime="2025-10-08 21:53:09.634939546 +0000 UTC m=+289.460325286" Oct 08 21:53:09 crc kubenswrapper[4739]: I1008 21:53:09.828819 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fc5c94d-b325-4b06-bbce-ed6f0792fc7b" path="/var/lib/kubelet/pods/7fc5c94d-b325-4b06-bbce-ed6f0792fc7b/volumes" Oct 08 21:53:10 crc kubenswrapper[4739]: I1008 21:53:10.033788 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-764f9b7cd5-gmc2z" Oct 08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.461001 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-867zz"] Oct 08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.461967 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-867zz" podUID="e75f5e41-357c-47c7-b000-1103fd2b9756" containerName="registry-server" containerID="cri-o://58e0915cc217848b9a6d3f5c0f841a7edf4c0f822ad4c3f14867b0d5251c2fea" gracePeriod=30 Oct 08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.474374 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lnwlb"] Oct 08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.474981 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lnwlb" podUID="5ac894a1-e4ba-4a6b-8ac2-3a24ef886834" containerName="registry-server" containerID="cri-o://d8987c5cbb35ecf8ea57f8e7dcb040306f4f33892e952927d56468f7cf50e8db" gracePeriod=30 Oct 
08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.483671 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qpcfx"] Oct 08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.484052 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-qpcfx" podUID="34b38b7a-4e93-49f1-907e-24fc371f31e3" containerName="marketplace-operator" containerID="cri-o://f814ffedd05fab2b0a3f0c07cf59108608700a3fabc88a1dd68f3f7e1d644c6c" gracePeriod=30 Oct 08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.495806 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sjq82"] Oct 08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.496254 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sjq82" podUID="deed15fa-28ce-4057-a1d7-5f920fa4751b" containerName="registry-server" containerID="cri-o://eb0223025c9c58156ce938ddabf3faae8bb338e002cbc2fb553e5cc3152e4165" gracePeriod=30 Oct 08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.501265 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xcwbd"] Oct 08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.501844 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xcwbd" podUID="13bcd021-6f45-4843-9255-92ee1ad4e031" containerName="registry-server" containerID="cri-o://c29b009f1ba4fd2041a35175cd50be9e52d82dd03fd723d0a2b776c7f8770d1e" gracePeriod=30 Oct 08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.507410 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-x54q2"] Oct 08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.508390 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-x54q2" Oct 08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.512452 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-x54q2"] Oct 08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.632178 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6122ee5d-8e01-4fb7-b6bf-fc6d4ebcf669-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-x54q2\" (UID: \"6122ee5d-8e01-4fb7-b6bf-fc6d4ebcf669\") " pod="openshift-marketplace/marketplace-operator-79b997595-x54q2" Oct 08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.632489 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6122ee5d-8e01-4fb7-b6bf-fc6d4ebcf669-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-x54q2\" (UID: \"6122ee5d-8e01-4fb7-b6bf-fc6d4ebcf669\") " pod="openshift-marketplace/marketplace-operator-79b997595-x54q2" Oct 08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.632617 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfjgc\" (UniqueName: \"kubernetes.io/projected/6122ee5d-8e01-4fb7-b6bf-fc6d4ebcf669-kube-api-access-sfjgc\") pod \"marketplace-operator-79b997595-x54q2\" (UID: \"6122ee5d-8e01-4fb7-b6bf-fc6d4ebcf669\") " pod="openshift-marketplace/marketplace-operator-79b997595-x54q2" Oct 08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.690668 4739 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qpcfx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Oct 08 
21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.690889 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qpcfx" podUID="34b38b7a-4e93-49f1-907e-24fc371f31e3" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" Oct 08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.733827 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfjgc\" (UniqueName: \"kubernetes.io/projected/6122ee5d-8e01-4fb7-b6bf-fc6d4ebcf669-kube-api-access-sfjgc\") pod \"marketplace-operator-79b997595-x54q2\" (UID: \"6122ee5d-8e01-4fb7-b6bf-fc6d4ebcf669\") " pod="openshift-marketplace/marketplace-operator-79b997595-x54q2" Oct 08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.735741 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6122ee5d-8e01-4fb7-b6bf-fc6d4ebcf669-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-x54q2\" (UID: \"6122ee5d-8e01-4fb7-b6bf-fc6d4ebcf669\") " pod="openshift-marketplace/marketplace-operator-79b997595-x54q2" Oct 08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.736139 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6122ee5d-8e01-4fb7-b6bf-fc6d4ebcf669-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-x54q2\" (UID: \"6122ee5d-8e01-4fb7-b6bf-fc6d4ebcf669\") " pod="openshift-marketplace/marketplace-operator-79b997595-x54q2" Oct 08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.740577 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6122ee5d-8e01-4fb7-b6bf-fc6d4ebcf669-marketplace-trusted-ca\") pod 
\"marketplace-operator-79b997595-x54q2\" (UID: \"6122ee5d-8e01-4fb7-b6bf-fc6d4ebcf669\") " pod="openshift-marketplace/marketplace-operator-79b997595-x54q2" Oct 08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.742547 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6122ee5d-8e01-4fb7-b6bf-fc6d4ebcf669-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-x54q2\" (UID: \"6122ee5d-8e01-4fb7-b6bf-fc6d4ebcf669\") " pod="openshift-marketplace/marketplace-operator-79b997595-x54q2" Oct 08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.753428 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfjgc\" (UniqueName: \"kubernetes.io/projected/6122ee5d-8e01-4fb7-b6bf-fc6d4ebcf669-kube-api-access-sfjgc\") pod \"marketplace-operator-79b997595-x54q2\" (UID: \"6122ee5d-8e01-4fb7-b6bf-fc6d4ebcf669\") " pod="openshift-marketplace/marketplace-operator-79b997595-x54q2" Oct 08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.817221 4739 generic.go:334] "Generic (PLEG): container finished" podID="deed15fa-28ce-4057-a1d7-5f920fa4751b" containerID="eb0223025c9c58156ce938ddabf3faae8bb338e002cbc2fb553e5cc3152e4165" exitCode=0 Oct 08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.817288 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sjq82" event={"ID":"deed15fa-28ce-4057-a1d7-5f920fa4751b","Type":"ContainerDied","Data":"eb0223025c9c58156ce938ddabf3faae8bb338e002cbc2fb553e5cc3152e4165"} Oct 08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.820021 4739 generic.go:334] "Generic (PLEG): container finished" podID="e75f5e41-357c-47c7-b000-1103fd2b9756" containerID="58e0915cc217848b9a6d3f5c0f841a7edf4c0f822ad4c3f14867b0d5251c2fea" exitCode=0 Oct 08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.820155 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-867zz" event={"ID":"e75f5e41-357c-47c7-b000-1103fd2b9756","Type":"ContainerDied","Data":"58e0915cc217848b9a6d3f5c0f841a7edf4c0f822ad4c3f14867b0d5251c2fea"} Oct 08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.823848 4739 generic.go:334] "Generic (PLEG): container finished" podID="13bcd021-6f45-4843-9255-92ee1ad4e031" containerID="c29b009f1ba4fd2041a35175cd50be9e52d82dd03fd723d0a2b776c7f8770d1e" exitCode=0 Oct 08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.823952 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xcwbd" event={"ID":"13bcd021-6f45-4843-9255-92ee1ad4e031","Type":"ContainerDied","Data":"c29b009f1ba4fd2041a35175cd50be9e52d82dd03fd723d0a2b776c7f8770d1e"} Oct 08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.825459 4739 generic.go:334] "Generic (PLEG): container finished" podID="34b38b7a-4e93-49f1-907e-24fc371f31e3" containerID="f814ffedd05fab2b0a3f0c07cf59108608700a3fabc88a1dd68f3f7e1d644c6c" exitCode=0 Oct 08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.825500 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qpcfx" event={"ID":"34b38b7a-4e93-49f1-907e-24fc371f31e3","Type":"ContainerDied","Data":"f814ffedd05fab2b0a3f0c07cf59108608700a3fabc88a1dd68f3f7e1d644c6c"} Oct 08 21:53:42 crc kubenswrapper[4739]: I1008 21:53:42.842444 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-x54q2" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.018607 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-x54q2"] Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.392494 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-867zz" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.453596 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qpcfx" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.460738 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xcwbd" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.520479 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sjq82" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.547021 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6twt4\" (UniqueName: \"kubernetes.io/projected/e75f5e41-357c-47c7-b000-1103fd2b9756-kube-api-access-6twt4\") pod \"e75f5e41-357c-47c7-b000-1103fd2b9756\" (UID: \"e75f5e41-357c-47c7-b000-1103fd2b9756\") " Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.547097 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e75f5e41-357c-47c7-b000-1103fd2b9756-catalog-content\") pod \"e75f5e41-357c-47c7-b000-1103fd2b9756\" (UID: \"e75f5e41-357c-47c7-b000-1103fd2b9756\") " Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.547181 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e75f5e41-357c-47c7-b000-1103fd2b9756-utilities\") pod \"e75f5e41-357c-47c7-b000-1103fd2b9756\" (UID: \"e75f5e41-357c-47c7-b000-1103fd2b9756\") " Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.548855 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e75f5e41-357c-47c7-b000-1103fd2b9756-utilities" (OuterVolumeSpecName: "utilities") 
pod "e75f5e41-357c-47c7-b000-1103fd2b9756" (UID: "e75f5e41-357c-47c7-b000-1103fd2b9756"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.552250 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e75f5e41-357c-47c7-b000-1103fd2b9756-kube-api-access-6twt4" (OuterVolumeSpecName: "kube-api-access-6twt4") pod "e75f5e41-357c-47c7-b000-1103fd2b9756" (UID: "e75f5e41-357c-47c7-b000-1103fd2b9756"). InnerVolumeSpecName "kube-api-access-6twt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.594239 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e75f5e41-357c-47c7-b000-1103fd2b9756-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e75f5e41-357c-47c7-b000-1103fd2b9756" (UID: "e75f5e41-357c-47c7-b000-1103fd2b9756"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.647775 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwzb6\" (UniqueName: \"kubernetes.io/projected/34b38b7a-4e93-49f1-907e-24fc371f31e3-kube-api-access-xwzb6\") pod \"34b38b7a-4e93-49f1-907e-24fc371f31e3\" (UID: \"34b38b7a-4e93-49f1-907e-24fc371f31e3\") " Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.647833 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13bcd021-6f45-4843-9255-92ee1ad4e031-utilities\") pod \"13bcd021-6f45-4843-9255-92ee1ad4e031\" (UID: \"13bcd021-6f45-4843-9255-92ee1ad4e031\") " Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.647918 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13bcd021-6f45-4843-9255-92ee1ad4e031-catalog-content\") pod \"13bcd021-6f45-4843-9255-92ee1ad4e031\" (UID: \"13bcd021-6f45-4843-9255-92ee1ad4e031\") " Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.647954 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5gb9\" (UniqueName: \"kubernetes.io/projected/deed15fa-28ce-4057-a1d7-5f920fa4751b-kube-api-access-k5gb9\") pod \"deed15fa-28ce-4057-a1d7-5f920fa4751b\" (UID: \"deed15fa-28ce-4057-a1d7-5f920fa4751b\") " Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.648016 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwhsj\" (UniqueName: \"kubernetes.io/projected/13bcd021-6f45-4843-9255-92ee1ad4e031-kube-api-access-bwhsj\") pod \"13bcd021-6f45-4843-9255-92ee1ad4e031\" (UID: \"13bcd021-6f45-4843-9255-92ee1ad4e031\") " Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.648046 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/34b38b7a-4e93-49f1-907e-24fc371f31e3-marketplace-operator-metrics\") pod \"34b38b7a-4e93-49f1-907e-24fc371f31e3\" (UID: \"34b38b7a-4e93-49f1-907e-24fc371f31e3\") " Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.648072 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34b38b7a-4e93-49f1-907e-24fc371f31e3-marketplace-trusted-ca\") pod \"34b38b7a-4e93-49f1-907e-24fc371f31e3\" (UID: \"34b38b7a-4e93-49f1-907e-24fc371f31e3\") " Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.648095 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deed15fa-28ce-4057-a1d7-5f920fa4751b-utilities\") pod \"deed15fa-28ce-4057-a1d7-5f920fa4751b\" (UID: \"deed15fa-28ce-4057-a1d7-5f920fa4751b\") " Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.648758 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34b38b7a-4e93-49f1-907e-24fc371f31e3-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "34b38b7a-4e93-49f1-907e-24fc371f31e3" (UID: "34b38b7a-4e93-49f1-907e-24fc371f31e3"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.648776 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deed15fa-28ce-4057-a1d7-5f920fa4751b-catalog-content\") pod \"deed15fa-28ce-4057-a1d7-5f920fa4751b\" (UID: \"deed15fa-28ce-4057-a1d7-5f920fa4751b\") " Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.649411 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e75f5e41-357c-47c7-b000-1103fd2b9756-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.649429 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6twt4\" (UniqueName: \"kubernetes.io/projected/e75f5e41-357c-47c7-b000-1103fd2b9756-kube-api-access-6twt4\") on node \"crc\" DevicePath \"\"" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.649440 4739 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34b38b7a-4e93-49f1-907e-24fc371f31e3-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.649451 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e75f5e41-357c-47c7-b000-1103fd2b9756-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.649735 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13bcd021-6f45-4843-9255-92ee1ad4e031-utilities" (OuterVolumeSpecName: "utilities") pod "13bcd021-6f45-4843-9255-92ee1ad4e031" (UID: "13bcd021-6f45-4843-9255-92ee1ad4e031"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.650525 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lnwlb" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.652521 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deed15fa-28ce-4057-a1d7-5f920fa4751b-utilities" (OuterVolumeSpecName: "utilities") pod "deed15fa-28ce-4057-a1d7-5f920fa4751b" (UID: "deed15fa-28ce-4057-a1d7-5f920fa4751b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.652774 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34b38b7a-4e93-49f1-907e-24fc371f31e3-kube-api-access-xwzb6" (OuterVolumeSpecName: "kube-api-access-xwzb6") pod "34b38b7a-4e93-49f1-907e-24fc371f31e3" (UID: "34b38b7a-4e93-49f1-907e-24fc371f31e3"). InnerVolumeSpecName "kube-api-access-xwzb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.653205 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13bcd021-6f45-4843-9255-92ee1ad4e031-kube-api-access-bwhsj" (OuterVolumeSpecName: "kube-api-access-bwhsj") pod "13bcd021-6f45-4843-9255-92ee1ad4e031" (UID: "13bcd021-6f45-4843-9255-92ee1ad4e031"). InnerVolumeSpecName "kube-api-access-bwhsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.653505 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34b38b7a-4e93-49f1-907e-24fc371f31e3-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "34b38b7a-4e93-49f1-907e-24fc371f31e3" (UID: "34b38b7a-4e93-49f1-907e-24fc371f31e3"). 
InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.653911 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deed15fa-28ce-4057-a1d7-5f920fa4751b-kube-api-access-k5gb9" (OuterVolumeSpecName: "kube-api-access-k5gb9") pod "deed15fa-28ce-4057-a1d7-5f920fa4751b" (UID: "deed15fa-28ce-4057-a1d7-5f920fa4751b"). InnerVolumeSpecName "kube-api-access-k5gb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.667095 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deed15fa-28ce-4057-a1d7-5f920fa4751b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "deed15fa-28ce-4057-a1d7-5f920fa4751b" (UID: "deed15fa-28ce-4057-a1d7-5f920fa4751b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.748313 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13bcd021-6f45-4843-9255-92ee1ad4e031-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "13bcd021-6f45-4843-9255-92ee1ad4e031" (UID: "13bcd021-6f45-4843-9255-92ee1ad4e031"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.751688 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5gb9\" (UniqueName: \"kubernetes.io/projected/deed15fa-28ce-4057-a1d7-5f920fa4751b-kube-api-access-k5gb9\") on node \"crc\" DevicePath \"\"" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.751743 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwhsj\" (UniqueName: \"kubernetes.io/projected/13bcd021-6f45-4843-9255-92ee1ad4e031-kube-api-access-bwhsj\") on node \"crc\" DevicePath \"\"" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.751763 4739 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/34b38b7a-4e93-49f1-907e-24fc371f31e3-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.751780 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deed15fa-28ce-4057-a1d7-5f920fa4751b-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.751800 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deed15fa-28ce-4057-a1d7-5f920fa4751b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.751815 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwzb6\" (UniqueName: \"kubernetes.io/projected/34b38b7a-4e93-49f1-907e-24fc371f31e3-kube-api-access-xwzb6\") on node \"crc\" DevicePath \"\"" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.751832 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13bcd021-6f45-4843-9255-92ee1ad4e031-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 21:53:43 
crc kubenswrapper[4739]: I1008 21:53:43.751847 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13bcd021-6f45-4843-9255-92ee1ad4e031-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.833355 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sjq82" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.838166 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-867zz" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.839634 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-x54q2" event={"ID":"6122ee5d-8e01-4fb7-b6bf-fc6d4ebcf669","Type":"ContainerStarted","Data":"d8416b14c3271b15f8784f1a03e1c2dae105d70d8bf3619fbce502dd576d895f"} Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.839671 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-x54q2" event={"ID":"6122ee5d-8e01-4fb7-b6bf-fc6d4ebcf669","Type":"ContainerStarted","Data":"6e0d3286d6ec437c75c445e4bbc6b2255df3a68d0901ed85d327f8efcfe4231e"} Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.839681 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sjq82" event={"ID":"deed15fa-28ce-4057-a1d7-5f920fa4751b","Type":"ContainerDied","Data":"226ea1519865dc1be152e53037f42584ce1d3a2a236a8d147c4abcaa957d091c"} Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.839722 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-x54q2" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.839736 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-867zz" event={"ID":"e75f5e41-357c-47c7-b000-1103fd2b9756","Type":"ContainerDied","Data":"6cf51f85c9f25f4c7acdb032e5ba92562efb8dcb697c40c2ea0225146a0900fc"} Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.839757 4739 scope.go:117] "RemoveContainer" containerID="eb0223025c9c58156ce938ddabf3faae8bb338e002cbc2fb553e5cc3152e4165" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.841592 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xcwbd" event={"ID":"13bcd021-6f45-4843-9255-92ee1ad4e031","Type":"ContainerDied","Data":"eec4a15ad77f8ec425bac82a39a64217fc2b248b3c773ec02d0431d1ed0d8e76"} Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.841673 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xcwbd" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.844549 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-x54q2" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.848614 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qpcfx" event={"ID":"34b38b7a-4e93-49f1-907e-24fc371f31e3","Type":"ContainerDied","Data":"a9b65154ff43aee50ad40a1d63adf2dd0af8e79c608e28337502eae70d346afc"} Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.848710 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qpcfx" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.853928 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ac894a1-e4ba-4a6b-8ac2-3a24ef886834-utilities\") pod \"5ac894a1-e4ba-4a6b-8ac2-3a24ef886834\" (UID: \"5ac894a1-e4ba-4a6b-8ac2-3a24ef886834\") " Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.854044 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bndgf\" (UniqueName: \"kubernetes.io/projected/5ac894a1-e4ba-4a6b-8ac2-3a24ef886834-kube-api-access-bndgf\") pod \"5ac894a1-e4ba-4a6b-8ac2-3a24ef886834\" (UID: \"5ac894a1-e4ba-4a6b-8ac2-3a24ef886834\") " Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.854380 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ac894a1-e4ba-4a6b-8ac2-3a24ef886834-catalog-content\") pod \"5ac894a1-e4ba-4a6b-8ac2-3a24ef886834\" (UID: \"5ac894a1-e4ba-4a6b-8ac2-3a24ef886834\") " Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.854746 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ac894a1-e4ba-4a6b-8ac2-3a24ef886834-utilities" (OuterVolumeSpecName: "utilities") pod "5ac894a1-e4ba-4a6b-8ac2-3a24ef886834" (UID: "5ac894a1-e4ba-4a6b-8ac2-3a24ef886834"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.855064 4739 scope.go:117] "RemoveContainer" containerID="140d65d1baa23de5294cd0714893706b98b9e774559fe3a039860cf6b1085972" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.855617 4739 generic.go:334] "Generic (PLEG): container finished" podID="5ac894a1-e4ba-4a6b-8ac2-3a24ef886834" containerID="d8987c5cbb35ecf8ea57f8e7dcb040306f4f33892e952927d56468f7cf50e8db" exitCode=0 Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.855662 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnwlb" event={"ID":"5ac894a1-e4ba-4a6b-8ac2-3a24ef886834","Type":"ContainerDied","Data":"d8987c5cbb35ecf8ea57f8e7dcb040306f4f33892e952927d56468f7cf50e8db"} Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.855688 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnwlb" event={"ID":"5ac894a1-e4ba-4a6b-8ac2-3a24ef886834","Type":"ContainerDied","Data":"f18b27585b4699f1ea0a9d2bf803715b4bb5ba4fa40f8feb9c5034c8bd045a79"} Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.855888 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lnwlb" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.880952 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ac894a1-e4ba-4a6b-8ac2-3a24ef886834-kube-api-access-bndgf" (OuterVolumeSpecName: "kube-api-access-bndgf") pod "5ac894a1-e4ba-4a6b-8ac2-3a24ef886834" (UID: "5ac894a1-e4ba-4a6b-8ac2-3a24ef886834"). InnerVolumeSpecName "kube-api-access-bndgf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.886745 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-x54q2" podStartSLOduration=1.8867163009999999 podStartE2EDuration="1.886716301s" podCreationTimestamp="2025-10-08 21:53:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:53:43.8535372 +0000 UTC m=+323.678922950" watchObservedRunningTime="2025-10-08 21:53:43.886716301 +0000 UTC m=+323.712102051" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.899828 4739 scope.go:117] "RemoveContainer" containerID="4471660f8cca02b31a670d596bd2e3275eb5dbdac0e806d97548ec7fd6755e55" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.918377 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sjq82"] Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.919688 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sjq82"] Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.931247 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xcwbd"] Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.933701 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xcwbd"] Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.942084 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-867zz"] Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.946565 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-867zz"] Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.947551 4739 scope.go:117] "RemoveContainer" 
containerID="58e0915cc217848b9a6d3f5c0f841a7edf4c0f822ad4c3f14867b0d5251c2fea" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.950974 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ac894a1-e4ba-4a6b-8ac2-3a24ef886834-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ac894a1-e4ba-4a6b-8ac2-3a24ef886834" (UID: "5ac894a1-e4ba-4a6b-8ac2-3a24ef886834"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.955942 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bndgf\" (UniqueName: \"kubernetes.io/projected/5ac894a1-e4ba-4a6b-8ac2-3a24ef886834-kube-api-access-bndgf\") on node \"crc\" DevicePath \"\"" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.955976 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ac894a1-e4ba-4a6b-8ac2-3a24ef886834-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.955987 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ac894a1-e4ba-4a6b-8ac2-3a24ef886834-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.957744 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qpcfx"] Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.960117 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qpcfx"] Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.963414 4739 scope.go:117] "RemoveContainer" containerID="81daf90ab308e11f2098a2170deff8de3b2e80f4020c22ad181557e36f1c0c29" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.976341 4739 scope.go:117] "RemoveContainer" 
containerID="5b23ffdeed6fd2d1472b5e8ce2ba74a283f94caf2ecbe03ccdb72d816dceb75c" Oct 08 21:53:43 crc kubenswrapper[4739]: I1008 21:53:43.991595 4739 scope.go:117] "RemoveContainer" containerID="c29b009f1ba4fd2041a35175cd50be9e52d82dd03fd723d0a2b776c7f8770d1e" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.004488 4739 scope.go:117] "RemoveContainer" containerID="7dd972ba52238fd4767461789b7850de61261dfa34f682f32103c396bd6c7c3c" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.017476 4739 scope.go:117] "RemoveContainer" containerID="bb781cf719e79b4dda0d742b91c634111e7fd1cbd74e9b8a07281b637c18b937" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.032542 4739 scope.go:117] "RemoveContainer" containerID="f814ffedd05fab2b0a3f0c07cf59108608700a3fabc88a1dd68f3f7e1d644c6c" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.046804 4739 scope.go:117] "RemoveContainer" containerID="d8987c5cbb35ecf8ea57f8e7dcb040306f4f33892e952927d56468f7cf50e8db" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.061434 4739 scope.go:117] "RemoveContainer" containerID="34bd002005991dd1b40ecf8736e89c5401d4c2fd1cca6a22d5639ac6cebc808c" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.075026 4739 scope.go:117] "RemoveContainer" containerID="d163334af05a06761b41087d0b32e78d8fb3b8d49e562eeb43792ec03897efd5" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.088467 4739 scope.go:117] "RemoveContainer" containerID="d8987c5cbb35ecf8ea57f8e7dcb040306f4f33892e952927d56468f7cf50e8db" Oct 08 21:53:44 crc kubenswrapper[4739]: E1008 21:53:44.089003 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8987c5cbb35ecf8ea57f8e7dcb040306f4f33892e952927d56468f7cf50e8db\": container with ID starting with d8987c5cbb35ecf8ea57f8e7dcb040306f4f33892e952927d56468f7cf50e8db not found: ID does not exist" containerID="d8987c5cbb35ecf8ea57f8e7dcb040306f4f33892e952927d56468f7cf50e8db" Oct 08 21:53:44 crc 
kubenswrapper[4739]: I1008 21:53:44.089085 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8987c5cbb35ecf8ea57f8e7dcb040306f4f33892e952927d56468f7cf50e8db"} err="failed to get container status \"d8987c5cbb35ecf8ea57f8e7dcb040306f4f33892e952927d56468f7cf50e8db\": rpc error: code = NotFound desc = could not find container \"d8987c5cbb35ecf8ea57f8e7dcb040306f4f33892e952927d56468f7cf50e8db\": container with ID starting with d8987c5cbb35ecf8ea57f8e7dcb040306f4f33892e952927d56468f7cf50e8db not found: ID does not exist" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.089159 4739 scope.go:117] "RemoveContainer" containerID="34bd002005991dd1b40ecf8736e89c5401d4c2fd1cca6a22d5639ac6cebc808c" Oct 08 21:53:44 crc kubenswrapper[4739]: E1008 21:53:44.089719 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34bd002005991dd1b40ecf8736e89c5401d4c2fd1cca6a22d5639ac6cebc808c\": container with ID starting with 34bd002005991dd1b40ecf8736e89c5401d4c2fd1cca6a22d5639ac6cebc808c not found: ID does not exist" containerID="34bd002005991dd1b40ecf8736e89c5401d4c2fd1cca6a22d5639ac6cebc808c" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.089763 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34bd002005991dd1b40ecf8736e89c5401d4c2fd1cca6a22d5639ac6cebc808c"} err="failed to get container status \"34bd002005991dd1b40ecf8736e89c5401d4c2fd1cca6a22d5639ac6cebc808c\": rpc error: code = NotFound desc = could not find container \"34bd002005991dd1b40ecf8736e89c5401d4c2fd1cca6a22d5639ac6cebc808c\": container with ID starting with 34bd002005991dd1b40ecf8736e89c5401d4c2fd1cca6a22d5639ac6cebc808c not found: ID does not exist" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.089801 4739 scope.go:117] "RemoveContainer" containerID="d163334af05a06761b41087d0b32e78d8fb3b8d49e562eeb43792ec03897efd5" Oct 08 
21:53:44 crc kubenswrapper[4739]: E1008 21:53:44.090097 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d163334af05a06761b41087d0b32e78d8fb3b8d49e562eeb43792ec03897efd5\": container with ID starting with d163334af05a06761b41087d0b32e78d8fb3b8d49e562eeb43792ec03897efd5 not found: ID does not exist" containerID="d163334af05a06761b41087d0b32e78d8fb3b8d49e562eeb43792ec03897efd5" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.090127 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d163334af05a06761b41087d0b32e78d8fb3b8d49e562eeb43792ec03897efd5"} err="failed to get container status \"d163334af05a06761b41087d0b32e78d8fb3b8d49e562eeb43792ec03897efd5\": rpc error: code = NotFound desc = could not find container \"d163334af05a06761b41087d0b32e78d8fb3b8d49e562eeb43792ec03897efd5\": container with ID starting with d163334af05a06761b41087d0b32e78d8fb3b8d49e562eeb43792ec03897efd5 not found: ID does not exist" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.188788 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lnwlb"] Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.191992 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lnwlb"] Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.667667 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lbrj8"] Oct 08 21:53:44 crc kubenswrapper[4739]: E1008 21:53:44.667888 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34b38b7a-4e93-49f1-907e-24fc371f31e3" containerName="marketplace-operator" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.667903 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="34b38b7a-4e93-49f1-907e-24fc371f31e3" containerName="marketplace-operator" Oct 08 21:53:44 crc 
kubenswrapper[4739]: E1008 21:53:44.667913 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e75f5e41-357c-47c7-b000-1103fd2b9756" containerName="extract-utilities" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.667920 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="e75f5e41-357c-47c7-b000-1103fd2b9756" containerName="extract-utilities" Oct 08 21:53:44 crc kubenswrapper[4739]: E1008 21:53:44.667934 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13bcd021-6f45-4843-9255-92ee1ad4e031" containerName="registry-server" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.667943 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="13bcd021-6f45-4843-9255-92ee1ad4e031" containerName="registry-server" Oct 08 21:53:44 crc kubenswrapper[4739]: E1008 21:53:44.667953 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e75f5e41-357c-47c7-b000-1103fd2b9756" containerName="registry-server" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.667960 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="e75f5e41-357c-47c7-b000-1103fd2b9756" containerName="registry-server" Oct 08 21:53:44 crc kubenswrapper[4739]: E1008 21:53:44.667973 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deed15fa-28ce-4057-a1d7-5f920fa4751b" containerName="extract-content" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.667980 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="deed15fa-28ce-4057-a1d7-5f920fa4751b" containerName="extract-content" Oct 08 21:53:44 crc kubenswrapper[4739]: E1008 21:53:44.667992 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ac894a1-e4ba-4a6b-8ac2-3a24ef886834" containerName="registry-server" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.667998 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ac894a1-e4ba-4a6b-8ac2-3a24ef886834" containerName="registry-server" Oct 08 21:53:44 crc 
kubenswrapper[4739]: E1008 21:53:44.668008 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ac894a1-e4ba-4a6b-8ac2-3a24ef886834" containerName="extract-utilities" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.668014 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ac894a1-e4ba-4a6b-8ac2-3a24ef886834" containerName="extract-utilities" Oct 08 21:53:44 crc kubenswrapper[4739]: E1008 21:53:44.668023 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ac894a1-e4ba-4a6b-8ac2-3a24ef886834" containerName="extract-content" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.668030 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ac894a1-e4ba-4a6b-8ac2-3a24ef886834" containerName="extract-content" Oct 08 21:53:44 crc kubenswrapper[4739]: E1008 21:53:44.668038 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13bcd021-6f45-4843-9255-92ee1ad4e031" containerName="extract-content" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.668045 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="13bcd021-6f45-4843-9255-92ee1ad4e031" containerName="extract-content" Oct 08 21:53:44 crc kubenswrapper[4739]: E1008 21:53:44.668054 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e75f5e41-357c-47c7-b000-1103fd2b9756" containerName="extract-content" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.668063 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="e75f5e41-357c-47c7-b000-1103fd2b9756" containerName="extract-content" Oct 08 21:53:44 crc kubenswrapper[4739]: E1008 21:53:44.668073 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13bcd021-6f45-4843-9255-92ee1ad4e031" containerName="extract-utilities" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.668082 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="13bcd021-6f45-4843-9255-92ee1ad4e031" containerName="extract-utilities" Oct 08 21:53:44 crc 
kubenswrapper[4739]: E1008 21:53:44.668092 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deed15fa-28ce-4057-a1d7-5f920fa4751b" containerName="extract-utilities" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.668098 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="deed15fa-28ce-4057-a1d7-5f920fa4751b" containerName="extract-utilities" Oct 08 21:53:44 crc kubenswrapper[4739]: E1008 21:53:44.668108 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deed15fa-28ce-4057-a1d7-5f920fa4751b" containerName="registry-server" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.668115 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="deed15fa-28ce-4057-a1d7-5f920fa4751b" containerName="registry-server" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.668243 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="deed15fa-28ce-4057-a1d7-5f920fa4751b" containerName="registry-server" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.668260 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="e75f5e41-357c-47c7-b000-1103fd2b9756" containerName="registry-server" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.668271 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="34b38b7a-4e93-49f1-907e-24fc371f31e3" containerName="marketplace-operator" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.668280 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ac894a1-e4ba-4a6b-8ac2-3a24ef886834" containerName="registry-server" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.668289 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="13bcd021-6f45-4843-9255-92ee1ad4e031" containerName="registry-server" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.669105 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lbrj8" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.673018 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.681392 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbrj8"] Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.859825 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jhjjp"] Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.867094 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0921b50a-3ca2-4f07-a060-63d6078eac48-catalog-content\") pod \"redhat-marketplace-lbrj8\" (UID: \"0921b50a-3ca2-4f07-a060-63d6078eac48\") " pod="openshift-marketplace/redhat-marketplace-lbrj8" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.867349 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0921b50a-3ca2-4f07-a060-63d6078eac48-utilities\") pod \"redhat-marketplace-lbrj8\" (UID: \"0921b50a-3ca2-4f07-a060-63d6078eac48\") " pod="openshift-marketplace/redhat-marketplace-lbrj8" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.867492 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r74wd\" (UniqueName: \"kubernetes.io/projected/0921b50a-3ca2-4f07-a060-63d6078eac48-kube-api-access-r74wd\") pod \"redhat-marketplace-lbrj8\" (UID: \"0921b50a-3ca2-4f07-a060-63d6078eac48\") " pod="openshift-marketplace/redhat-marketplace-lbrj8" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.906916 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jhjjp" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.909240 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.910985 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jhjjp"] Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.968435 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0921b50a-3ca2-4f07-a060-63d6078eac48-utilities\") pod \"redhat-marketplace-lbrj8\" (UID: \"0921b50a-3ca2-4f07-a060-63d6078eac48\") " pod="openshift-marketplace/redhat-marketplace-lbrj8" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.968504 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r74wd\" (UniqueName: \"kubernetes.io/projected/0921b50a-3ca2-4f07-a060-63d6078eac48-kube-api-access-r74wd\") pod \"redhat-marketplace-lbrj8\" (UID: \"0921b50a-3ca2-4f07-a060-63d6078eac48\") " pod="openshift-marketplace/redhat-marketplace-lbrj8" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.968534 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0921b50a-3ca2-4f07-a060-63d6078eac48-catalog-content\") pod \"redhat-marketplace-lbrj8\" (UID: \"0921b50a-3ca2-4f07-a060-63d6078eac48\") " pod="openshift-marketplace/redhat-marketplace-lbrj8" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.968962 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0921b50a-3ca2-4f07-a060-63d6078eac48-utilities\") pod \"redhat-marketplace-lbrj8\" (UID: \"0921b50a-3ca2-4f07-a060-63d6078eac48\") " pod="openshift-marketplace/redhat-marketplace-lbrj8" Oct 
08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.968981 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0921b50a-3ca2-4f07-a060-63d6078eac48-catalog-content\") pod \"redhat-marketplace-lbrj8\" (UID: \"0921b50a-3ca2-4f07-a060-63d6078eac48\") " pod="openshift-marketplace/redhat-marketplace-lbrj8" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.984866 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r74wd\" (UniqueName: \"kubernetes.io/projected/0921b50a-3ca2-4f07-a060-63d6078eac48-kube-api-access-r74wd\") pod \"redhat-marketplace-lbrj8\" (UID: \"0921b50a-3ca2-4f07-a060-63d6078eac48\") " pod="openshift-marketplace/redhat-marketplace-lbrj8" Oct 08 21:53:44 crc kubenswrapper[4739]: I1008 21:53:44.994872 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lbrj8" Oct 08 21:53:45 crc kubenswrapper[4739]: I1008 21:53:45.069081 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4a5ca24-fef9-4cca-99c2-eb2c255ee795-catalog-content\") pod \"redhat-operators-jhjjp\" (UID: \"e4a5ca24-fef9-4cca-99c2-eb2c255ee795\") " pod="openshift-marketplace/redhat-operators-jhjjp" Oct 08 21:53:45 crc kubenswrapper[4739]: I1008 21:53:45.069245 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wx88\" (UniqueName: \"kubernetes.io/projected/e4a5ca24-fef9-4cca-99c2-eb2c255ee795-kube-api-access-5wx88\") pod \"redhat-operators-jhjjp\" (UID: \"e4a5ca24-fef9-4cca-99c2-eb2c255ee795\") " pod="openshift-marketplace/redhat-operators-jhjjp" Oct 08 21:53:45 crc kubenswrapper[4739]: I1008 21:53:45.069319 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e4a5ca24-fef9-4cca-99c2-eb2c255ee795-utilities\") pod \"redhat-operators-jhjjp\" (UID: \"e4a5ca24-fef9-4cca-99c2-eb2c255ee795\") " pod="openshift-marketplace/redhat-operators-jhjjp" Oct 08 21:53:45 crc kubenswrapper[4739]: I1008 21:53:45.170111 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4a5ca24-fef9-4cca-99c2-eb2c255ee795-utilities\") pod \"redhat-operators-jhjjp\" (UID: \"e4a5ca24-fef9-4cca-99c2-eb2c255ee795\") " pod="openshift-marketplace/redhat-operators-jhjjp" Oct 08 21:53:45 crc kubenswrapper[4739]: I1008 21:53:45.170205 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4a5ca24-fef9-4cca-99c2-eb2c255ee795-catalog-content\") pod \"redhat-operators-jhjjp\" (UID: \"e4a5ca24-fef9-4cca-99c2-eb2c255ee795\") " pod="openshift-marketplace/redhat-operators-jhjjp" Oct 08 21:53:45 crc kubenswrapper[4739]: I1008 21:53:45.170236 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wx88\" (UniqueName: \"kubernetes.io/projected/e4a5ca24-fef9-4cca-99c2-eb2c255ee795-kube-api-access-5wx88\") pod \"redhat-operators-jhjjp\" (UID: \"e4a5ca24-fef9-4cca-99c2-eb2c255ee795\") " pod="openshift-marketplace/redhat-operators-jhjjp" Oct 08 21:53:45 crc kubenswrapper[4739]: I1008 21:53:45.170844 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4a5ca24-fef9-4cca-99c2-eb2c255ee795-utilities\") pod \"redhat-operators-jhjjp\" (UID: \"e4a5ca24-fef9-4cca-99c2-eb2c255ee795\") " pod="openshift-marketplace/redhat-operators-jhjjp" Oct 08 21:53:45 crc kubenswrapper[4739]: I1008 21:53:45.170932 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e4a5ca24-fef9-4cca-99c2-eb2c255ee795-catalog-content\") pod \"redhat-operators-jhjjp\" (UID: \"e4a5ca24-fef9-4cca-99c2-eb2c255ee795\") " pod="openshift-marketplace/redhat-operators-jhjjp" Oct 08 21:53:45 crc kubenswrapper[4739]: I1008 21:53:45.172160 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbrj8"] Oct 08 21:53:45 crc kubenswrapper[4739]: W1008 21:53:45.180832 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0921b50a_3ca2_4f07_a060_63d6078eac48.slice/crio-aa33f20535ad197a843989f871ea25af1427a14353c9ba0226864357e43b4688 WatchSource:0}: Error finding container aa33f20535ad197a843989f871ea25af1427a14353c9ba0226864357e43b4688: Status 404 returned error can't find the container with id aa33f20535ad197a843989f871ea25af1427a14353c9ba0226864357e43b4688 Oct 08 21:53:45 crc kubenswrapper[4739]: I1008 21:53:45.192825 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wx88\" (UniqueName: \"kubernetes.io/projected/e4a5ca24-fef9-4cca-99c2-eb2c255ee795-kube-api-access-5wx88\") pod \"redhat-operators-jhjjp\" (UID: \"e4a5ca24-fef9-4cca-99c2-eb2c255ee795\") " pod="openshift-marketplace/redhat-operators-jhjjp" Oct 08 21:53:45 crc kubenswrapper[4739]: I1008 21:53:45.231489 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jhjjp" Oct 08 21:53:45 crc kubenswrapper[4739]: E1008 21:53:45.428675 4739 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0921b50a_3ca2_4f07_a060_63d6078eac48.slice/crio-conmon-f1989af4bc7ae6c6ebb1c0f0b23ddaeaae735b333529b7bdb8159bb91512d326.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0921b50a_3ca2_4f07_a060_63d6078eac48.slice/crio-f1989af4bc7ae6c6ebb1c0f0b23ddaeaae735b333529b7bdb8159bb91512d326.scope\": RecentStats: unable to find data in memory cache]" Oct 08 21:53:45 crc kubenswrapper[4739]: I1008 21:53:45.443085 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jhjjp"] Oct 08 21:53:45 crc kubenswrapper[4739]: W1008 21:53:45.450431 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4a5ca24_fef9_4cca_99c2_eb2c255ee795.slice/crio-a755f59790c68d35260767657904427f2b179f3e3639cd43a80a6b969db05dae WatchSource:0}: Error finding container a755f59790c68d35260767657904427f2b179f3e3639cd43a80a6b969db05dae: Status 404 returned error can't find the container with id a755f59790c68d35260767657904427f2b179f3e3639cd43a80a6b969db05dae Oct 08 21:53:45 crc kubenswrapper[4739]: I1008 21:53:45.828281 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13bcd021-6f45-4843-9255-92ee1ad4e031" path="/var/lib/kubelet/pods/13bcd021-6f45-4843-9255-92ee1ad4e031/volumes" Oct 08 21:53:45 crc kubenswrapper[4739]: I1008 21:53:45.829643 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34b38b7a-4e93-49f1-907e-24fc371f31e3" path="/var/lib/kubelet/pods/34b38b7a-4e93-49f1-907e-24fc371f31e3/volumes" Oct 08 21:53:45 crc kubenswrapper[4739]: I1008 
21:53:45.830212 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ac894a1-e4ba-4a6b-8ac2-3a24ef886834" path="/var/lib/kubelet/pods/5ac894a1-e4ba-4a6b-8ac2-3a24ef886834/volumes" Oct 08 21:53:45 crc kubenswrapper[4739]: I1008 21:53:45.831506 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deed15fa-28ce-4057-a1d7-5f920fa4751b" path="/var/lib/kubelet/pods/deed15fa-28ce-4057-a1d7-5f920fa4751b/volumes" Oct 08 21:53:45 crc kubenswrapper[4739]: I1008 21:53:45.832234 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e75f5e41-357c-47c7-b000-1103fd2b9756" path="/var/lib/kubelet/pods/e75f5e41-357c-47c7-b000-1103fd2b9756/volumes" Oct 08 21:53:45 crc kubenswrapper[4739]: I1008 21:53:45.889648 4739 generic.go:334] "Generic (PLEG): container finished" podID="e4a5ca24-fef9-4cca-99c2-eb2c255ee795" containerID="57536a75369ba2238fa15cf18e309b36e4dab0040f31935599a88372767e276e" exitCode=0 Oct 08 21:53:45 crc kubenswrapper[4739]: I1008 21:53:45.889793 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jhjjp" event={"ID":"e4a5ca24-fef9-4cca-99c2-eb2c255ee795","Type":"ContainerDied","Data":"57536a75369ba2238fa15cf18e309b36e4dab0040f31935599a88372767e276e"} Oct 08 21:53:45 crc kubenswrapper[4739]: I1008 21:53:45.889830 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jhjjp" event={"ID":"e4a5ca24-fef9-4cca-99c2-eb2c255ee795","Type":"ContainerStarted","Data":"a755f59790c68d35260767657904427f2b179f3e3639cd43a80a6b969db05dae"} Oct 08 21:53:45 crc kubenswrapper[4739]: I1008 21:53:45.892530 4739 generic.go:334] "Generic (PLEG): container finished" podID="0921b50a-3ca2-4f07-a060-63d6078eac48" containerID="f1989af4bc7ae6c6ebb1c0f0b23ddaeaae735b333529b7bdb8159bb91512d326" exitCode=0 Oct 08 21:53:45 crc kubenswrapper[4739]: I1008 21:53:45.893400 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-lbrj8" event={"ID":"0921b50a-3ca2-4f07-a060-63d6078eac48","Type":"ContainerDied","Data":"f1989af4bc7ae6c6ebb1c0f0b23ddaeaae735b333529b7bdb8159bb91512d326"} Oct 08 21:53:45 crc kubenswrapper[4739]: I1008 21:53:45.893432 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbrj8" event={"ID":"0921b50a-3ca2-4f07-a060-63d6078eac48","Type":"ContainerStarted","Data":"aa33f20535ad197a843989f871ea25af1427a14353c9ba0226864357e43b4688"} Oct 08 21:53:46 crc kubenswrapper[4739]: I1008 21:53:46.903863 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jhjjp" event={"ID":"e4a5ca24-fef9-4cca-99c2-eb2c255ee795","Type":"ContainerStarted","Data":"e7fbf79c79dc013d9af91ec4054eb72606c5925806243a7f90eac899e2275e12"} Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.057479 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mmlrg"] Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.058710 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mmlrg" Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.065715 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.077835 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mmlrg"] Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.198316 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq8zq\" (UniqueName: \"kubernetes.io/projected/cf066fed-0185-4563-9992-0474c1761110-kube-api-access-vq8zq\") pod \"community-operators-mmlrg\" (UID: \"cf066fed-0185-4563-9992-0474c1761110\") " pod="openshift-marketplace/community-operators-mmlrg" Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.198383 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf066fed-0185-4563-9992-0474c1761110-utilities\") pod \"community-operators-mmlrg\" (UID: \"cf066fed-0185-4563-9992-0474c1761110\") " pod="openshift-marketplace/community-operators-mmlrg" Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.198872 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf066fed-0185-4563-9992-0474c1761110-catalog-content\") pod \"community-operators-mmlrg\" (UID: \"cf066fed-0185-4563-9992-0474c1761110\") " pod="openshift-marketplace/community-operators-mmlrg" Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.258292 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qlrzs"] Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.259652 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qlrzs" Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.262382 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.269391 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qlrzs"] Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.300365 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq8zq\" (UniqueName: \"kubernetes.io/projected/cf066fed-0185-4563-9992-0474c1761110-kube-api-access-vq8zq\") pod \"community-operators-mmlrg\" (UID: \"cf066fed-0185-4563-9992-0474c1761110\") " pod="openshift-marketplace/community-operators-mmlrg" Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.300428 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf066fed-0185-4563-9992-0474c1761110-utilities\") pod \"community-operators-mmlrg\" (UID: \"cf066fed-0185-4563-9992-0474c1761110\") " pod="openshift-marketplace/community-operators-mmlrg" Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.300470 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf066fed-0185-4563-9992-0474c1761110-catalog-content\") pod \"community-operators-mmlrg\" (UID: \"cf066fed-0185-4563-9992-0474c1761110\") " pod="openshift-marketplace/community-operators-mmlrg" Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.300943 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf066fed-0185-4563-9992-0474c1761110-catalog-content\") pod \"community-operators-mmlrg\" (UID: \"cf066fed-0185-4563-9992-0474c1761110\") " 
pod="openshift-marketplace/community-operators-mmlrg" Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.301762 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf066fed-0185-4563-9992-0474c1761110-utilities\") pod \"community-operators-mmlrg\" (UID: \"cf066fed-0185-4563-9992-0474c1761110\") " pod="openshift-marketplace/community-operators-mmlrg" Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.323305 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq8zq\" (UniqueName: \"kubernetes.io/projected/cf066fed-0185-4563-9992-0474c1761110-kube-api-access-vq8zq\") pod \"community-operators-mmlrg\" (UID: \"cf066fed-0185-4563-9992-0474c1761110\") " pod="openshift-marketplace/community-operators-mmlrg" Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.395087 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mmlrg" Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.402631 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9243f10e-b903-4d49-9ef7-d447cf6459fd-catalog-content\") pod \"certified-operators-qlrzs\" (UID: \"9243f10e-b903-4d49-9ef7-d447cf6459fd\") " pod="openshift-marketplace/certified-operators-qlrzs" Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.402738 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4jj6\" (UniqueName: \"kubernetes.io/projected/9243f10e-b903-4d49-9ef7-d447cf6459fd-kube-api-access-g4jj6\") pod \"certified-operators-qlrzs\" (UID: \"9243f10e-b903-4d49-9ef7-d447cf6459fd\") " pod="openshift-marketplace/certified-operators-qlrzs" Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.403191 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9243f10e-b903-4d49-9ef7-d447cf6459fd-utilities\") pod \"certified-operators-qlrzs\" (UID: \"9243f10e-b903-4d49-9ef7-d447cf6459fd\") " pod="openshift-marketplace/certified-operators-qlrzs" Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.507083 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9243f10e-b903-4d49-9ef7-d447cf6459fd-utilities\") pod \"certified-operators-qlrzs\" (UID: \"9243f10e-b903-4d49-9ef7-d447cf6459fd\") " pod="openshift-marketplace/certified-operators-qlrzs" Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.507203 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9243f10e-b903-4d49-9ef7-d447cf6459fd-catalog-content\") pod \"certified-operators-qlrzs\" (UID: \"9243f10e-b903-4d49-9ef7-d447cf6459fd\") " pod="openshift-marketplace/certified-operators-qlrzs" Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.507264 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4jj6\" (UniqueName: \"kubernetes.io/projected/9243f10e-b903-4d49-9ef7-d447cf6459fd-kube-api-access-g4jj6\") pod \"certified-operators-qlrzs\" (UID: \"9243f10e-b903-4d49-9ef7-d447cf6459fd\") " pod="openshift-marketplace/certified-operators-qlrzs" Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.507764 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9243f10e-b903-4d49-9ef7-d447cf6459fd-utilities\") pod \"certified-operators-qlrzs\" (UID: \"9243f10e-b903-4d49-9ef7-d447cf6459fd\") " pod="openshift-marketplace/certified-operators-qlrzs" Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.508046 4739 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9243f10e-b903-4d49-9ef7-d447cf6459fd-catalog-content\") pod \"certified-operators-qlrzs\" (UID: \"9243f10e-b903-4d49-9ef7-d447cf6459fd\") " pod="openshift-marketplace/certified-operators-qlrzs" Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.526981 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4jj6\" (UniqueName: \"kubernetes.io/projected/9243f10e-b903-4d49-9ef7-d447cf6459fd-kube-api-access-g4jj6\") pod \"certified-operators-qlrzs\" (UID: \"9243f10e-b903-4d49-9ef7-d447cf6459fd\") " pod="openshift-marketplace/certified-operators-qlrzs" Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.557813 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mmlrg"] Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.582703 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qlrzs" Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.793685 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qlrzs"] Oct 08 21:53:47 crc kubenswrapper[4739]: W1008 21:53:47.835095 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9243f10e_b903_4d49_9ef7_d447cf6459fd.slice/crio-39a3598b90bc3b11b007493b8ef171bbcca4340a7205bab88c8a66deda8fadc8 WatchSource:0}: Error finding container 39a3598b90bc3b11b007493b8ef171bbcca4340a7205bab88c8a66deda8fadc8: Status 404 returned error can't find the container with id 39a3598b90bc3b11b007493b8ef171bbcca4340a7205bab88c8a66deda8fadc8 Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.917714 4739 generic.go:334] "Generic (PLEG): container finished" podID="0921b50a-3ca2-4f07-a060-63d6078eac48" containerID="9cb5876a139a5522ec3ffb1cbb470a21e8112473911fe30686e1ebc71d6291b0" 
exitCode=0 Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.917812 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbrj8" event={"ID":"0921b50a-3ca2-4f07-a060-63d6078eac48","Type":"ContainerDied","Data":"9cb5876a139a5522ec3ffb1cbb470a21e8112473911fe30686e1ebc71d6291b0"} Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.918947 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qlrzs" event={"ID":"9243f10e-b903-4d49-9ef7-d447cf6459fd","Type":"ContainerStarted","Data":"39a3598b90bc3b11b007493b8ef171bbcca4340a7205bab88c8a66deda8fadc8"} Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.930615 4739 generic.go:334] "Generic (PLEG): container finished" podID="cf066fed-0185-4563-9992-0474c1761110" containerID="5024ff1792df793075f609203f5322bdd10685ace8bba623c99dc0b39af50d90" exitCode=0 Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.930924 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mmlrg" event={"ID":"cf066fed-0185-4563-9992-0474c1761110","Type":"ContainerDied","Data":"5024ff1792df793075f609203f5322bdd10685ace8bba623c99dc0b39af50d90"} Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.930966 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mmlrg" event={"ID":"cf066fed-0185-4563-9992-0474c1761110","Type":"ContainerStarted","Data":"f98d3c6fb4f5a2ae92438eeb7f76c518e3b6e4d22682791dd07f3369840c635f"} Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.935375 4739 generic.go:334] "Generic (PLEG): container finished" podID="e4a5ca24-fef9-4cca-99c2-eb2c255ee795" containerID="e7fbf79c79dc013d9af91ec4054eb72606c5925806243a7f90eac899e2275e12" exitCode=0 Oct 08 21:53:47 crc kubenswrapper[4739]: I1008 21:53:47.935416 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jhjjp" 
event={"ID":"e4a5ca24-fef9-4cca-99c2-eb2c255ee795","Type":"ContainerDied","Data":"e7fbf79c79dc013d9af91ec4054eb72606c5925806243a7f90eac899e2275e12"} Oct 08 21:53:48 crc kubenswrapper[4739]: I1008 21:53:48.952253 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbrj8" event={"ID":"0921b50a-3ca2-4f07-a060-63d6078eac48","Type":"ContainerStarted","Data":"daa91f7088d2ef03e00aa9cbf131e49cb6e1d62f2fcd5cc52ab334fefc1a2b48"} Oct 08 21:53:48 crc kubenswrapper[4739]: I1008 21:53:48.958417 4739 generic.go:334] "Generic (PLEG): container finished" podID="9243f10e-b903-4d49-9ef7-d447cf6459fd" containerID="4f5872ade4221075d14867f8e1aae4eed78aa68a7cd54cd0132d357935d0d8cd" exitCode=0 Oct 08 21:53:48 crc kubenswrapper[4739]: I1008 21:53:48.958520 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qlrzs" event={"ID":"9243f10e-b903-4d49-9ef7-d447cf6459fd","Type":"ContainerDied","Data":"4f5872ade4221075d14867f8e1aae4eed78aa68a7cd54cd0132d357935d0d8cd"} Oct 08 21:53:48 crc kubenswrapper[4739]: I1008 21:53:48.962798 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jhjjp" event={"ID":"e4a5ca24-fef9-4cca-99c2-eb2c255ee795","Type":"ContainerStarted","Data":"30094f4b43d204db8b7228ce71a741dec25719d6feb25a053d055ed414beec9a"} Oct 08 21:53:48 crc kubenswrapper[4739]: I1008 21:53:48.970325 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lbrj8" podStartSLOduration=2.490733024 podStartE2EDuration="4.97030975s" podCreationTimestamp="2025-10-08 21:53:44 +0000 UTC" firstStartedPulling="2025-10-08 21:53:45.894955441 +0000 UTC m=+325.720341191" lastFinishedPulling="2025-10-08 21:53:48.374532157 +0000 UTC m=+328.199917917" observedRunningTime="2025-10-08 21:53:48.969074198 +0000 UTC m=+328.794459958" watchObservedRunningTime="2025-10-08 21:53:48.97030975 +0000 UTC m=+328.795695500" 
Oct 08 21:53:48 crc kubenswrapper[4739]: I1008 21:53:48.987617 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jhjjp" podStartSLOduration=2.436285564 podStartE2EDuration="4.987596877s" podCreationTimestamp="2025-10-08 21:53:44 +0000 UTC" firstStartedPulling="2025-10-08 21:53:45.891538302 +0000 UTC m=+325.716924092" lastFinishedPulling="2025-10-08 21:53:48.442849655 +0000 UTC m=+328.268235405" observedRunningTime="2025-10-08 21:53:48.985817762 +0000 UTC m=+328.811203512" watchObservedRunningTime="2025-10-08 21:53:48.987596877 +0000 UTC m=+328.812982627" Oct 08 21:53:49 crc kubenswrapper[4739]: I1008 21:53:49.971713 4739 generic.go:334] "Generic (PLEG): container finished" podID="cf066fed-0185-4563-9992-0474c1761110" containerID="443b0ce9337af896b715a9d8553a862cac2a745ec97377fab0d5da8b0f11e6b0" exitCode=0 Oct 08 21:53:49 crc kubenswrapper[4739]: I1008 21:53:49.971943 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mmlrg" event={"ID":"cf066fed-0185-4563-9992-0474c1761110","Type":"ContainerDied","Data":"443b0ce9337af896b715a9d8553a862cac2a745ec97377fab0d5da8b0f11e6b0"} Oct 08 21:53:49 crc kubenswrapper[4739]: I1008 21:53:49.981441 4739 generic.go:334] "Generic (PLEG): container finished" podID="9243f10e-b903-4d49-9ef7-d447cf6459fd" containerID="1234c72f835e4c2e9a5fa1c9e24abf90724d255d18d1f0ab99db06eeefd4acf8" exitCode=0 Oct 08 21:53:49 crc kubenswrapper[4739]: I1008 21:53:49.983163 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qlrzs" event={"ID":"9243f10e-b903-4d49-9ef7-d447cf6459fd","Type":"ContainerDied","Data":"1234c72f835e4c2e9a5fa1c9e24abf90724d255d18d1f0ab99db06eeefd4acf8"} Oct 08 21:53:51 crc kubenswrapper[4739]: I1008 21:53:51.993728 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qlrzs" 
event={"ID":"9243f10e-b903-4d49-9ef7-d447cf6459fd","Type":"ContainerStarted","Data":"335872d86b47daec5d781ddca7e885457752c607f5debadcf0c5b9963b80198d"} Oct 08 21:53:51 crc kubenswrapper[4739]: I1008 21:53:51.995966 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mmlrg" event={"ID":"cf066fed-0185-4563-9992-0474c1761110","Type":"ContainerStarted","Data":"c750fcfa20d2c3c1b62902817d6e01854dc7310261fd6264b61b43342725666e"} Oct 08 21:53:52 crc kubenswrapper[4739]: I1008 21:53:52.020341 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qlrzs" podStartSLOduration=3.542549184 podStartE2EDuration="5.020325024s" podCreationTimestamp="2025-10-08 21:53:47 +0000 UTC" firstStartedPulling="2025-10-08 21:53:48.96023567 +0000 UTC m=+328.785621420" lastFinishedPulling="2025-10-08 21:53:50.43801149 +0000 UTC m=+330.263397260" observedRunningTime="2025-10-08 21:53:52.016067924 +0000 UTC m=+331.841453684" watchObservedRunningTime="2025-10-08 21:53:52.020325024 +0000 UTC m=+331.845710774" Oct 08 21:53:52 crc kubenswrapper[4739]: I1008 21:53:52.040020 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mmlrg" podStartSLOduration=2.553773535 podStartE2EDuration="5.039999883s" podCreationTimestamp="2025-10-08 21:53:47 +0000 UTC" firstStartedPulling="2025-10-08 21:53:47.936816074 +0000 UTC m=+327.762201824" lastFinishedPulling="2025-10-08 21:53:50.423042422 +0000 UTC m=+330.248428172" observedRunningTime="2025-10-08 21:53:52.039525621 +0000 UTC m=+331.864911401" watchObservedRunningTime="2025-10-08 21:53:52.039999883 +0000 UTC m=+331.865385633" Oct 08 21:53:54 crc kubenswrapper[4739]: I1008 21:53:54.995503 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lbrj8" Oct 08 21:53:54 crc kubenswrapper[4739]: I1008 21:53:54.997267 4739 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lbrj8" Oct 08 21:53:55 crc kubenswrapper[4739]: I1008 21:53:55.039557 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lbrj8" Oct 08 21:53:55 crc kubenswrapper[4739]: I1008 21:53:55.101991 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lbrj8" Oct 08 21:53:55 crc kubenswrapper[4739]: I1008 21:53:55.232398 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jhjjp" Oct 08 21:53:55 crc kubenswrapper[4739]: I1008 21:53:55.232555 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jhjjp" Oct 08 21:53:55 crc kubenswrapper[4739]: I1008 21:53:55.271157 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jhjjp" Oct 08 21:53:56 crc kubenswrapper[4739]: I1008 21:53:56.060364 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jhjjp" Oct 08 21:53:57 crc kubenswrapper[4739]: I1008 21:53:57.395293 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mmlrg" Oct 08 21:53:57 crc kubenswrapper[4739]: I1008 21:53:57.395346 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mmlrg" Oct 08 21:53:57 crc kubenswrapper[4739]: I1008 21:53:57.428236 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mmlrg" Oct 08 21:53:57 crc kubenswrapper[4739]: I1008 21:53:57.583956 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-qlrzs" Oct 08 21:53:57 crc kubenswrapper[4739]: I1008 21:53:57.584019 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qlrzs" Oct 08 21:53:57 crc kubenswrapper[4739]: I1008 21:53:57.627055 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qlrzs" Oct 08 21:53:58 crc kubenswrapper[4739]: I1008 21:53:58.064607 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mmlrg" Oct 08 21:53:58 crc kubenswrapper[4739]: I1008 21:53:58.075934 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qlrzs" Oct 08 21:54:21 crc kubenswrapper[4739]: I1008 21:54:21.766405 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:54:21 crc kubenswrapper[4739]: I1008 21:54:21.766935 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:54:51 crc kubenswrapper[4739]: I1008 21:54:51.765909 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:54:51 crc kubenswrapper[4739]: I1008 21:54:51.766963 4739 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:55:21 crc kubenswrapper[4739]: I1008 21:55:21.766330 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:55:21 crc kubenswrapper[4739]: I1008 21:55:21.767015 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:55:21 crc kubenswrapper[4739]: I1008 21:55:21.767071 4739 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" Oct 08 21:55:21 crc kubenswrapper[4739]: I1008 21:55:21.767784 4739 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"439f72fd59a864ed357fc655cbdfd636be4f725b2c076db55d4b40db6172c69e"} pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 21:55:21 crc kubenswrapper[4739]: I1008 21:55:21.767860 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" 
containerID="cri-o://439f72fd59a864ed357fc655cbdfd636be4f725b2c076db55d4b40db6172c69e" gracePeriod=600 Oct 08 21:55:22 crc kubenswrapper[4739]: I1008 21:55:22.533432 4739 generic.go:334] "Generic (PLEG): container finished" podID="9707b708-016c-4e06-86db-0332e2ca37db" containerID="439f72fd59a864ed357fc655cbdfd636be4f725b2c076db55d4b40db6172c69e" exitCode=0 Oct 08 21:55:22 crc kubenswrapper[4739]: I1008 21:55:22.533561 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerDied","Data":"439f72fd59a864ed357fc655cbdfd636be4f725b2c076db55d4b40db6172c69e"} Oct 08 21:55:22 crc kubenswrapper[4739]: I1008 21:55:22.533853 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerStarted","Data":"ef2588b0bb234b34c79c8ff569837da6abbcb63d53a58f2ae5f4cde5f6ddd2c2"} Oct 08 21:55:22 crc kubenswrapper[4739]: I1008 21:55:22.533895 4739 scope.go:117] "RemoveContainer" containerID="b3dbef8e78320c412e9809eb778c42a83c5aa9fa745e4e1500fed6d9aaa0ba3d" Oct 08 21:55:23 crc kubenswrapper[4739]: I1008 21:55:23.419655 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xsmdq"] Oct 08 21:55:23 crc kubenswrapper[4739]: I1008 21:55:23.420895 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-xsmdq" Oct 08 21:55:23 crc kubenswrapper[4739]: I1008 21:55:23.428863 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xsmdq"] Oct 08 21:55:23 crc kubenswrapper[4739]: I1008 21:55:23.579847 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1e8c3b3e-0f3e-459d-bdb1-4093718a155c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xsmdq\" (UID: \"1e8c3b3e-0f3e-459d-bdb1-4093718a155c\") " pod="openshift-image-registry/image-registry-66df7c8f76-xsmdq" Oct 08 21:55:23 crc kubenswrapper[4739]: I1008 21:55:23.579907 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1e8c3b3e-0f3e-459d-bdb1-4093718a155c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xsmdq\" (UID: \"1e8c3b3e-0f3e-459d-bdb1-4093718a155c\") " pod="openshift-image-registry/image-registry-66df7c8f76-xsmdq" Oct 08 21:55:23 crc kubenswrapper[4739]: I1008 21:55:23.579927 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1e8c3b3e-0f3e-459d-bdb1-4093718a155c-bound-sa-token\") pod \"image-registry-66df7c8f76-xsmdq\" (UID: \"1e8c3b3e-0f3e-459d-bdb1-4093718a155c\") " pod="openshift-image-registry/image-registry-66df7c8f76-xsmdq" Oct 08 21:55:23 crc kubenswrapper[4739]: I1008 21:55:23.579950 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1e8c3b3e-0f3e-459d-bdb1-4093718a155c-registry-tls\") pod \"image-registry-66df7c8f76-xsmdq\" (UID: \"1e8c3b3e-0f3e-459d-bdb1-4093718a155c\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-xsmdq" Oct 08 21:55:23 crc kubenswrapper[4739]: I1008 21:55:23.579976 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6hkq\" (UniqueName: \"kubernetes.io/projected/1e8c3b3e-0f3e-459d-bdb1-4093718a155c-kube-api-access-t6hkq\") pod \"image-registry-66df7c8f76-xsmdq\" (UID: \"1e8c3b3e-0f3e-459d-bdb1-4093718a155c\") " pod="openshift-image-registry/image-registry-66df7c8f76-xsmdq" Oct 08 21:55:23 crc kubenswrapper[4739]: I1008 21:55:23.580010 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e8c3b3e-0f3e-459d-bdb1-4093718a155c-trusted-ca\") pod \"image-registry-66df7c8f76-xsmdq\" (UID: \"1e8c3b3e-0f3e-459d-bdb1-4093718a155c\") " pod="openshift-image-registry/image-registry-66df7c8f76-xsmdq" Oct 08 21:55:23 crc kubenswrapper[4739]: I1008 21:55:23.580042 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1e8c3b3e-0f3e-459d-bdb1-4093718a155c-registry-certificates\") pod \"image-registry-66df7c8f76-xsmdq\" (UID: \"1e8c3b3e-0f3e-459d-bdb1-4093718a155c\") " pod="openshift-image-registry/image-registry-66df7c8f76-xsmdq" Oct 08 21:55:23 crc kubenswrapper[4739]: I1008 21:55:23.580205 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-xsmdq\" (UID: \"1e8c3b3e-0f3e-459d-bdb1-4093718a155c\") " pod="openshift-image-registry/image-registry-66df7c8f76-xsmdq" Oct 08 21:55:23 crc kubenswrapper[4739]: I1008 21:55:23.601750 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-xsmdq\" (UID: \"1e8c3b3e-0f3e-459d-bdb1-4093718a155c\") " pod="openshift-image-registry/image-registry-66df7c8f76-xsmdq" Oct 08 21:55:23 crc kubenswrapper[4739]: I1008 21:55:23.681702 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1e8c3b3e-0f3e-459d-bdb1-4093718a155c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xsmdq\" (UID: \"1e8c3b3e-0f3e-459d-bdb1-4093718a155c\") " pod="openshift-image-registry/image-registry-66df7c8f76-xsmdq" Oct 08 21:55:23 crc kubenswrapper[4739]: I1008 21:55:23.682047 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1e8c3b3e-0f3e-459d-bdb1-4093718a155c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xsmdq\" (UID: \"1e8c3b3e-0f3e-459d-bdb1-4093718a155c\") " pod="openshift-image-registry/image-registry-66df7c8f76-xsmdq" Oct 08 21:55:23 crc kubenswrapper[4739]: I1008 21:55:23.682078 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1e8c3b3e-0f3e-459d-bdb1-4093718a155c-bound-sa-token\") pod \"image-registry-66df7c8f76-xsmdq\" (UID: \"1e8c3b3e-0f3e-459d-bdb1-4093718a155c\") " pod="openshift-image-registry/image-registry-66df7c8f76-xsmdq" Oct 08 21:55:23 crc kubenswrapper[4739]: I1008 21:55:23.682101 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1e8c3b3e-0f3e-459d-bdb1-4093718a155c-registry-tls\") pod \"image-registry-66df7c8f76-xsmdq\" (UID: \"1e8c3b3e-0f3e-459d-bdb1-4093718a155c\") " pod="openshift-image-registry/image-registry-66df7c8f76-xsmdq" Oct 08 21:55:23 crc 
kubenswrapper[4739]: I1008 21:55:23.682135 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6hkq\" (UniqueName: \"kubernetes.io/projected/1e8c3b3e-0f3e-459d-bdb1-4093718a155c-kube-api-access-t6hkq\") pod \"image-registry-66df7c8f76-xsmdq\" (UID: \"1e8c3b3e-0f3e-459d-bdb1-4093718a155c\") " pod="openshift-image-registry/image-registry-66df7c8f76-xsmdq" Oct 08 21:55:23 crc kubenswrapper[4739]: I1008 21:55:23.682177 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e8c3b3e-0f3e-459d-bdb1-4093718a155c-trusted-ca\") pod \"image-registry-66df7c8f76-xsmdq\" (UID: \"1e8c3b3e-0f3e-459d-bdb1-4093718a155c\") " pod="openshift-image-registry/image-registry-66df7c8f76-xsmdq" Oct 08 21:55:23 crc kubenswrapper[4739]: I1008 21:55:23.682217 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1e8c3b3e-0f3e-459d-bdb1-4093718a155c-registry-certificates\") pod \"image-registry-66df7c8f76-xsmdq\" (UID: \"1e8c3b3e-0f3e-459d-bdb1-4093718a155c\") " pod="openshift-image-registry/image-registry-66df7c8f76-xsmdq" Oct 08 21:55:23 crc kubenswrapper[4739]: I1008 21:55:23.683329 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1e8c3b3e-0f3e-459d-bdb1-4093718a155c-registry-certificates\") pod \"image-registry-66df7c8f76-xsmdq\" (UID: \"1e8c3b3e-0f3e-459d-bdb1-4093718a155c\") " pod="openshift-image-registry/image-registry-66df7c8f76-xsmdq" Oct 08 21:55:23 crc kubenswrapper[4739]: I1008 21:55:23.684016 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e8c3b3e-0f3e-459d-bdb1-4093718a155c-trusted-ca\") pod \"image-registry-66df7c8f76-xsmdq\" (UID: \"1e8c3b3e-0f3e-459d-bdb1-4093718a155c\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-xsmdq" Oct 08 21:55:23 crc kubenswrapper[4739]: I1008 21:55:23.684338 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1e8c3b3e-0f3e-459d-bdb1-4093718a155c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xsmdq\" (UID: \"1e8c3b3e-0f3e-459d-bdb1-4093718a155c\") " pod="openshift-image-registry/image-registry-66df7c8f76-xsmdq" Oct 08 21:55:23 crc kubenswrapper[4739]: I1008 21:55:23.698961 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1e8c3b3e-0f3e-459d-bdb1-4093718a155c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xsmdq\" (UID: \"1e8c3b3e-0f3e-459d-bdb1-4093718a155c\") " pod="openshift-image-registry/image-registry-66df7c8f76-xsmdq" Oct 08 21:55:23 crc kubenswrapper[4739]: I1008 21:55:23.699012 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1e8c3b3e-0f3e-459d-bdb1-4093718a155c-registry-tls\") pod \"image-registry-66df7c8f76-xsmdq\" (UID: \"1e8c3b3e-0f3e-459d-bdb1-4093718a155c\") " pod="openshift-image-registry/image-registry-66df7c8f76-xsmdq" Oct 08 21:55:23 crc kubenswrapper[4739]: I1008 21:55:23.701777 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1e8c3b3e-0f3e-459d-bdb1-4093718a155c-bound-sa-token\") pod \"image-registry-66df7c8f76-xsmdq\" (UID: \"1e8c3b3e-0f3e-459d-bdb1-4093718a155c\") " pod="openshift-image-registry/image-registry-66df7c8f76-xsmdq" Oct 08 21:55:23 crc kubenswrapper[4739]: I1008 21:55:23.702011 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6hkq\" (UniqueName: \"kubernetes.io/projected/1e8c3b3e-0f3e-459d-bdb1-4093718a155c-kube-api-access-t6hkq\") pod \"image-registry-66df7c8f76-xsmdq\" 
(UID: \"1e8c3b3e-0f3e-459d-bdb1-4093718a155c\") " pod="openshift-image-registry/image-registry-66df7c8f76-xsmdq" Oct 08 21:55:23 crc kubenswrapper[4739]: I1008 21:55:23.739280 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-xsmdq" Oct 08 21:55:23 crc kubenswrapper[4739]: I1008 21:55:23.910090 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xsmdq"] Oct 08 21:55:23 crc kubenswrapper[4739]: W1008 21:55:23.910477 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e8c3b3e_0f3e_459d_bdb1_4093718a155c.slice/crio-aabc18c79d800143ede92a912285be2586ac856629f02ab3492087328fd9eb96 WatchSource:0}: Error finding container aabc18c79d800143ede92a912285be2586ac856629f02ab3492087328fd9eb96: Status 404 returned error can't find the container with id aabc18c79d800143ede92a912285be2586ac856629f02ab3492087328fd9eb96 Oct 08 21:55:24 crc kubenswrapper[4739]: I1008 21:55:24.545105 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-xsmdq" event={"ID":"1e8c3b3e-0f3e-459d-bdb1-4093718a155c","Type":"ContainerStarted","Data":"0aff827d16939385e35c7b29619e2cf1dc3413c70e273f469da1bdb7041057a7"} Oct 08 21:55:24 crc kubenswrapper[4739]: I1008 21:55:24.545171 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-xsmdq" event={"ID":"1e8c3b3e-0f3e-459d-bdb1-4093718a155c","Type":"ContainerStarted","Data":"aabc18c79d800143ede92a912285be2586ac856629f02ab3492087328fd9eb96"} Oct 08 21:55:24 crc kubenswrapper[4739]: I1008 21:55:24.545309 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-xsmdq" Oct 08 21:55:24 crc kubenswrapper[4739]: I1008 21:55:24.569262 4739 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-xsmdq" podStartSLOduration=1.569240532 podStartE2EDuration="1.569240532s" podCreationTimestamp="2025-10-08 21:55:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:55:24.563922765 +0000 UTC m=+424.389308515" watchObservedRunningTime="2025-10-08 21:55:24.569240532 +0000 UTC m=+424.394626292" Oct 08 21:55:43 crc kubenswrapper[4739]: I1008 21:55:43.746107 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-xsmdq" Oct 08 21:55:43 crc kubenswrapper[4739]: I1008 21:55:43.814896 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fl6f2"] Oct 08 21:56:08 crc kubenswrapper[4739]: I1008 21:56:08.873462 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" podUID="595d6e92-80ce-40bf-8409-b50226a672ab" containerName="registry" containerID="cri-o://22641bbc692ff46e49c8eeb8d9ce8dadb60c200019458f63dbc0f675bd5b7f49" gracePeriod=30 Oct 08 21:56:09 crc kubenswrapper[4739]: I1008 21:56:09.795488 4739 generic.go:334] "Generic (PLEG): container finished" podID="595d6e92-80ce-40bf-8409-b50226a672ab" containerID="22641bbc692ff46e49c8eeb8d9ce8dadb60c200019458f63dbc0f675bd5b7f49" exitCode=0 Oct 08 21:56:09 crc kubenswrapper[4739]: I1008 21:56:09.795598 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" event={"ID":"595d6e92-80ce-40bf-8409-b50226a672ab","Type":"ContainerDied","Data":"22641bbc692ff46e49c8eeb8d9ce8dadb60c200019458f63dbc0f675bd5b7f49"} Oct 08 21:56:09 crc kubenswrapper[4739]: I1008 21:56:09.831376 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:56:09 crc kubenswrapper[4739]: I1008 21:56:09.895832 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/595d6e92-80ce-40bf-8409-b50226a672ab-trusted-ca\") pod \"595d6e92-80ce-40bf-8409-b50226a672ab\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " Oct 08 21:56:09 crc kubenswrapper[4739]: I1008 21:56:09.897008 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/595d6e92-80ce-40bf-8409-b50226a672ab-bound-sa-token\") pod \"595d6e92-80ce-40bf-8409-b50226a672ab\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " Oct 08 21:56:09 crc kubenswrapper[4739]: I1008 21:56:09.897034 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/595d6e92-80ce-40bf-8409-b50226a672ab-registry-certificates\") pod \"595d6e92-80ce-40bf-8409-b50226a672ab\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " Oct 08 21:56:09 crc kubenswrapper[4739]: I1008 21:56:09.897063 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7prc\" (UniqueName: \"kubernetes.io/projected/595d6e92-80ce-40bf-8409-b50226a672ab-kube-api-access-c7prc\") pod \"595d6e92-80ce-40bf-8409-b50226a672ab\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " Oct 08 21:56:09 crc kubenswrapper[4739]: I1008 21:56:09.897123 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/595d6e92-80ce-40bf-8409-b50226a672ab-registry-tls\") pod \"595d6e92-80ce-40bf-8409-b50226a672ab\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " Oct 08 21:56:09 crc kubenswrapper[4739]: I1008 21:56:09.897255 4739 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"595d6e92-80ce-40bf-8409-b50226a672ab\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " Oct 08 21:56:09 crc kubenswrapper[4739]: I1008 21:56:09.897294 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/595d6e92-80ce-40bf-8409-b50226a672ab-installation-pull-secrets\") pod \"595d6e92-80ce-40bf-8409-b50226a672ab\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " Oct 08 21:56:09 crc kubenswrapper[4739]: I1008 21:56:09.897362 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/595d6e92-80ce-40bf-8409-b50226a672ab-ca-trust-extracted\") pod \"595d6e92-80ce-40bf-8409-b50226a672ab\" (UID: \"595d6e92-80ce-40bf-8409-b50226a672ab\") " Oct 08 21:56:09 crc kubenswrapper[4739]: I1008 21:56:09.896965 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/595d6e92-80ce-40bf-8409-b50226a672ab-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "595d6e92-80ce-40bf-8409-b50226a672ab" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:56:09 crc kubenswrapper[4739]: I1008 21:56:09.898981 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/595d6e92-80ce-40bf-8409-b50226a672ab-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "595d6e92-80ce-40bf-8409-b50226a672ab" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:56:09 crc kubenswrapper[4739]: I1008 21:56:09.905720 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/595d6e92-80ce-40bf-8409-b50226a672ab-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "595d6e92-80ce-40bf-8409-b50226a672ab" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:56:09 crc kubenswrapper[4739]: I1008 21:56:09.906342 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/595d6e92-80ce-40bf-8409-b50226a672ab-kube-api-access-c7prc" (OuterVolumeSpecName: "kube-api-access-c7prc") pod "595d6e92-80ce-40bf-8409-b50226a672ab" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab"). InnerVolumeSpecName "kube-api-access-c7prc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:56:09 crc kubenswrapper[4739]: I1008 21:56:09.907399 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/595d6e92-80ce-40bf-8409-b50226a672ab-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "595d6e92-80ce-40bf-8409-b50226a672ab" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:56:09 crc kubenswrapper[4739]: I1008 21:56:09.908347 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/595d6e92-80ce-40bf-8409-b50226a672ab-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "595d6e92-80ce-40bf-8409-b50226a672ab" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:56:09 crc kubenswrapper[4739]: I1008 21:56:09.908807 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "595d6e92-80ce-40bf-8409-b50226a672ab" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 08 21:56:09 crc kubenswrapper[4739]: I1008 21:56:09.925396 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/595d6e92-80ce-40bf-8409-b50226a672ab-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "595d6e92-80ce-40bf-8409-b50226a672ab" (UID: "595d6e92-80ce-40bf-8409-b50226a672ab"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:56:09 crc kubenswrapper[4739]: I1008 21:56:09.998636 4739 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/595d6e92-80ce-40bf-8409-b50226a672ab-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 08 21:56:09 crc kubenswrapper[4739]: I1008 21:56:09.999594 4739 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/595d6e92-80ce-40bf-8409-b50226a672ab-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 08 21:56:09 crc kubenswrapper[4739]: I1008 21:56:09.999732 4739 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/595d6e92-80ce-40bf-8409-b50226a672ab-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 21:56:09 crc kubenswrapper[4739]: I1008 21:56:09.999845 4739 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/595d6e92-80ce-40bf-8409-b50226a672ab-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 08 21:56:09 crc kubenswrapper[4739]: I1008 21:56:09.999951 4739 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/595d6e92-80ce-40bf-8409-b50226a672ab-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 08 21:56:10 crc kubenswrapper[4739]: I1008 21:56:10.000056 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7prc\" (UniqueName: \"kubernetes.io/projected/595d6e92-80ce-40bf-8409-b50226a672ab-kube-api-access-c7prc\") on node \"crc\" DevicePath \"\"" Oct 08 21:56:10 crc kubenswrapper[4739]: I1008 21:56:10.000204 4739 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/595d6e92-80ce-40bf-8409-b50226a672ab-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 08 21:56:10 crc kubenswrapper[4739]: I1008 21:56:10.814554 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" event={"ID":"595d6e92-80ce-40bf-8409-b50226a672ab","Type":"ContainerDied","Data":"475bf8bc6db1943d4d20b389ca2b348d343a490ccedc9b81d9bc29860e3a2bdf"} Oct 08 21:56:10 crc kubenswrapper[4739]: I1008 21:56:10.814665 4739 scope.go:117] "RemoveContainer" containerID="22641bbc692ff46e49c8eeb8d9ce8dadb60c200019458f63dbc0f675bd5b7f49" Oct 08 21:56:10 crc kubenswrapper[4739]: I1008 21:56:10.814731 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-fl6f2" Oct 08 21:56:10 crc kubenswrapper[4739]: I1008 21:56:10.863453 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fl6f2"] Oct 08 21:56:10 crc kubenswrapper[4739]: I1008 21:56:10.868596 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fl6f2"] Oct 08 21:56:11 crc kubenswrapper[4739]: I1008 21:56:11.833818 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="595d6e92-80ce-40bf-8409-b50226a672ab" path="/var/lib/kubelet/pods/595d6e92-80ce-40bf-8409-b50226a672ab/volumes" Oct 08 21:57:22 crc kubenswrapper[4739]: I1008 21:57:22.023904 4739 scope.go:117] "RemoveContainer" containerID="c0494c224f5afc809be5ec9f40bea9117b69195ea882ab16c451a09059e1f613" Oct 08 21:57:51 crc kubenswrapper[4739]: I1008 21:57:51.766658 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:57:51 crc kubenswrapper[4739]: I1008 21:57:51.768740 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:58:21 crc kubenswrapper[4739]: I1008 21:58:21.766411 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:58:21 
crc kubenswrapper[4739]: I1008 21:58:21.767219 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:58:22 crc kubenswrapper[4739]: I1008 21:58:22.056226 4739 scope.go:117] "RemoveContainer" containerID="0eeb15c1b9674b7c1ddf972848d8f69b1848a017e027ac4abf55d4d48fb8542d" Oct 08 21:58:22 crc kubenswrapper[4739]: I1008 21:58:22.071683 4739 scope.go:117] "RemoveContainer" containerID="8520c7006ca61ec65f09811d18ed95676d04993057b8e41560df307f0686e83e" Oct 08 21:58:51 crc kubenswrapper[4739]: I1008 21:58:51.766572 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 21:58:51 crc kubenswrapper[4739]: I1008 21:58:51.767178 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 21:58:51 crc kubenswrapper[4739]: I1008 21:58:51.767230 4739 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" Oct 08 21:58:51 crc kubenswrapper[4739]: I1008 21:58:51.767953 4739 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ef2588b0bb234b34c79c8ff569837da6abbcb63d53a58f2ae5f4cde5f6ddd2c2"} 
pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 21:58:51 crc kubenswrapper[4739]: I1008 21:58:51.768005 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" containerID="cri-o://ef2588b0bb234b34c79c8ff569837da6abbcb63d53a58f2ae5f4cde5f6ddd2c2" gracePeriod=600 Oct 08 21:58:51 crc kubenswrapper[4739]: I1008 21:58:51.917657 4739 generic.go:334] "Generic (PLEG): container finished" podID="9707b708-016c-4e06-86db-0332e2ca37db" containerID="ef2588b0bb234b34c79c8ff569837da6abbcb63d53a58f2ae5f4cde5f6ddd2c2" exitCode=0 Oct 08 21:58:51 crc kubenswrapper[4739]: I1008 21:58:51.917835 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerDied","Data":"ef2588b0bb234b34c79c8ff569837da6abbcb63d53a58f2ae5f4cde5f6ddd2c2"} Oct 08 21:58:51 crc kubenswrapper[4739]: I1008 21:58:51.918109 4739 scope.go:117] "RemoveContainer" containerID="439f72fd59a864ed357fc655cbdfd636be4f725b2c076db55d4b40db6172c69e" Oct 08 21:58:52 crc kubenswrapper[4739]: I1008 21:58:52.927170 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerStarted","Data":"c9993989722d5a6736d9a76651861a3541ac4d181be8e64c84d138a4526b99c8"} Oct 08 21:59:05 crc kubenswrapper[4739]: I1008 21:59:05.327486 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf"] Oct 08 21:59:05 crc kubenswrapper[4739]: E1008 21:59:05.328229 4739 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="595d6e92-80ce-40bf-8409-b50226a672ab" containerName="registry" Oct 08 21:59:05 crc kubenswrapper[4739]: I1008 21:59:05.328242 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="595d6e92-80ce-40bf-8409-b50226a672ab" containerName="registry" Oct 08 21:59:05 crc kubenswrapper[4739]: I1008 21:59:05.328329 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="595d6e92-80ce-40bf-8409-b50226a672ab" containerName="registry" Oct 08 21:59:05 crc kubenswrapper[4739]: I1008 21:59:05.328980 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf" Oct 08 21:59:05 crc kubenswrapper[4739]: I1008 21:59:05.331124 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 08 21:59:05 crc kubenswrapper[4739]: I1008 21:59:05.338129 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf"] Oct 08 21:59:05 crc kubenswrapper[4739]: I1008 21:59:05.424960 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/174d772a-ebc6-46bf-ab5f-02cdc6564283-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf\" (UID: \"174d772a-ebc6-46bf-ab5f-02cdc6564283\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf" Oct 08 21:59:05 crc kubenswrapper[4739]: I1008 21:59:05.425008 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/174d772a-ebc6-46bf-ab5f-02cdc6564283-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf\" (UID: \"174d772a-ebc6-46bf-ab5f-02cdc6564283\") " 
pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf" Oct 08 21:59:05 crc kubenswrapper[4739]: I1008 21:59:05.425054 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb6dg\" (UniqueName: \"kubernetes.io/projected/174d772a-ebc6-46bf-ab5f-02cdc6564283-kube-api-access-vb6dg\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf\" (UID: \"174d772a-ebc6-46bf-ab5f-02cdc6564283\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf" Oct 08 21:59:05 crc kubenswrapper[4739]: I1008 21:59:05.526489 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb6dg\" (UniqueName: \"kubernetes.io/projected/174d772a-ebc6-46bf-ab5f-02cdc6564283-kube-api-access-vb6dg\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf\" (UID: \"174d772a-ebc6-46bf-ab5f-02cdc6564283\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf" Oct 08 21:59:05 crc kubenswrapper[4739]: I1008 21:59:05.526590 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/174d772a-ebc6-46bf-ab5f-02cdc6564283-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf\" (UID: \"174d772a-ebc6-46bf-ab5f-02cdc6564283\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf" Oct 08 21:59:05 crc kubenswrapper[4739]: I1008 21:59:05.526612 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/174d772a-ebc6-46bf-ab5f-02cdc6564283-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf\" (UID: \"174d772a-ebc6-46bf-ab5f-02cdc6564283\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf" Oct 08 
21:59:05 crc kubenswrapper[4739]: I1008 21:59:05.527035 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/174d772a-ebc6-46bf-ab5f-02cdc6564283-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf\" (UID: \"174d772a-ebc6-46bf-ab5f-02cdc6564283\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf" Oct 08 21:59:05 crc kubenswrapper[4739]: I1008 21:59:05.527092 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/174d772a-ebc6-46bf-ab5f-02cdc6564283-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf\" (UID: \"174d772a-ebc6-46bf-ab5f-02cdc6564283\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf" Oct 08 21:59:05 crc kubenswrapper[4739]: I1008 21:59:05.547779 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb6dg\" (UniqueName: \"kubernetes.io/projected/174d772a-ebc6-46bf-ab5f-02cdc6564283-kube-api-access-vb6dg\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf\" (UID: \"174d772a-ebc6-46bf-ab5f-02cdc6564283\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf" Oct 08 21:59:05 crc kubenswrapper[4739]: I1008 21:59:05.697079 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf" Oct 08 21:59:06 crc kubenswrapper[4739]: I1008 21:59:06.107945 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf"] Oct 08 21:59:07 crc kubenswrapper[4739]: I1008 21:59:07.011580 4739 generic.go:334] "Generic (PLEG): container finished" podID="174d772a-ebc6-46bf-ab5f-02cdc6564283" containerID="8f3f05dace23e2e0bffc646f8876f6047a43050a343668b868edb21d5e12f728" exitCode=0 Oct 08 21:59:07 crc kubenswrapper[4739]: I1008 21:59:07.011651 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf" event={"ID":"174d772a-ebc6-46bf-ab5f-02cdc6564283","Type":"ContainerDied","Data":"8f3f05dace23e2e0bffc646f8876f6047a43050a343668b868edb21d5e12f728"} Oct 08 21:59:07 crc kubenswrapper[4739]: I1008 21:59:07.012016 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf" event={"ID":"174d772a-ebc6-46bf-ab5f-02cdc6564283","Type":"ContainerStarted","Data":"335e3af5b734672ebd85256989ee71aa2decd6ce27199deba4b3707dafde5730"} Oct 08 21:59:07 crc kubenswrapper[4739]: I1008 21:59:07.015243 4739 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 21:59:09 crc kubenswrapper[4739]: I1008 21:59:09.024205 4739 generic.go:334] "Generic (PLEG): container finished" podID="174d772a-ebc6-46bf-ab5f-02cdc6564283" containerID="72fefc15dea7e05656fa6215791328c208a7265522fae984aaf04b118253d86a" exitCode=0 Oct 08 21:59:09 crc kubenswrapper[4739]: I1008 21:59:09.024311 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf" 
event={"ID":"174d772a-ebc6-46bf-ab5f-02cdc6564283","Type":"ContainerDied","Data":"72fefc15dea7e05656fa6215791328c208a7265522fae984aaf04b118253d86a"} Oct 08 21:59:10 crc kubenswrapper[4739]: I1008 21:59:10.032680 4739 generic.go:334] "Generic (PLEG): container finished" podID="174d772a-ebc6-46bf-ab5f-02cdc6564283" containerID="53f422aba0efa175d3628af0b6cb79133167e2b40dbe03db19b96680a9b0bf19" exitCode=0 Oct 08 21:59:10 crc kubenswrapper[4739]: I1008 21:59:10.032765 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf" event={"ID":"174d772a-ebc6-46bf-ab5f-02cdc6564283","Type":"ContainerDied","Data":"53f422aba0efa175d3628af0b6cb79133167e2b40dbe03db19b96680a9b0bf19"} Oct 08 21:59:11 crc kubenswrapper[4739]: I1008 21:59:11.226294 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf" Oct 08 21:59:11 crc kubenswrapper[4739]: I1008 21:59:11.304056 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/174d772a-ebc6-46bf-ab5f-02cdc6564283-bundle\") pod \"174d772a-ebc6-46bf-ab5f-02cdc6564283\" (UID: \"174d772a-ebc6-46bf-ab5f-02cdc6564283\") " Oct 08 21:59:11 crc kubenswrapper[4739]: I1008 21:59:11.304167 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vb6dg\" (UniqueName: \"kubernetes.io/projected/174d772a-ebc6-46bf-ab5f-02cdc6564283-kube-api-access-vb6dg\") pod \"174d772a-ebc6-46bf-ab5f-02cdc6564283\" (UID: \"174d772a-ebc6-46bf-ab5f-02cdc6564283\") " Oct 08 21:59:11 crc kubenswrapper[4739]: I1008 21:59:11.304203 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/174d772a-ebc6-46bf-ab5f-02cdc6564283-util\") pod \"174d772a-ebc6-46bf-ab5f-02cdc6564283\" (UID: 
\"174d772a-ebc6-46bf-ab5f-02cdc6564283\") " Oct 08 21:59:11 crc kubenswrapper[4739]: I1008 21:59:11.307581 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/174d772a-ebc6-46bf-ab5f-02cdc6564283-bundle" (OuterVolumeSpecName: "bundle") pod "174d772a-ebc6-46bf-ab5f-02cdc6564283" (UID: "174d772a-ebc6-46bf-ab5f-02cdc6564283"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:59:11 crc kubenswrapper[4739]: I1008 21:59:11.311604 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/174d772a-ebc6-46bf-ab5f-02cdc6564283-kube-api-access-vb6dg" (OuterVolumeSpecName: "kube-api-access-vb6dg") pod "174d772a-ebc6-46bf-ab5f-02cdc6564283" (UID: "174d772a-ebc6-46bf-ab5f-02cdc6564283"). InnerVolumeSpecName "kube-api-access-vb6dg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:59:11 crc kubenswrapper[4739]: I1008 21:59:11.321379 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/174d772a-ebc6-46bf-ab5f-02cdc6564283-util" (OuterVolumeSpecName: "util") pod "174d772a-ebc6-46bf-ab5f-02cdc6564283" (UID: "174d772a-ebc6-46bf-ab5f-02cdc6564283"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 21:59:11 crc kubenswrapper[4739]: I1008 21:59:11.405973 4739 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/174d772a-ebc6-46bf-ab5f-02cdc6564283-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 21:59:11 crc kubenswrapper[4739]: I1008 21:59:11.406010 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vb6dg\" (UniqueName: \"kubernetes.io/projected/174d772a-ebc6-46bf-ab5f-02cdc6564283-kube-api-access-vb6dg\") on node \"crc\" DevicePath \"\"" Oct 08 21:59:11 crc kubenswrapper[4739]: I1008 21:59:11.406025 4739 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/174d772a-ebc6-46bf-ab5f-02cdc6564283-util\") on node \"crc\" DevicePath \"\"" Oct 08 21:59:12 crc kubenswrapper[4739]: I1008 21:59:12.045649 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf" event={"ID":"174d772a-ebc6-46bf-ab5f-02cdc6564283","Type":"ContainerDied","Data":"335e3af5b734672ebd85256989ee71aa2decd6ce27199deba4b3707dafde5730"} Oct 08 21:59:12 crc kubenswrapper[4739]: I1008 21:59:12.045759 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="335e3af5b734672ebd85256989ee71aa2decd6ce27199deba4b3707dafde5730" Oct 08 21:59:12 crc kubenswrapper[4739]: I1008 21:59:12.045690 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf" Oct 08 21:59:16 crc kubenswrapper[4739]: I1008 21:59:16.975985 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hfhrc"] Oct 08 21:59:16 crc kubenswrapper[4739]: I1008 21:59:16.977122 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="ovn-controller" containerID="cri-o://43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753" gracePeriod=30 Oct 08 21:59:16 crc kubenswrapper[4739]: I1008 21:59:16.977205 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42" gracePeriod=30 Oct 08 21:59:16 crc kubenswrapper[4739]: I1008 21:59:16.977207 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="nbdb" containerID="cri-o://f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a" gracePeriod=30 Oct 08 21:59:16 crc kubenswrapper[4739]: I1008 21:59:16.977330 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="northd" containerID="cri-o://d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf" gracePeriod=30 Oct 08 21:59:16 crc kubenswrapper[4739]: I1008 21:59:16.977435 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" 
containerName="kube-rbac-proxy-node" containerID="cri-o://e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6" gracePeriod=30 Oct 08 21:59:16 crc kubenswrapper[4739]: I1008 21:59:16.977413 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="ovn-acl-logging" containerID="cri-o://a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2" gracePeriod=30 Oct 08 21:59:16 crc kubenswrapper[4739]: I1008 21:59:16.977418 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="sbdb" containerID="cri-o://6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e" gracePeriod=30 Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.022881 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="ovnkube-controller" containerID="cri-o://c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5" gracePeriod=30 Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.086524 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wwt88_17ed1d5a-5f21-4dcf-bdb9-09e715f57027/kube-multus/2.log" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.087719 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wwt88_17ed1d5a-5f21-4dcf-bdb9-09e715f57027/kube-multus/1.log" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.087795 4739 generic.go:334] "Generic (PLEG): container finished" podID="17ed1d5a-5f21-4dcf-bdb9-09e715f57027" containerID="93b79eb889387eed738d5f03a13377c9974599710eef3592e8e0024458f11d88" exitCode=2 Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.087852 4739 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wwt88" event={"ID":"17ed1d5a-5f21-4dcf-bdb9-09e715f57027","Type":"ContainerDied","Data":"93b79eb889387eed738d5f03a13377c9974599710eef3592e8e0024458f11d88"} Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.087915 4739 scope.go:117] "RemoveContainer" containerID="9a3ec9cc2ce1e0c0c740753d759e9f091e402d73cb1d4f896fe843f9bfb805ea" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.088954 4739 scope.go:117] "RemoveContainer" containerID="93b79eb889387eed738d5f03a13377c9974599710eef3592e8e0024458f11d88" Oct 08 21:59:17 crc kubenswrapper[4739]: E1008 21:59:17.089290 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-wwt88_openshift-multus(17ed1d5a-5f21-4dcf-bdb9-09e715f57027)\"" pod="openshift-multus/multus-wwt88" podUID="17ed1d5a-5f21-4dcf-bdb9-09e715f57027" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.359727 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hfhrc_4c6641d9-9ccf-42aa-8a83-c52d850aa766/ovnkube-controller/3.log" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.362070 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hfhrc_4c6641d9-9ccf-42aa-8a83-c52d850aa766/ovn-acl-logging/0.log" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.362549 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hfhrc_4c6641d9-9ccf-42aa-8a83-c52d850aa766/ovn-controller/0.log" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.362932 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.480676 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-z9qjt"] Oct 08 21:59:17 crc kubenswrapper[4739]: E1008 21:59:17.480862 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="174d772a-ebc6-46bf-ab5f-02cdc6564283" containerName="extract" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.480874 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="174d772a-ebc6-46bf-ab5f-02cdc6564283" containerName="extract" Oct 08 21:59:17 crc kubenswrapper[4739]: E1008 21:59:17.480882 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="ovnkube-controller" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.480888 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="ovnkube-controller" Oct 08 21:59:17 crc kubenswrapper[4739]: E1008 21:59:17.480898 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="ovnkube-controller" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.480903 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="ovnkube-controller" Oct 08 21:59:17 crc kubenswrapper[4739]: E1008 21:59:17.480910 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="kube-rbac-proxy-ovn-metrics" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.480915 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="kube-rbac-proxy-ovn-metrics" Oct 08 21:59:17 crc kubenswrapper[4739]: E1008 21:59:17.480923 4739 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="ovn-controller" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.480930 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="ovn-controller" Oct 08 21:59:17 crc kubenswrapper[4739]: E1008 21:59:17.480937 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="northd" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.480943 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="northd" Oct 08 21:59:17 crc kubenswrapper[4739]: E1008 21:59:17.480950 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="ovn-acl-logging" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.480956 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="ovn-acl-logging" Oct 08 21:59:17 crc kubenswrapper[4739]: E1008 21:59:17.480962 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="nbdb" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.480968 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="nbdb" Oct 08 21:59:17 crc kubenswrapper[4739]: E1008 21:59:17.480980 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="174d772a-ebc6-46bf-ab5f-02cdc6564283" containerName="pull" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.480985 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="174d772a-ebc6-46bf-ab5f-02cdc6564283" containerName="pull" Oct 08 21:59:17 crc kubenswrapper[4739]: E1008 21:59:17.480994 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" 
containerName="ovnkube-controller" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.480999 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="ovnkube-controller" Oct 08 21:59:17 crc kubenswrapper[4739]: E1008 21:59:17.481007 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="sbdb" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.481012 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="sbdb" Oct 08 21:59:17 crc kubenswrapper[4739]: E1008 21:59:17.481020 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="ovnkube-controller" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.481026 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="ovnkube-controller" Oct 08 21:59:17 crc kubenswrapper[4739]: E1008 21:59:17.481035 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="kubecfg-setup" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.481040 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="kubecfg-setup" Oct 08 21:59:17 crc kubenswrapper[4739]: E1008 21:59:17.481048 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="174d772a-ebc6-46bf-ab5f-02cdc6564283" containerName="util" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.481055 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="174d772a-ebc6-46bf-ab5f-02cdc6564283" containerName="util" Oct 08 21:59:17 crc kubenswrapper[4739]: E1008 21:59:17.481062 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="kube-rbac-proxy-node" Oct 08 
21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.481067 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="kube-rbac-proxy-node" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.481168 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="northd" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.481179 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="sbdb" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.481187 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="kube-rbac-proxy-ovn-metrics" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.481194 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="ovnkube-controller" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.481202 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="ovnkube-controller" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.481211 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="ovnkube-controller" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.481218 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="ovn-acl-logging" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.481226 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="ovn-controller" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.481233 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" 
containerName="nbdb" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.481241 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="kube-rbac-proxy-node" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.481248 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="ovnkube-controller" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.481256 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="174d772a-ebc6-46bf-ab5f-02cdc6564283" containerName="extract" Oct 08 21:59:17 crc kubenswrapper[4739]: E1008 21:59:17.481339 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="ovnkube-controller" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.481346 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="ovnkube-controller" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.481424 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerName="ovnkube-controller" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.482814 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500237 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-run-ovn-kubernetes\") pod \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500282 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-var-lib-cni-networks-ovn-kubernetes\") pod \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500314 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4c6641d9-9ccf-42aa-8a83-c52d850aa766-ovnkube-script-lib\") pod \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500350 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-run-systemd\") pod \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500369 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-node-log\") pod \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500377 4739 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "4c6641d9-9ccf-42aa-8a83-c52d850aa766" (UID: "4c6641d9-9ccf-42aa-8a83-c52d850aa766"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500396 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4c6641d9-9ccf-42aa-8a83-c52d850aa766-env-overrides\") pod \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500481 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-systemd-units\") pod \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500522 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4c6641d9-9ccf-42aa-8a83-c52d850aa766-ovnkube-config\") pod \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500546 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-var-lib-openvswitch\") pod \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500564 4739 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-cni-bin\") pod \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500557 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-node-log" (OuterVolumeSpecName: "node-log") pod "4c6641d9-9ccf-42aa-8a83-c52d850aa766" (UID: "4c6641d9-9ccf-42aa-8a83-c52d850aa766"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500590 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-run-openvswitch\") pod \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500611 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-log-socket\") pod \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500619 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "4c6641d9-9ccf-42aa-8a83-c52d850aa766" (UID: "4c6641d9-9ccf-42aa-8a83-c52d850aa766"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500643 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "4c6641d9-9ccf-42aa-8a83-c52d850aa766" (UID: "4c6641d9-9ccf-42aa-8a83-c52d850aa766"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500644 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-run-ovn\") pod \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500650 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "4c6641d9-9ccf-42aa-8a83-c52d850aa766" (UID: "4c6641d9-9ccf-42aa-8a83-c52d850aa766"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500693 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "4c6641d9-9ccf-42aa-8a83-c52d850aa766" (UID: "4c6641d9-9ccf-42aa-8a83-c52d850aa766"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500674 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-kubelet\") pod \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500686 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "4c6641d9-9ccf-42aa-8a83-c52d850aa766" (UID: "4c6641d9-9ccf-42aa-8a83-c52d850aa766"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500725 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4c6641d9-9ccf-42aa-8a83-c52d850aa766-ovn-node-metrics-cert\") pod \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500747 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrwfj\" (UniqueName: \"kubernetes.io/projected/4c6641d9-9ccf-42aa-8a83-c52d850aa766-kube-api-access-rrwfj\") pod \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500663 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-log-socket" (OuterVolumeSpecName: "log-socket") pod "4c6641d9-9ccf-42aa-8a83-c52d850aa766" (UID: "4c6641d9-9ccf-42aa-8a83-c52d850aa766"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500766 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-slash\") pod \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500731 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "4c6641d9-9ccf-42aa-8a83-c52d850aa766" (UID: "4c6641d9-9ccf-42aa-8a83-c52d850aa766"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500782 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-slash" (OuterVolumeSpecName: "host-slash") pod "4c6641d9-9ccf-42aa-8a83-c52d850aa766" (UID: "4c6641d9-9ccf-42aa-8a83-c52d850aa766"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500848 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-cni-netd\") pod \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500865 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "4c6641d9-9ccf-42aa-8a83-c52d850aa766" (UID: "4c6641d9-9ccf-42aa-8a83-c52d850aa766"). 
InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500872 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-run-netns\") pod \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500893 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "4c6641d9-9ccf-42aa-8a83-c52d850aa766" (UID: "4c6641d9-9ccf-42aa-8a83-c52d850aa766"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500952 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c6641d9-9ccf-42aa-8a83-c52d850aa766-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "4c6641d9-9ccf-42aa-8a83-c52d850aa766" (UID: "4c6641d9-9ccf-42aa-8a83-c52d850aa766"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500967 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c6641d9-9ccf-42aa-8a83-c52d850aa766-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "4c6641d9-9ccf-42aa-8a83-c52d850aa766" (UID: "4c6641d9-9ccf-42aa-8a83-c52d850aa766"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.500988 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-etc-openvswitch\") pod \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\" (UID: \"4c6641d9-9ccf-42aa-8a83-c52d850aa766\") " Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.501041 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c6641d9-9ccf-42aa-8a83-c52d850aa766-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "4c6641d9-9ccf-42aa-8a83-c52d850aa766" (UID: "4c6641d9-9ccf-42aa-8a83-c52d850aa766"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.501069 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "4c6641d9-9ccf-42aa-8a83-c52d850aa766" (UID: "4c6641d9-9ccf-42aa-8a83-c52d850aa766"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.501513 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "4c6641d9-9ccf-42aa-8a83-c52d850aa766" (UID: "4c6641d9-9ccf-42aa-8a83-c52d850aa766"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.501565 4739 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.501581 4739 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.501596 4739 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-slash\") on node \"crc\" DevicePath \"\"" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.501606 4739 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.501616 4739 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.501626 4739 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.501636 4739 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4c6641d9-9ccf-42aa-8a83-c52d850aa766-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.501647 4739 
reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.501661 4739 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-node-log\") on node \"crc\" DevicePath \"\"" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.501672 4739 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4c6641d9-9ccf-42aa-8a83-c52d850aa766-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.501683 4739 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.501694 4739 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4c6641d9-9ccf-42aa-8a83-c52d850aa766-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.501703 4739 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.501713 4739 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.501722 4739 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.501732 4739 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-log-socket\") on node \"crc\" DevicePath \"\"" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.513772 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c6641d9-9ccf-42aa-8a83-c52d850aa766-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "4c6641d9-9ccf-42aa-8a83-c52d850aa766" (UID: "4c6641d9-9ccf-42aa-8a83-c52d850aa766"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.520398 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c6641d9-9ccf-42aa-8a83-c52d850aa766-kube-api-access-rrwfj" (OuterVolumeSpecName: "kube-api-access-rrwfj") pod "4c6641d9-9ccf-42aa-8a83-c52d850aa766" (UID: "4c6641d9-9ccf-42aa-8a83-c52d850aa766"). InnerVolumeSpecName "kube-api-access-rrwfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.532664 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "4c6641d9-9ccf-42aa-8a83-c52d850aa766" (UID: "4c6641d9-9ccf-42aa-8a83-c52d850aa766"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.602971 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-systemd-units\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.603041 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhj9t\" (UniqueName: \"kubernetes.io/projected/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-kube-api-access-vhj9t\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.603073 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-host-cni-netd\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.603095 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-ovnkube-config\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.603114 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-host-run-ovn-kubernetes\") pod \"ovnkube-node-z9qjt\" 
(UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.603137 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-host-kubelet\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.603175 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-ovn-node-metrics-cert\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.603222 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-node-log\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.603242 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-host-run-netns\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.603268 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-var-lib-openvswitch\") pod 
\"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.603286 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-run-openvswitch\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.603313 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-run-systemd\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.603348 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-run-ovn\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.603371 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.603394 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-env-overrides\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.603418 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-host-slash\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.603435 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-log-socket\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.603452 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-etc-openvswitch\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.603470 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-ovnkube-script-lib\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.603495 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-host-cni-bin\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.603546 4739 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4c6641d9-9ccf-42aa-8a83-c52d850aa766-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.603561 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrwfj\" (UniqueName: \"kubernetes.io/projected/4c6641d9-9ccf-42aa-8a83-c52d850aa766-kube-api-access-rrwfj\") on node \"crc\" DevicePath \"\"" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.603577 4739 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.603591 4739 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4c6641d9-9ccf-42aa-8a83-c52d850aa766-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.704453 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-etc-openvswitch\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.704511 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-ovnkube-script-lib\") pod 
\"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.704539 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-host-cni-bin\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.704578 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-systemd-units\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.704586 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-etc-openvswitch\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.704605 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhj9t\" (UniqueName: \"kubernetes.io/projected/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-kube-api-access-vhj9t\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.704681 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-systemd-units\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.704695 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-host-cni-netd\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.704719 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-host-cni-netd\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.704725 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-ovnkube-config\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.704706 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-host-cni-bin\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.704752 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-host-run-ovn-kubernetes\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc 
kubenswrapper[4739]: I1008 21:59:17.704827 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-host-kubelet\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.704871 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-ovn-node-metrics-cert\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.704921 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-node-log\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.704939 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-host-run-netns\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.704954 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-var-lib-openvswitch\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.704971 4739 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-run-openvswitch\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.705032 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-run-systemd\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.705069 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-run-ovn\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.705078 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-node-log\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.705089 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.705116 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.705130 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-env-overrides\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.705161 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-host-run-netns\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.705187 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-var-lib-openvswitch\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.705195 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-host-slash\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.705210 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-run-openvswitch\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.705219 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-log-socket\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.705234 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-run-systemd\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.704776 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-host-run-ovn-kubernetes\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.705291 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-host-kubelet\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.705334 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-log-socket\") pod \"ovnkube-node-z9qjt\" (UID: 
\"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.705362 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-host-slash\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.705383 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-run-ovn\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.705636 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-ovnkube-config\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.705739 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-ovnkube-script-lib\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.705896 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-env-overrides\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: 
I1008 21:59:17.708716 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-ovn-node-metrics-cert\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.727708 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhj9t\" (UniqueName: \"kubernetes.io/projected/cebaab22-7e46-4c0c-be86-4f6d53ee35b1-kube-api-access-vhj9t\") pod \"ovnkube-node-z9qjt\" (UID: \"cebaab22-7e46-4c0c-be86-4f6d53ee35b1\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:17 crc kubenswrapper[4739]: I1008 21:59:17.797867 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.093027 4739 generic.go:334] "Generic (PLEG): container finished" podID="cebaab22-7e46-4c0c-be86-4f6d53ee35b1" containerID="386567e5cb36b7dd82e46c672b0428d8d05515195b55955caeda964c135491ba" exitCode=0 Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.093083 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" event={"ID":"cebaab22-7e46-4c0c-be86-4f6d53ee35b1","Type":"ContainerDied","Data":"386567e5cb36b7dd82e46c672b0428d8d05515195b55955caeda964c135491ba"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.093108 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" event={"ID":"cebaab22-7e46-4c0c-be86-4f6d53ee35b1","Type":"ContainerStarted","Data":"7d594100567d6a7316332aa25d71be18d49f3502b03cabc517cc746f2afaf42c"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.095333 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-wwt88_17ed1d5a-5f21-4dcf-bdb9-09e715f57027/kube-multus/2.log" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.097464 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hfhrc_4c6641d9-9ccf-42aa-8a83-c52d850aa766/ovnkube-controller/3.log" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.100528 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hfhrc_4c6641d9-9ccf-42aa-8a83-c52d850aa766/ovn-acl-logging/0.log" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101136 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hfhrc_4c6641d9-9ccf-42aa-8a83-c52d850aa766/ovn-controller/0.log" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101473 4739 generic.go:334] "Generic (PLEG): container finished" podID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerID="c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5" exitCode=0 Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101492 4739 generic.go:334] "Generic (PLEG): container finished" podID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerID="6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e" exitCode=0 Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101504 4739 generic.go:334] "Generic (PLEG): container finished" podID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerID="f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a" exitCode=0 Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101514 4739 generic.go:334] "Generic (PLEG): container finished" podID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerID="d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf" exitCode=0 Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101521 4739 generic.go:334] "Generic (PLEG): container finished" podID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" 
containerID="a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42" exitCode=0 Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101527 4739 generic.go:334] "Generic (PLEG): container finished" podID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerID="e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6" exitCode=0 Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101533 4739 generic.go:334] "Generic (PLEG): container finished" podID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerID="a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2" exitCode=143 Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101540 4739 generic.go:334] "Generic (PLEG): container finished" podID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" containerID="43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753" exitCode=143 Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101555 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" event={"ID":"4c6641d9-9ccf-42aa-8a83-c52d850aa766","Type":"ContainerDied","Data":"c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101571 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101583 4739 scope.go:117] "RemoveContainer" containerID="c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101573 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" event={"ID":"4c6641d9-9ccf-42aa-8a83-c52d850aa766","Type":"ContainerDied","Data":"6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101712 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" event={"ID":"4c6641d9-9ccf-42aa-8a83-c52d850aa766","Type":"ContainerDied","Data":"f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101726 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" event={"ID":"4c6641d9-9ccf-42aa-8a83-c52d850aa766","Type":"ContainerDied","Data":"d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101737 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" event={"ID":"4c6641d9-9ccf-42aa-8a83-c52d850aa766","Type":"ContainerDied","Data":"a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101747 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" event={"ID":"4c6641d9-9ccf-42aa-8a83-c52d850aa766","Type":"ContainerDied","Data":"e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101757 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101766 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101778 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101785 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101791 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101797 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101802 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101807 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101812 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101819 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" event={"ID":"4c6641d9-9ccf-42aa-8a83-c52d850aa766","Type":"ContainerDied","Data":"a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101827 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101833 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101838 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101843 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101849 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101854 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101859 4739 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101864 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101868 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101873 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101881 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" event={"ID":"4c6641d9-9ccf-42aa-8a83-c52d850aa766","Type":"ContainerDied","Data":"43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101888 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101893 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101898 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e"} Oct 08 
21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101904 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101908 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101913 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101918 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101922 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101927 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101932 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101939 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfhrc" 
event={"ID":"4c6641d9-9ccf-42aa-8a83-c52d850aa766","Type":"ContainerDied","Data":"5ac3717ac6722e968798dbac2846896abeebcd93e652c4a6f093503b0c023137"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101946 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101951 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101956 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101961 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101965 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101970 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101974 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101979 4739 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101984 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.101989 4739 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b"} Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.155368 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hfhrc"] Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.163975 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hfhrc"] Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.164852 4739 scope.go:117] "RemoveContainer" containerID="1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.198820 4739 scope.go:117] "RemoveContainer" containerID="6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.226839 4739 scope.go:117] "RemoveContainer" containerID="f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.240084 4739 scope.go:117] "RemoveContainer" containerID="d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.256098 4739 scope.go:117] "RemoveContainer" containerID="a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.280184 4739 scope.go:117] "RemoveContainer" 
containerID="e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.311268 4739 scope.go:117] "RemoveContainer" containerID="a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.343975 4739 scope.go:117] "RemoveContainer" containerID="43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.366398 4739 scope.go:117] "RemoveContainer" containerID="5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.398343 4739 scope.go:117] "RemoveContainer" containerID="c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5" Oct 08 21:59:18 crc kubenswrapper[4739]: E1008 21:59:18.398811 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5\": container with ID starting with c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5 not found: ID does not exist" containerID="c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.398862 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5"} err="failed to get container status \"c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5\": rpc error: code = NotFound desc = could not find container \"c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5\": container with ID starting with c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5 not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.398888 4739 scope.go:117] "RemoveContainer" 
containerID="1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a" Oct 08 21:59:18 crc kubenswrapper[4739]: E1008 21:59:18.402554 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a\": container with ID starting with 1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a not found: ID does not exist" containerID="1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.402615 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a"} err="failed to get container status \"1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a\": rpc error: code = NotFound desc = could not find container \"1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a\": container with ID starting with 1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.402641 4739 scope.go:117] "RemoveContainer" containerID="6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e" Oct 08 21:59:18 crc kubenswrapper[4739]: E1008 21:59:18.402890 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e\": container with ID starting with 6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e not found: ID does not exist" containerID="6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.402918 4739 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e"} err="failed to get container status \"6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e\": rpc error: code = NotFound desc = could not find container \"6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e\": container with ID starting with 6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.402935 4739 scope.go:117] "RemoveContainer" containerID="f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a" Oct 08 21:59:18 crc kubenswrapper[4739]: E1008 21:59:18.403167 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a\": container with ID starting with f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a not found: ID does not exist" containerID="f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.403191 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a"} err="failed to get container status \"f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a\": rpc error: code = NotFound desc = could not find container \"f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a\": container with ID starting with f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.403208 4739 scope.go:117] "RemoveContainer" containerID="d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf" Oct 08 21:59:18 crc kubenswrapper[4739]: E1008 21:59:18.403683 4739 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf\": container with ID starting with d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf not found: ID does not exist" containerID="d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.403707 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf"} err="failed to get container status \"d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf\": rpc error: code = NotFound desc = could not find container \"d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf\": container with ID starting with d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.403720 4739 scope.go:117] "RemoveContainer" containerID="a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42" Oct 08 21:59:18 crc kubenswrapper[4739]: E1008 21:59:18.403998 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42\": container with ID starting with a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42 not found: ID does not exist" containerID="a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.404024 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42"} err="failed to get container status \"a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42\": rpc error: code = NotFound desc = could not find container 
\"a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42\": container with ID starting with a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42 not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.404039 4739 scope.go:117] "RemoveContainer" containerID="e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6" Oct 08 21:59:18 crc kubenswrapper[4739]: E1008 21:59:18.404281 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6\": container with ID starting with e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6 not found: ID does not exist" containerID="e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.404308 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6"} err="failed to get container status \"e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6\": rpc error: code = NotFound desc = could not find container \"e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6\": container with ID starting with e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6 not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.404325 4739 scope.go:117] "RemoveContainer" containerID="a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2" Oct 08 21:59:18 crc kubenswrapper[4739]: E1008 21:59:18.405247 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2\": container with ID starting with a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2 not found: ID does not exist" 
containerID="a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.405439 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2"} err="failed to get container status \"a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2\": rpc error: code = NotFound desc = could not find container \"a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2\": container with ID starting with a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2 not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.405462 4739 scope.go:117] "RemoveContainer" containerID="43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753" Oct 08 21:59:18 crc kubenswrapper[4739]: E1008 21:59:18.409600 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753\": container with ID starting with 43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753 not found: ID does not exist" containerID="43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.409636 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753"} err="failed to get container status \"43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753\": rpc error: code = NotFound desc = could not find container \"43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753\": container with ID starting with 43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753 not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.409653 4739 scope.go:117] 
"RemoveContainer" containerID="5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b" Oct 08 21:59:18 crc kubenswrapper[4739]: E1008 21:59:18.410880 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\": container with ID starting with 5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b not found: ID does not exist" containerID="5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.410903 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b"} err="failed to get container status \"5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\": rpc error: code = NotFound desc = could not find container \"5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\": container with ID starting with 5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.410919 4739 scope.go:117] "RemoveContainer" containerID="c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.413499 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5"} err="failed to get container status \"c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5\": rpc error: code = NotFound desc = could not find container \"c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5\": container with ID starting with c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5 not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.413540 4739 
scope.go:117] "RemoveContainer" containerID="1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.414244 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a"} err="failed to get container status \"1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a\": rpc error: code = NotFound desc = could not find container \"1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a\": container with ID starting with 1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.414283 4739 scope.go:117] "RemoveContainer" containerID="6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.417421 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e"} err="failed to get container status \"6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e\": rpc error: code = NotFound desc = could not find container \"6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e\": container with ID starting with 6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.417444 4739 scope.go:117] "RemoveContainer" containerID="f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.417658 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a"} err="failed to get container status \"f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a\": rpc 
error: code = NotFound desc = could not find container \"f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a\": container with ID starting with f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.417681 4739 scope.go:117] "RemoveContainer" containerID="d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.418122 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf"} err="failed to get container status \"d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf\": rpc error: code = NotFound desc = could not find container \"d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf\": container with ID starting with d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.418181 4739 scope.go:117] "RemoveContainer" containerID="a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.418503 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42"} err="failed to get container status \"a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42\": rpc error: code = NotFound desc = could not find container \"a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42\": container with ID starting with a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42 not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.418524 4739 scope.go:117] "RemoveContainer" containerID="e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6" Oct 08 21:59:18 crc 
kubenswrapper[4739]: I1008 21:59:18.418711 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6"} err="failed to get container status \"e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6\": rpc error: code = NotFound desc = could not find container \"e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6\": container with ID starting with e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6 not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.418732 4739 scope.go:117] "RemoveContainer" containerID="a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.418937 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2"} err="failed to get container status \"a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2\": rpc error: code = NotFound desc = could not find container \"a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2\": container with ID starting with a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2 not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.418959 4739 scope.go:117] "RemoveContainer" containerID="43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.419242 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753"} err="failed to get container status \"43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753\": rpc error: code = NotFound desc = could not find container \"43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753\": container 
with ID starting with 43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753 not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.419269 4739 scope.go:117] "RemoveContainer" containerID="5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.419468 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b"} err="failed to get container status \"5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\": rpc error: code = NotFound desc = could not find container \"5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\": container with ID starting with 5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.419488 4739 scope.go:117] "RemoveContainer" containerID="c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.419694 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5"} err="failed to get container status \"c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5\": rpc error: code = NotFound desc = could not find container \"c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5\": container with ID starting with c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5 not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.419713 4739 scope.go:117] "RemoveContainer" containerID="1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.419932 4739 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a"} err="failed to get container status \"1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a\": rpc error: code = NotFound desc = could not find container \"1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a\": container with ID starting with 1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.419951 4739 scope.go:117] "RemoveContainer" containerID="6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.420139 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e"} err="failed to get container status \"6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e\": rpc error: code = NotFound desc = could not find container \"6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e\": container with ID starting with 6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.420173 4739 scope.go:117] "RemoveContainer" containerID="f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.420382 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a"} err="failed to get container status \"f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a\": rpc error: code = NotFound desc = could not find container \"f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a\": container with ID starting with f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a not found: ID does not 
exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.420401 4739 scope.go:117] "RemoveContainer" containerID="d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.420586 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf"} err="failed to get container status \"d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf\": rpc error: code = NotFound desc = could not find container \"d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf\": container with ID starting with d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.420609 4739 scope.go:117] "RemoveContainer" containerID="a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.420830 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42"} err="failed to get container status \"a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42\": rpc error: code = NotFound desc = could not find container \"a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42\": container with ID starting with a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42 not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.420846 4739 scope.go:117] "RemoveContainer" containerID="e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.421041 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6"} err="failed to get container status 
\"e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6\": rpc error: code = NotFound desc = could not find container \"e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6\": container with ID starting with e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6 not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.421060 4739 scope.go:117] "RemoveContainer" containerID="a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.421305 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2"} err="failed to get container status \"a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2\": rpc error: code = NotFound desc = could not find container \"a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2\": container with ID starting with a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2 not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.421323 4739 scope.go:117] "RemoveContainer" containerID="43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.422129 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753"} err="failed to get container status \"43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753\": rpc error: code = NotFound desc = could not find container \"43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753\": container with ID starting with 43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753 not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.422181 4739 scope.go:117] "RemoveContainer" 
containerID="5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.422442 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b"} err="failed to get container status \"5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\": rpc error: code = NotFound desc = could not find container \"5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\": container with ID starting with 5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.422462 4739 scope.go:117] "RemoveContainer" containerID="c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.422708 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5"} err="failed to get container status \"c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5\": rpc error: code = NotFound desc = could not find container \"c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5\": container with ID starting with c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5 not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.422726 4739 scope.go:117] "RemoveContainer" containerID="1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.422993 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a"} err="failed to get container status \"1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a\": rpc error: code = NotFound desc = could 
not find container \"1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a\": container with ID starting with 1131dafe6ec787bf9ddfc9e2fd4151508818cdf9b33cc554e36a8bd0aa32400a not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.423015 4739 scope.go:117] "RemoveContainer" containerID="6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.423256 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e"} err="failed to get container status \"6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e\": rpc error: code = NotFound desc = could not find container \"6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e\": container with ID starting with 6e55b2e87b401f819372f8c4f73c5611dbbd9d15d4068bf5293bf51ce5f3dd7e not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.423275 4739 scope.go:117] "RemoveContainer" containerID="f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.423521 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a"} err="failed to get container status \"f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a\": rpc error: code = NotFound desc = could not find container \"f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a\": container with ID starting with f661a51bc69aab61fa61e3291714548648b911ae235799029e17cdd3990b339a not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.423538 4739 scope.go:117] "RemoveContainer" containerID="d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 
21:59:18.423746 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf"} err="failed to get container status \"d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf\": rpc error: code = NotFound desc = could not find container \"d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf\": container with ID starting with d0e2230f41ecdc117a6790110fb7b8e53b5aff8a382de4e7055824bd44977cbf not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.423762 4739 scope.go:117] "RemoveContainer" containerID="a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.423964 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42"} err="failed to get container status \"a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42\": rpc error: code = NotFound desc = could not find container \"a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42\": container with ID starting with a7e1f2d4d27c90f6b91a45d418eb72f17116fe58dd1a236ed95ac4b83770eb42 not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.423982 4739 scope.go:117] "RemoveContainer" containerID="e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.424189 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6"} err="failed to get container status \"e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6\": rpc error: code = NotFound desc = could not find container \"e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6\": container with ID starting with 
e20a59e35d800dd7d3527617d37855502fe8984042a3e10ae97f6a2e4769e6f6 not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.424206 4739 scope.go:117] "RemoveContainer" containerID="a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.424499 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2"} err="failed to get container status \"a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2\": rpc error: code = NotFound desc = could not find container \"a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2\": container with ID starting with a1e58f6538196dfd2e0d5d418280ee4c46d7a552902e664b684409bcf58ddcb2 not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.424516 4739 scope.go:117] "RemoveContainer" containerID="43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.424758 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753"} err="failed to get container status \"43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753\": rpc error: code = NotFound desc = could not find container \"43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753\": container with ID starting with 43ec60725848836360c23c4ae68271af42c298c01e5ced35c4a75174a30c2753 not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.424774 4739 scope.go:117] "RemoveContainer" containerID="5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.425021 4739 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b"} err="failed to get container status \"5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\": rpc error: code = NotFound desc = could not find container \"5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b\": container with ID starting with 5230d655fc91d8528b5b8b9b39e992e93b217b6e86fcb5fe9ef5793a84c5e42b not found: ID does not exist" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.425038 4739 scope.go:117] "RemoveContainer" containerID="c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5" Oct 08 21:59:18 crc kubenswrapper[4739]: I1008 21:59:18.425299 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5"} err="failed to get container status \"c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5\": rpc error: code = NotFound desc = could not find container \"c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5\": container with ID starting with c616abab131eedfb321967e9e1907708b72a0e840894c3790aa8edd96c04d0f5 not found: ID does not exist" Oct 08 21:59:19 crc kubenswrapper[4739]: I1008 21:59:19.106804 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" event={"ID":"cebaab22-7e46-4c0c-be86-4f6d53ee35b1","Type":"ContainerStarted","Data":"66f7f19cbabb709b552a7c86af91cc344fb15ad72dc8bf44b91b00e761a8942a"} Oct 08 21:59:19 crc kubenswrapper[4739]: I1008 21:59:19.107129 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" event={"ID":"cebaab22-7e46-4c0c-be86-4f6d53ee35b1","Type":"ContainerStarted","Data":"72e87c90d4598944f8dce869e8c4bfb2109b6839e0d3a6c4525eb0dd52f10b5a"} Oct 08 21:59:19 crc kubenswrapper[4739]: I1008 21:59:19.107140 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" event={"ID":"cebaab22-7e46-4c0c-be86-4f6d53ee35b1","Type":"ContainerStarted","Data":"59bfde44ff4a1e4b238728ca5fbcd5139d90e03ab99f490b14a07ca6a811f7ee"} Oct 08 21:59:19 crc kubenswrapper[4739]: I1008 21:59:19.107166 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" event={"ID":"cebaab22-7e46-4c0c-be86-4f6d53ee35b1","Type":"ContainerStarted","Data":"4f561ecf30c86da0e84b6f615cf14cffee6a81b1b324890beea3b738504d82c7"} Oct 08 21:59:19 crc kubenswrapper[4739]: I1008 21:59:19.107173 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" event={"ID":"cebaab22-7e46-4c0c-be86-4f6d53ee35b1","Type":"ContainerStarted","Data":"963a4b08b9147e4b57840bf454d9ded0aff5476180da609dabf15b23348b60b3"} Oct 08 21:59:19 crc kubenswrapper[4739]: I1008 21:59:19.107183 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" event={"ID":"cebaab22-7e46-4c0c-be86-4f6d53ee35b1","Type":"ContainerStarted","Data":"a45d1bc3d16a62d2a0b230a8a442d859c9f4587fe08e602a396f811a20fd6bd8"} Oct 08 21:59:19 crc kubenswrapper[4739]: I1008 21:59:19.827482 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c6641d9-9ccf-42aa-8a83-c52d850aa766" path="/var/lib/kubelet/pods/4c6641d9-9ccf-42aa-8a83-c52d850aa766/volumes" Oct 08 21:59:21 crc kubenswrapper[4739]: I1008 21:59:21.762011 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-7fkfg"] Oct 08 21:59:21 crc kubenswrapper[4739]: I1008 21:59:21.763204 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-7fkfg" Oct 08 21:59:21 crc kubenswrapper[4739]: I1008 21:59:21.764912 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Oct 08 21:59:21 crc kubenswrapper[4739]: I1008 21:59:21.764956 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-v785l" Oct 08 21:59:21 crc kubenswrapper[4739]: I1008 21:59:21.765257 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Oct 08 21:59:21 crc kubenswrapper[4739]: I1008 21:59:21.875922 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs"] Oct 08 21:59:21 crc kubenswrapper[4739]: I1008 21:59:21.877103 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs" Oct 08 21:59:21 crc kubenswrapper[4739]: I1008 21:59:21.879092 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Oct 08 21:59:21 crc kubenswrapper[4739]: I1008 21:59:21.879764 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-f7z44" Oct 08 21:59:21 crc kubenswrapper[4739]: I1008 21:59:21.885696 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd"] Oct 08 21:59:21 crc kubenswrapper[4739]: I1008 21:59:21.886473 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd" Oct 08 21:59:21 crc kubenswrapper[4739]: I1008 21:59:21.956051 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh7bf\" (UniqueName: \"kubernetes.io/projected/bbafdc6e-b606-4274-aebb-eb1d38bf693e-kube-api-access-bh7bf\") pod \"obo-prometheus-operator-7c8cf85677-7fkfg\" (UID: \"bbafdc6e-b606-4274-aebb-eb1d38bf693e\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-7fkfg" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.056956 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/711de91c-2cc4-4161-ac30-0de8e68283d5-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs\" (UID: \"711de91c-2cc4-4161-ac30-0de8e68283d5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.057043 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd\" (UID: \"80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.057074 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh7bf\" (UniqueName: \"kubernetes.io/projected/bbafdc6e-b606-4274-aebb-eb1d38bf693e-kube-api-access-bh7bf\") pod \"obo-prometheus-operator-7c8cf85677-7fkfg\" (UID: \"bbafdc6e-b606-4274-aebb-eb1d38bf693e\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-7fkfg" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 
21:59:22.057122 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/711de91c-2cc4-4161-ac30-0de8e68283d5-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs\" (UID: \"711de91c-2cc4-4161-ac30-0de8e68283d5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.057245 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd\" (UID: \"80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.084643 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh7bf\" (UniqueName: \"kubernetes.io/projected/bbafdc6e-b606-4274-aebb-eb1d38bf693e-kube-api-access-bh7bf\") pod \"obo-prometheus-operator-7c8cf85677-7fkfg\" (UID: \"bbafdc6e-b606-4274-aebb-eb1d38bf693e\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-7fkfg" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.099015 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-6kpbp"] Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.100544 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-6kpbp" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.103803 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-s6dtp" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.104169 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.157880 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd\" (UID: \"80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.157936 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/711de91c-2cc4-4161-ac30-0de8e68283d5-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs\" (UID: \"711de91c-2cc4-4161-ac30-0de8e68283d5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.158013 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd\" (UID: \"80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.158052 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" 
(UniqueName: \"kubernetes.io/secret/711de91c-2cc4-4161-ac30-0de8e68283d5-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs\" (UID: \"711de91c-2cc4-4161-ac30-0de8e68283d5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.165499 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd\" (UID: \"80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.165530 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/711de91c-2cc4-4161-ac30-0de8e68283d5-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs\" (UID: \"711de91c-2cc4-4161-ac30-0de8e68283d5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.165906 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/711de91c-2cc4-4161-ac30-0de8e68283d5-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs\" (UID: \"711de91c-2cc4-4161-ac30-0de8e68283d5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.168577 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd\" (UID: \"80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.195373 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" event={"ID":"cebaab22-7e46-4c0c-be86-4f6d53ee35b1","Type":"ContainerStarted","Data":"f05a2243c499b83f668680e24212c06a77758d10f6666ee567a79f7d3f633875"} Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.198549 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.207557 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.240587 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-bpxcp"] Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.241679 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-bpxcp" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.247484 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-lskfx" Oct 08 21:59:22 crc kubenswrapper[4739]: E1008 21:59:22.252765 4739 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs_openshift-operators_711de91c-2cc4-4161-ac30-0de8e68283d5_0(70f360d1afc5789cbbbea45368b04534e8ff17b48d5ea2961abeee6b5c121839): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 08 21:59:22 crc kubenswrapper[4739]: E1008 21:59:22.252820 4739 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs_openshift-operators_711de91c-2cc4-4161-ac30-0de8e68283d5_0(70f360d1afc5789cbbbea45368b04534e8ff17b48d5ea2961abeee6b5c121839): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs" Oct 08 21:59:22 crc kubenswrapper[4739]: E1008 21:59:22.252842 4739 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs_openshift-operators_711de91c-2cc4-4161-ac30-0de8e68283d5_0(70f360d1afc5789cbbbea45368b04534e8ff17b48d5ea2961abeee6b5c121839): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs" Oct 08 21:59:22 crc kubenswrapper[4739]: E1008 21:59:22.252883 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs_openshift-operators(711de91c-2cc4-4161-ac30-0de8e68283d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs_openshift-operators(711de91c-2cc4-4161-ac30-0de8e68283d5)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs_openshift-operators_711de91c-2cc4-4161-ac30-0de8e68283d5_0(70f360d1afc5789cbbbea45368b04534e8ff17b48d5ea2961abeee6b5c121839): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs" podUID="711de91c-2cc4-4161-ac30-0de8e68283d5" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.258963 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c13b744-b744-49f3-8ba5-241ab69fdab9-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-6kpbp\" (UID: \"4c13b744-b744-49f3-8ba5-241ab69fdab9\") " pod="openshift-operators/observability-operator-cc5f78dfc-6kpbp" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.259017 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6zxk\" (UniqueName: \"kubernetes.io/projected/4c13b744-b744-49f3-8ba5-241ab69fdab9-kube-api-access-z6zxk\") pod \"observability-operator-cc5f78dfc-6kpbp\" (UID: \"4c13b744-b744-49f3-8ba5-241ab69fdab9\") " pod="openshift-operators/observability-operator-cc5f78dfc-6kpbp" Oct 08 21:59:22 crc kubenswrapper[4739]: E1008 21:59:22.261027 4739 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd_openshift-operators_80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2_0(b5e99be0e8d164165b5ffd5afe6b2f5ea611df2b2090e5705745fc74e13bdb9d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 08 21:59:22 crc kubenswrapper[4739]: E1008 21:59:22.261085 4739 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd_openshift-operators_80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2_0(b5e99be0e8d164165b5ffd5afe6b2f5ea611df2b2090e5705745fc74e13bdb9d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd" Oct 08 21:59:22 crc kubenswrapper[4739]: E1008 21:59:22.261107 4739 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd_openshift-operators_80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2_0(b5e99be0e8d164165b5ffd5afe6b2f5ea611df2b2090e5705745fc74e13bdb9d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd" Oct 08 21:59:22 crc kubenswrapper[4739]: E1008 21:59:22.261167 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd_openshift-operators(80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd_openshift-operators(80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd_openshift-operators_80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2_0(b5e99be0e8d164165b5ffd5afe6b2f5ea611df2b2090e5705745fc74e13bdb9d): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd" podUID="80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.359919 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6khc\" (UniqueName: \"kubernetes.io/projected/5c19c761-fcc4-474d-9d87-7c2e07755190-kube-api-access-p6khc\") pod \"perses-operator-54bc95c9fb-bpxcp\" (UID: \"5c19c761-fcc4-474d-9d87-7c2e07755190\") " pod="openshift-operators/perses-operator-54bc95c9fb-bpxcp" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.359998 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/5c19c761-fcc4-474d-9d87-7c2e07755190-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-bpxcp\" (UID: \"5c19c761-fcc4-474d-9d87-7c2e07755190\") " pod="openshift-operators/perses-operator-54bc95c9fb-bpxcp" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.360036 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c13b744-b744-49f3-8ba5-241ab69fdab9-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-6kpbp\" (UID: \"4c13b744-b744-49f3-8ba5-241ab69fdab9\") " pod="openshift-operators/observability-operator-cc5f78dfc-6kpbp" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.360077 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6zxk\" (UniqueName: \"kubernetes.io/projected/4c13b744-b744-49f3-8ba5-241ab69fdab9-kube-api-access-z6zxk\") pod \"observability-operator-cc5f78dfc-6kpbp\" (UID: \"4c13b744-b744-49f3-8ba5-241ab69fdab9\") " pod="openshift-operators/observability-operator-cc5f78dfc-6kpbp" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.363993 4739 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c13b744-b744-49f3-8ba5-241ab69fdab9-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-6kpbp\" (UID: \"4c13b744-b744-49f3-8ba5-241ab69fdab9\") " pod="openshift-operators/observability-operator-cc5f78dfc-6kpbp" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.379969 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6zxk\" (UniqueName: \"kubernetes.io/projected/4c13b744-b744-49f3-8ba5-241ab69fdab9-kube-api-access-z6zxk\") pod \"observability-operator-cc5f78dfc-6kpbp\" (UID: \"4c13b744-b744-49f3-8ba5-241ab69fdab9\") " pod="openshift-operators/observability-operator-cc5f78dfc-6kpbp" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.381475 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-7fkfg" Oct 08 21:59:22 crc kubenswrapper[4739]: E1008 21:59:22.404013 4739 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-7fkfg_openshift-operators_bbafdc6e-b606-4274-aebb-eb1d38bf693e_0(1394c35485f4fcb24b445992f2b6ebc2b23e3ce6f7afa85dfc8d2e0f81c50377): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 08 21:59:22 crc kubenswrapper[4739]: E1008 21:59:22.404078 4739 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-7fkfg_openshift-operators_bbafdc6e-b606-4274-aebb-eb1d38bf693e_0(1394c35485f4fcb24b445992f2b6ebc2b23e3ce6f7afa85dfc8d2e0f81c50377): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-7c8cf85677-7fkfg" Oct 08 21:59:22 crc kubenswrapper[4739]: E1008 21:59:22.404102 4739 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-7fkfg_openshift-operators_bbafdc6e-b606-4274-aebb-eb1d38bf693e_0(1394c35485f4fcb24b445992f2b6ebc2b23e3ce6f7afa85dfc8d2e0f81c50377): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-7fkfg" Oct 08 21:59:22 crc kubenswrapper[4739]: E1008 21:59:22.404160 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-7c8cf85677-7fkfg_openshift-operators(bbafdc6e-b606-4274-aebb-eb1d38bf693e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-7c8cf85677-7fkfg_openshift-operators(bbafdc6e-b606-4274-aebb-eb1d38bf693e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-7fkfg_openshift-operators_bbafdc6e-b606-4274-aebb-eb1d38bf693e_0(1394c35485f4fcb24b445992f2b6ebc2b23e3ce6f7afa85dfc8d2e0f81c50377): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-7fkfg" podUID="bbafdc6e-b606-4274-aebb-eb1d38bf693e" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.425012 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-6kpbp" Oct 08 21:59:22 crc kubenswrapper[4739]: E1008 21:59:22.441286 4739 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-6kpbp_openshift-operators_4c13b744-b744-49f3-8ba5-241ab69fdab9_0(76f198b3c46d7361cde5f4b0c522e01a00384007f7c1b5da14bacd0ed9d357ee): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 08 21:59:22 crc kubenswrapper[4739]: E1008 21:59:22.441353 4739 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-6kpbp_openshift-operators_4c13b744-b744-49f3-8ba5-241ab69fdab9_0(76f198b3c46d7361cde5f4b0c522e01a00384007f7c1b5da14bacd0ed9d357ee): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-cc5f78dfc-6kpbp" Oct 08 21:59:22 crc kubenswrapper[4739]: E1008 21:59:22.441381 4739 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-6kpbp_openshift-operators_4c13b744-b744-49f3-8ba5-241ab69fdab9_0(76f198b3c46d7361cde5f4b0c522e01a00384007f7c1b5da14bacd0ed9d357ee): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-cc5f78dfc-6kpbp" Oct 08 21:59:22 crc kubenswrapper[4739]: E1008 21:59:22.441425 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-cc5f78dfc-6kpbp_openshift-operators(4c13b744-b744-49f3-8ba5-241ab69fdab9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-cc5f78dfc-6kpbp_openshift-operators(4c13b744-b744-49f3-8ba5-241ab69fdab9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-6kpbp_openshift-operators_4c13b744-b744-49f3-8ba5-241ab69fdab9_0(76f198b3c46d7361cde5f4b0c522e01a00384007f7c1b5da14bacd0ed9d357ee): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-cc5f78dfc-6kpbp" podUID="4c13b744-b744-49f3-8ba5-241ab69fdab9" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.461533 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6khc\" (UniqueName: \"kubernetes.io/projected/5c19c761-fcc4-474d-9d87-7c2e07755190-kube-api-access-p6khc\") pod \"perses-operator-54bc95c9fb-bpxcp\" (UID: \"5c19c761-fcc4-474d-9d87-7c2e07755190\") " pod="openshift-operators/perses-operator-54bc95c9fb-bpxcp" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.461618 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/5c19c761-fcc4-474d-9d87-7c2e07755190-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-bpxcp\" (UID: \"5c19c761-fcc4-474d-9d87-7c2e07755190\") " pod="openshift-operators/perses-operator-54bc95c9fb-bpxcp" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.462499 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/5c19c761-fcc4-474d-9d87-7c2e07755190-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-bpxcp\" (UID: \"5c19c761-fcc4-474d-9d87-7c2e07755190\") " pod="openshift-operators/perses-operator-54bc95c9fb-bpxcp" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.489759 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6khc\" (UniqueName: \"kubernetes.io/projected/5c19c761-fcc4-474d-9d87-7c2e07755190-kube-api-access-p6khc\") pod \"perses-operator-54bc95c9fb-bpxcp\" (UID: \"5c19c761-fcc4-474d-9d87-7c2e07755190\") " pod="openshift-operators/perses-operator-54bc95c9fb-bpxcp" Oct 08 21:59:22 crc kubenswrapper[4739]: I1008 21:59:22.572232 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-bpxcp" Oct 08 21:59:22 crc kubenswrapper[4739]: E1008 21:59:22.592581 4739 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-bpxcp_openshift-operators_5c19c761-fcc4-474d-9d87-7c2e07755190_0(1b01974855f69d4e09cdb197c2263fa9f4b990065b46e56d42cd97e1f5a810e3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 08 21:59:22 crc kubenswrapper[4739]: E1008 21:59:22.592711 4739 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-bpxcp_openshift-operators_5c19c761-fcc4-474d-9d87-7c2e07755190_0(1b01974855f69d4e09cdb197c2263fa9f4b990065b46e56d42cd97e1f5a810e3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-54bc95c9fb-bpxcp" Oct 08 21:59:22 crc kubenswrapper[4739]: E1008 21:59:22.592777 4739 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-bpxcp_openshift-operators_5c19c761-fcc4-474d-9d87-7c2e07755190_0(1b01974855f69d4e09cdb197c2263fa9f4b990065b46e56d42cd97e1f5a810e3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-54bc95c9fb-bpxcp" Oct 08 21:59:22 crc kubenswrapper[4739]: E1008 21:59:22.592867 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-54bc95c9fb-bpxcp_openshift-operators(5c19c761-fcc4-474d-9d87-7c2e07755190)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-54bc95c9fb-bpxcp_openshift-operators(5c19c761-fcc4-474d-9d87-7c2e07755190)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-bpxcp_openshift-operators_5c19c761-fcc4-474d-9d87-7c2e07755190_0(1b01974855f69d4e09cdb197c2263fa9f4b990065b46e56d42cd97e1f5a810e3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-54bc95c9fb-bpxcp" podUID="5c19c761-fcc4-474d-9d87-7c2e07755190" Oct 08 21:59:24 crc kubenswrapper[4739]: I1008 21:59:24.208134 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-bpxcp"] Oct 08 21:59:24 crc kubenswrapper[4739]: I1008 21:59:24.208234 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-bpxcp" Oct 08 21:59:24 crc kubenswrapper[4739]: I1008 21:59:24.208605 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-bpxcp" Oct 08 21:59:24 crc kubenswrapper[4739]: I1008 21:59:24.210928 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" event={"ID":"cebaab22-7e46-4c0c-be86-4f6d53ee35b1","Type":"ContainerStarted","Data":"aa8f7ba3a29dcca25f1ffd72fbd9a1212ca9cfad67ab28e0c3dcaf1a88b52ea9"} Oct 08 21:59:24 crc kubenswrapper[4739]: I1008 21:59:24.211205 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:24 crc kubenswrapper[4739]: I1008 21:59:24.211307 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:24 crc kubenswrapper[4739]: I1008 21:59:24.211344 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:24 crc kubenswrapper[4739]: I1008 21:59:24.216181 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs"] Oct 08 21:59:24 crc kubenswrapper[4739]: I1008 21:59:24.216269 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs" Oct 08 21:59:24 crc kubenswrapper[4739]: I1008 21:59:24.216606 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs" Oct 08 21:59:24 crc kubenswrapper[4739]: I1008 21:59:24.223501 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-7fkfg"] Oct 08 21:59:24 crc kubenswrapper[4739]: I1008 21:59:24.223732 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-7fkfg" Oct 08 21:59:24 crc kubenswrapper[4739]: I1008 21:59:24.224228 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-7fkfg" Oct 08 21:59:24 crc kubenswrapper[4739]: E1008 21:59:24.240107 4739 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-bpxcp_openshift-operators_5c19c761-fcc4-474d-9d87-7c2e07755190_0(663f78bdc8fdc48e1069333b77183bfb2f9fa1b0964eb9f201c407503d153387): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 08 21:59:24 crc kubenswrapper[4739]: E1008 21:59:24.240515 4739 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-bpxcp_openshift-operators_5c19c761-fcc4-474d-9d87-7c2e07755190_0(663f78bdc8fdc48e1069333b77183bfb2f9fa1b0964eb9f201c407503d153387): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-54bc95c9fb-bpxcp" Oct 08 21:59:24 crc kubenswrapper[4739]: E1008 21:59:24.240587 4739 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-bpxcp_openshift-operators_5c19c761-fcc4-474d-9d87-7c2e07755190_0(663f78bdc8fdc48e1069333b77183bfb2f9fa1b0964eb9f201c407503d153387): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-54bc95c9fb-bpxcp" Oct 08 21:59:24 crc kubenswrapper[4739]: E1008 21:59:24.240692 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-54bc95c9fb-bpxcp_openshift-operators(5c19c761-fcc4-474d-9d87-7c2e07755190)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-54bc95c9fb-bpxcp_openshift-operators(5c19c761-fcc4-474d-9d87-7c2e07755190)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-bpxcp_openshift-operators_5c19c761-fcc4-474d-9d87-7c2e07755190_0(663f78bdc8fdc48e1069333b77183bfb2f9fa1b0964eb9f201c407503d153387): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-54bc95c9fb-bpxcp" podUID="5c19c761-fcc4-474d-9d87-7c2e07755190" Oct 08 21:59:24 crc kubenswrapper[4739]: I1008 21:59:24.246330 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:24 crc kubenswrapper[4739]: I1008 21:59:24.247768 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-6kpbp"] Oct 08 21:59:24 crc kubenswrapper[4739]: I1008 21:59:24.247918 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-6kpbp" Oct 08 21:59:24 crc kubenswrapper[4739]: I1008 21:59:24.248338 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-6kpbp" Oct 08 21:59:24 crc kubenswrapper[4739]: I1008 21:59:24.252229 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" podStartSLOduration=7.252198182 podStartE2EDuration="7.252198182s" podCreationTimestamp="2025-10-08 21:59:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 21:59:24.245002885 +0000 UTC m=+664.070388655" watchObservedRunningTime="2025-10-08 21:59:24.252198182 +0000 UTC m=+664.077583932" Oct 08 21:59:24 crc kubenswrapper[4739]: I1008 21:59:24.268435 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:24 crc kubenswrapper[4739]: I1008 21:59:24.272175 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd"] Oct 08 21:59:24 crc kubenswrapper[4739]: I1008 21:59:24.272304 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd" Oct 08 21:59:24 crc kubenswrapper[4739]: I1008 21:59:24.272716 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd" Oct 08 21:59:24 crc kubenswrapper[4739]: E1008 21:59:24.276346 4739 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs_openshift-operators_711de91c-2cc4-4161-ac30-0de8e68283d5_0(7058ec30fe9a05f503024db2f59f3831f6ca07ccb57cb4dfed0ddde669fcf271): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 08 21:59:24 crc kubenswrapper[4739]: E1008 21:59:24.276429 4739 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs_openshift-operators_711de91c-2cc4-4161-ac30-0de8e68283d5_0(7058ec30fe9a05f503024db2f59f3831f6ca07ccb57cb4dfed0ddde669fcf271): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs" Oct 08 21:59:24 crc kubenswrapper[4739]: E1008 21:59:24.276458 4739 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs_openshift-operators_711de91c-2cc4-4161-ac30-0de8e68283d5_0(7058ec30fe9a05f503024db2f59f3831f6ca07ccb57cb4dfed0ddde669fcf271): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs" Oct 08 21:59:24 crc kubenswrapper[4739]: E1008 21:59:24.276503 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs_openshift-operators(711de91c-2cc4-4161-ac30-0de8e68283d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs_openshift-operators(711de91c-2cc4-4161-ac30-0de8e68283d5)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs_openshift-operators_711de91c-2cc4-4161-ac30-0de8e68283d5_0(7058ec30fe9a05f503024db2f59f3831f6ca07ccb57cb4dfed0ddde669fcf271): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs" podUID="711de91c-2cc4-4161-ac30-0de8e68283d5" Oct 08 21:59:24 crc kubenswrapper[4739]: E1008 21:59:24.282333 4739 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-7fkfg_openshift-operators_bbafdc6e-b606-4274-aebb-eb1d38bf693e_0(0f6cfa2986a6881ecf40010afda485fc717b9a41e05020348a4aafc1a82674e8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 08 21:59:24 crc kubenswrapper[4739]: E1008 21:59:24.282483 4739 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-7fkfg_openshift-operators_bbafdc6e-b606-4274-aebb-eb1d38bf693e_0(0f6cfa2986a6881ecf40010afda485fc717b9a41e05020348a4aafc1a82674e8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-7fkfg" Oct 08 21:59:24 crc kubenswrapper[4739]: E1008 21:59:24.282546 4739 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-7fkfg_openshift-operators_bbafdc6e-b606-4274-aebb-eb1d38bf693e_0(0f6cfa2986a6881ecf40010afda485fc717b9a41e05020348a4aafc1a82674e8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-7c8cf85677-7fkfg" Oct 08 21:59:24 crc kubenswrapper[4739]: E1008 21:59:24.282630 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-7c8cf85677-7fkfg_openshift-operators(bbafdc6e-b606-4274-aebb-eb1d38bf693e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-7c8cf85677-7fkfg_openshift-operators(bbafdc6e-b606-4274-aebb-eb1d38bf693e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-7fkfg_openshift-operators_bbafdc6e-b606-4274-aebb-eb1d38bf693e_0(0f6cfa2986a6881ecf40010afda485fc717b9a41e05020348a4aafc1a82674e8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-7fkfg" podUID="bbafdc6e-b606-4274-aebb-eb1d38bf693e" Oct 08 21:59:24 crc kubenswrapper[4739]: E1008 21:59:24.306769 4739 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-6kpbp_openshift-operators_4c13b744-b744-49f3-8ba5-241ab69fdab9_0(414fb152f74c39fe3b27682b6731b9a865142dbfd46ef5d14a4de74a79e219dd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 08 21:59:24 crc kubenswrapper[4739]: E1008 21:59:24.306838 4739 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-6kpbp_openshift-operators_4c13b744-b744-49f3-8ba5-241ab69fdab9_0(414fb152f74c39fe3b27682b6731b9a865142dbfd46ef5d14a4de74a79e219dd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-cc5f78dfc-6kpbp" Oct 08 21:59:24 crc kubenswrapper[4739]: E1008 21:59:24.306865 4739 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-6kpbp_openshift-operators_4c13b744-b744-49f3-8ba5-241ab69fdab9_0(414fb152f74c39fe3b27682b6731b9a865142dbfd46ef5d14a4de74a79e219dd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-cc5f78dfc-6kpbp" Oct 08 21:59:24 crc kubenswrapper[4739]: E1008 21:59:24.306919 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-cc5f78dfc-6kpbp_openshift-operators(4c13b744-b744-49f3-8ba5-241ab69fdab9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-cc5f78dfc-6kpbp_openshift-operators(4c13b744-b744-49f3-8ba5-241ab69fdab9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-6kpbp_openshift-operators_4c13b744-b744-49f3-8ba5-241ab69fdab9_0(414fb152f74c39fe3b27682b6731b9a865142dbfd46ef5d14a4de74a79e219dd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-cc5f78dfc-6kpbp" podUID="4c13b744-b744-49f3-8ba5-241ab69fdab9" Oct 08 21:59:24 crc kubenswrapper[4739]: E1008 21:59:24.317243 4739 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd_openshift-operators_80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2_0(dbd58ac281c00e396e96b3f2bd630c7582e7f21bd86994f2d05e70e55d98efe2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 08 21:59:24 crc kubenswrapper[4739]: E1008 21:59:24.317317 4739 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd_openshift-operators_80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2_0(dbd58ac281c00e396e96b3f2bd630c7582e7f21bd86994f2d05e70e55d98efe2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd" Oct 08 21:59:24 crc kubenswrapper[4739]: E1008 21:59:24.317336 4739 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd_openshift-operators_80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2_0(dbd58ac281c00e396e96b3f2bd630c7582e7f21bd86994f2d05e70e55d98efe2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd" Oct 08 21:59:24 crc kubenswrapper[4739]: E1008 21:59:24.317373 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd_openshift-operators(80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd_openshift-operators(80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd_openshift-operators_80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2_0(dbd58ac281c00e396e96b3f2bd630c7582e7f21bd86994f2d05e70e55d98efe2): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd" podUID="80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2" Oct 08 21:59:29 crc kubenswrapper[4739]: I1008 21:59:29.821894 4739 scope.go:117] "RemoveContainer" containerID="93b79eb889387eed738d5f03a13377c9974599710eef3592e8e0024458f11d88" Oct 08 21:59:29 crc kubenswrapper[4739]: E1008 21:59:29.822442 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-wwt88_openshift-multus(17ed1d5a-5f21-4dcf-bdb9-09e715f57027)\"" pod="openshift-multus/multus-wwt88" podUID="17ed1d5a-5f21-4dcf-bdb9-09e715f57027" Oct 08 21:59:35 crc kubenswrapper[4739]: I1008 21:59:35.821548 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-bpxcp" Oct 08 21:59:35 crc kubenswrapper[4739]: I1008 21:59:35.822405 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-bpxcp" Oct 08 21:59:35 crc kubenswrapper[4739]: E1008 21:59:35.847816 4739 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-bpxcp_openshift-operators_5c19c761-fcc4-474d-9d87-7c2e07755190_0(d42c9535d1f12170421f8c4ad29abd3e8fcb8aedc994c0c92e17248dfc5f4fd5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 08 21:59:35 crc kubenswrapper[4739]: E1008 21:59:35.847897 4739 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-bpxcp_openshift-operators_5c19c761-fcc4-474d-9d87-7c2e07755190_0(d42c9535d1f12170421f8c4ad29abd3e8fcb8aedc994c0c92e17248dfc5f4fd5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-54bc95c9fb-bpxcp" Oct 08 21:59:35 crc kubenswrapper[4739]: E1008 21:59:35.847925 4739 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-bpxcp_openshift-operators_5c19c761-fcc4-474d-9d87-7c2e07755190_0(d42c9535d1f12170421f8c4ad29abd3e8fcb8aedc994c0c92e17248dfc5f4fd5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-54bc95c9fb-bpxcp" Oct 08 21:59:35 crc kubenswrapper[4739]: E1008 21:59:35.847985 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-54bc95c9fb-bpxcp_openshift-operators(5c19c761-fcc4-474d-9d87-7c2e07755190)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-54bc95c9fb-bpxcp_openshift-operators(5c19c761-fcc4-474d-9d87-7c2e07755190)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-bpxcp_openshift-operators_5c19c761-fcc4-474d-9d87-7c2e07755190_0(d42c9535d1f12170421f8c4ad29abd3e8fcb8aedc994c0c92e17248dfc5f4fd5): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-54bc95c9fb-bpxcp" podUID="5c19c761-fcc4-474d-9d87-7c2e07755190" Oct 08 21:59:37 crc kubenswrapper[4739]: I1008 21:59:37.820797 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs" Oct 08 21:59:37 crc kubenswrapper[4739]: I1008 21:59:37.821422 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs" Oct 08 21:59:37 crc kubenswrapper[4739]: I1008 21:59:37.821980 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-7fkfg" Oct 08 21:59:37 crc kubenswrapper[4739]: I1008 21:59:37.822343 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-7fkfg" Oct 08 21:59:37 crc kubenswrapper[4739]: I1008 21:59:37.822581 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd" Oct 08 21:59:37 crc kubenswrapper[4739]: I1008 21:59:37.822787 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd" Oct 08 21:59:37 crc kubenswrapper[4739]: E1008 21:59:37.859075 4739 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs_openshift-operators_711de91c-2cc4-4161-ac30-0de8e68283d5_0(e6daadd14d670b2419342db439f0c7f9cc3e5671285f00499b7b3d6b54a177a6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 08 21:59:37 crc kubenswrapper[4739]: E1008 21:59:37.859135 4739 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs_openshift-operators_711de91c-2cc4-4161-ac30-0de8e68283d5_0(e6daadd14d670b2419342db439f0c7f9cc3e5671285f00499b7b3d6b54a177a6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs" Oct 08 21:59:37 crc kubenswrapper[4739]: E1008 21:59:37.859171 4739 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs_openshift-operators_711de91c-2cc4-4161-ac30-0de8e68283d5_0(e6daadd14d670b2419342db439f0c7f9cc3e5671285f00499b7b3d6b54a177a6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs" Oct 08 21:59:37 crc kubenswrapper[4739]: E1008 21:59:37.859226 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs_openshift-operators(711de91c-2cc4-4161-ac30-0de8e68283d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs_openshift-operators(711de91c-2cc4-4161-ac30-0de8e68283d5)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs_openshift-operators_711de91c-2cc4-4161-ac30-0de8e68283d5_0(e6daadd14d670b2419342db439f0c7f9cc3e5671285f00499b7b3d6b54a177a6): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs" podUID="711de91c-2cc4-4161-ac30-0de8e68283d5" Oct 08 21:59:37 crc kubenswrapper[4739]: E1008 21:59:37.885824 4739 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd_openshift-operators_80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2_0(cd2e5989da4e8cd832d91acecd3ea8efcceddfdae1b3619e7daf39cdb542f476): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 08 21:59:37 crc kubenswrapper[4739]: E1008 21:59:37.885899 4739 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd_openshift-operators_80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2_0(cd2e5989da4e8cd832d91acecd3ea8efcceddfdae1b3619e7daf39cdb542f476): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd" Oct 08 21:59:37 crc kubenswrapper[4739]: E1008 21:59:37.885925 4739 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd_openshift-operators_80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2_0(cd2e5989da4e8cd832d91acecd3ea8efcceddfdae1b3619e7daf39cdb542f476): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd" Oct 08 21:59:37 crc kubenswrapper[4739]: E1008 21:59:37.885976 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd_openshift-operators(80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd_openshift-operators(80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd_openshift-operators_80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2_0(cd2e5989da4e8cd832d91acecd3ea8efcceddfdae1b3619e7daf39cdb542f476): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd" podUID="80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2" Oct 08 21:59:37 crc kubenswrapper[4739]: E1008 21:59:37.891898 4739 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-7fkfg_openshift-operators_bbafdc6e-b606-4274-aebb-eb1d38bf693e_0(088c5995620bfe15da3e70efe6a4204a8b9b746092dc46537159289680df6b36): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 08 21:59:37 crc kubenswrapper[4739]: E1008 21:59:37.891959 4739 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-7fkfg_openshift-operators_bbafdc6e-b606-4274-aebb-eb1d38bf693e_0(088c5995620bfe15da3e70efe6a4204a8b9b746092dc46537159289680df6b36): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-7c8cf85677-7fkfg" Oct 08 21:59:37 crc kubenswrapper[4739]: E1008 21:59:37.891985 4739 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-7fkfg_openshift-operators_bbafdc6e-b606-4274-aebb-eb1d38bf693e_0(088c5995620bfe15da3e70efe6a4204a8b9b746092dc46537159289680df6b36): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-7fkfg" Oct 08 21:59:37 crc kubenswrapper[4739]: E1008 21:59:37.892038 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-7c8cf85677-7fkfg_openshift-operators(bbafdc6e-b606-4274-aebb-eb1d38bf693e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-7c8cf85677-7fkfg_openshift-operators(bbafdc6e-b606-4274-aebb-eb1d38bf693e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-7fkfg_openshift-operators_bbafdc6e-b606-4274-aebb-eb1d38bf693e_0(088c5995620bfe15da3e70efe6a4204a8b9b746092dc46537159289680df6b36): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-7fkfg" podUID="bbafdc6e-b606-4274-aebb-eb1d38bf693e" Oct 08 21:59:38 crc kubenswrapper[4739]: I1008 21:59:38.821657 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-6kpbp" Oct 08 21:59:38 crc kubenswrapper[4739]: I1008 21:59:38.822179 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-6kpbp" Oct 08 21:59:38 crc kubenswrapper[4739]: E1008 21:59:38.859781 4739 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-6kpbp_openshift-operators_4c13b744-b744-49f3-8ba5-241ab69fdab9_0(d82f204bcbe6baf1da408c70af3c3bea823926a7fdca7c92006211916f1ecea3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 08 21:59:38 crc kubenswrapper[4739]: E1008 21:59:38.859853 4739 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-6kpbp_openshift-operators_4c13b744-b744-49f3-8ba5-241ab69fdab9_0(d82f204bcbe6baf1da408c70af3c3bea823926a7fdca7c92006211916f1ecea3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-cc5f78dfc-6kpbp" Oct 08 21:59:38 crc kubenswrapper[4739]: E1008 21:59:38.859882 4739 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-6kpbp_openshift-operators_4c13b744-b744-49f3-8ba5-241ab69fdab9_0(d82f204bcbe6baf1da408c70af3c3bea823926a7fdca7c92006211916f1ecea3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-cc5f78dfc-6kpbp" Oct 08 21:59:38 crc kubenswrapper[4739]: E1008 21:59:38.859937 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-cc5f78dfc-6kpbp_openshift-operators(4c13b744-b744-49f3-8ba5-241ab69fdab9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-cc5f78dfc-6kpbp_openshift-operators(4c13b744-b744-49f3-8ba5-241ab69fdab9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-6kpbp_openshift-operators_4c13b744-b744-49f3-8ba5-241ab69fdab9_0(d82f204bcbe6baf1da408c70af3c3bea823926a7fdca7c92006211916f1ecea3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-cc5f78dfc-6kpbp" podUID="4c13b744-b744-49f3-8ba5-241ab69fdab9" Oct 08 21:59:43 crc kubenswrapper[4739]: I1008 21:59:43.821592 4739 scope.go:117] "RemoveContainer" containerID="93b79eb889387eed738d5f03a13377c9974599710eef3592e8e0024458f11d88" Oct 08 21:59:44 crc kubenswrapper[4739]: I1008 21:59:44.313539 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wwt88_17ed1d5a-5f21-4dcf-bdb9-09e715f57027/kube-multus/2.log" Oct 08 21:59:44 crc kubenswrapper[4739]: I1008 21:59:44.313941 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wwt88" event={"ID":"17ed1d5a-5f21-4dcf-bdb9-09e715f57027","Type":"ContainerStarted","Data":"1e8b8244e3653ebd81076d478236882c3d90c1b4cad654d2eed5673f0cf7b1a1"} Oct 08 21:59:47 crc kubenswrapper[4739]: I1008 21:59:47.834433 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z9qjt" Oct 08 21:59:49 crc kubenswrapper[4739]: I1008 21:59:49.822054 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-7fkfg" Oct 08 21:59:49 crc kubenswrapper[4739]: I1008 21:59:49.822623 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-7fkfg" Oct 08 21:59:50 crc kubenswrapper[4739]: I1008 21:59:50.253194 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-7fkfg"] Oct 08 21:59:50 crc kubenswrapper[4739]: W1008 21:59:50.260188 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbafdc6e_b606_4274_aebb_eb1d38bf693e.slice/crio-736109414a0083adeef8caf99509236be94bfeff55092c4f547f40ea5cb40a6b WatchSource:0}: Error finding container 736109414a0083adeef8caf99509236be94bfeff55092c4f547f40ea5cb40a6b: Status 404 returned error can't find the container with id 736109414a0083adeef8caf99509236be94bfeff55092c4f547f40ea5cb40a6b Oct 08 21:59:50 crc kubenswrapper[4739]: I1008 21:59:50.342892 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-7fkfg" event={"ID":"bbafdc6e-b606-4274-aebb-eb1d38bf693e","Type":"ContainerStarted","Data":"736109414a0083adeef8caf99509236be94bfeff55092c4f547f40ea5cb40a6b"} Oct 08 21:59:50 crc kubenswrapper[4739]: I1008 21:59:50.821622 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-bpxcp" Oct 08 21:59:50 crc kubenswrapper[4739]: I1008 21:59:50.821642 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs" Oct 08 21:59:50 crc kubenswrapper[4739]: I1008 21:59:50.822254 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-bpxcp" Oct 08 21:59:50 crc kubenswrapper[4739]: I1008 21:59:50.822510 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs" Oct 08 21:59:50 crc kubenswrapper[4739]: I1008 21:59:50.997915 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs"] Oct 08 21:59:51 crc kubenswrapper[4739]: W1008 21:59:51.013555 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod711de91c_2cc4_4161_ac30_0de8e68283d5.slice/crio-1b691bbecff28e5a5c0cf48ae58cb0a535cd03d9f1b4cc43c140d76a2a5f2125 WatchSource:0}: Error finding container 1b691bbecff28e5a5c0cf48ae58cb0a535cd03d9f1b4cc43c140d76a2a5f2125: Status 404 returned error can't find the container with id 1b691bbecff28e5a5c0cf48ae58cb0a535cd03d9f1b4cc43c140d76a2a5f2125 Oct 08 21:59:51 crc kubenswrapper[4739]: I1008 21:59:51.074371 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-bpxcp"] Oct 08 21:59:51 crc kubenswrapper[4739]: W1008 21:59:51.081005 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c19c761_fcc4_474d_9d87_7c2e07755190.slice/crio-961880a964bf1a52dac6fada624b48d4f7398e70785ad1785c0ce94d77ddba8f WatchSource:0}: Error finding container 961880a964bf1a52dac6fada624b48d4f7398e70785ad1785c0ce94d77ddba8f: Status 404 returned error can't find the container with id 961880a964bf1a52dac6fada624b48d4f7398e70785ad1785c0ce94d77ddba8f Oct 08 21:59:51 crc kubenswrapper[4739]: I1008 21:59:51.352742 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-bpxcp" 
event={"ID":"5c19c761-fcc4-474d-9d87-7c2e07755190","Type":"ContainerStarted","Data":"961880a964bf1a52dac6fada624b48d4f7398e70785ad1785c0ce94d77ddba8f"} Oct 08 21:59:51 crc kubenswrapper[4739]: I1008 21:59:51.354737 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs" event={"ID":"711de91c-2cc4-4161-ac30-0de8e68283d5","Type":"ContainerStarted","Data":"1b691bbecff28e5a5c0cf48ae58cb0a535cd03d9f1b4cc43c140d76a2a5f2125"} Oct 08 21:59:51 crc kubenswrapper[4739]: I1008 21:59:51.821114 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd" Oct 08 21:59:51 crc kubenswrapper[4739]: I1008 21:59:51.833302 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd" Oct 08 21:59:52 crc kubenswrapper[4739]: I1008 21:59:52.062569 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd"] Oct 08 21:59:52 crc kubenswrapper[4739]: W1008 21:59:52.069908 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80783ef0_8acc_4d99_bc23_6c6c7fbf2ee2.slice/crio-c740288c80983ed11c9757389cf2e09c3abe3c80223110d156cefb6a26c85c30 WatchSource:0}: Error finding container c740288c80983ed11c9757389cf2e09c3abe3c80223110d156cefb6a26c85c30: Status 404 returned error can't find the container with id c740288c80983ed11c9757389cf2e09c3abe3c80223110d156cefb6a26c85c30 Oct 08 21:59:52 crc kubenswrapper[4739]: I1008 21:59:52.362767 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd" 
event={"ID":"80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2","Type":"ContainerStarted","Data":"c740288c80983ed11c9757389cf2e09c3abe3c80223110d156cefb6a26c85c30"} Oct 08 21:59:52 crc kubenswrapper[4739]: I1008 21:59:52.821762 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-6kpbp" Oct 08 21:59:52 crc kubenswrapper[4739]: I1008 21:59:52.822999 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-6kpbp" Oct 08 21:59:53 crc kubenswrapper[4739]: I1008 21:59:53.098184 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-6kpbp"] Oct 08 21:59:53 crc kubenswrapper[4739]: I1008 21:59:53.373200 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-6kpbp" event={"ID":"4c13b744-b744-49f3-8ba5-241ab69fdab9","Type":"ContainerStarted","Data":"b099147e69c6bf81f4fd41545af24b7a9622e6e9fdb81125f0d4548657ca727e"} Oct 08 22:00:00 crc kubenswrapper[4739]: I1008 22:00:00.158419 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332680-8c8vl"] Oct 08 22:00:00 crc kubenswrapper[4739]: I1008 22:00:00.160036 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332680-8c8vl" Oct 08 22:00:00 crc kubenswrapper[4739]: I1008 22:00:00.163424 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332680-8c8vl"] Oct 08 22:00:00 crc kubenswrapper[4739]: I1008 22:00:00.163476 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 08 22:00:00 crc kubenswrapper[4739]: I1008 22:00:00.163522 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 08 22:00:00 crc kubenswrapper[4739]: I1008 22:00:00.267877 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svr24\" (UniqueName: \"kubernetes.io/projected/0f2991a8-b959-40eb-80d1-dc78dbe9767e-kube-api-access-svr24\") pod \"collect-profiles-29332680-8c8vl\" (UID: \"0f2991a8-b959-40eb-80d1-dc78dbe9767e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332680-8c8vl" Oct 08 22:00:00 crc kubenswrapper[4739]: I1008 22:00:00.267932 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f2991a8-b959-40eb-80d1-dc78dbe9767e-secret-volume\") pod \"collect-profiles-29332680-8c8vl\" (UID: \"0f2991a8-b959-40eb-80d1-dc78dbe9767e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332680-8c8vl" Oct 08 22:00:00 crc kubenswrapper[4739]: I1008 22:00:00.267967 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f2991a8-b959-40eb-80d1-dc78dbe9767e-config-volume\") pod \"collect-profiles-29332680-8c8vl\" (UID: \"0f2991a8-b959-40eb-80d1-dc78dbe9767e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29332680-8c8vl" Oct 08 22:00:00 crc kubenswrapper[4739]: I1008 22:00:00.368705 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svr24\" (UniqueName: \"kubernetes.io/projected/0f2991a8-b959-40eb-80d1-dc78dbe9767e-kube-api-access-svr24\") pod \"collect-profiles-29332680-8c8vl\" (UID: \"0f2991a8-b959-40eb-80d1-dc78dbe9767e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332680-8c8vl" Oct 08 22:00:00 crc kubenswrapper[4739]: I1008 22:00:00.368757 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f2991a8-b959-40eb-80d1-dc78dbe9767e-secret-volume\") pod \"collect-profiles-29332680-8c8vl\" (UID: \"0f2991a8-b959-40eb-80d1-dc78dbe9767e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332680-8c8vl" Oct 08 22:00:00 crc kubenswrapper[4739]: I1008 22:00:00.368788 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f2991a8-b959-40eb-80d1-dc78dbe9767e-config-volume\") pod \"collect-profiles-29332680-8c8vl\" (UID: \"0f2991a8-b959-40eb-80d1-dc78dbe9767e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332680-8c8vl" Oct 08 22:00:00 crc kubenswrapper[4739]: I1008 22:00:00.369580 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f2991a8-b959-40eb-80d1-dc78dbe9767e-config-volume\") pod \"collect-profiles-29332680-8c8vl\" (UID: \"0f2991a8-b959-40eb-80d1-dc78dbe9767e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332680-8c8vl" Oct 08 22:00:00 crc kubenswrapper[4739]: I1008 22:00:00.389378 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/0f2991a8-b959-40eb-80d1-dc78dbe9767e-secret-volume\") pod \"collect-profiles-29332680-8c8vl\" (UID: \"0f2991a8-b959-40eb-80d1-dc78dbe9767e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332680-8c8vl" Oct 08 22:00:00 crc kubenswrapper[4739]: I1008 22:00:00.397581 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svr24\" (UniqueName: \"kubernetes.io/projected/0f2991a8-b959-40eb-80d1-dc78dbe9767e-kube-api-access-svr24\") pod \"collect-profiles-29332680-8c8vl\" (UID: \"0f2991a8-b959-40eb-80d1-dc78dbe9767e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332680-8c8vl" Oct 08 22:00:00 crc kubenswrapper[4739]: I1008 22:00:00.478362 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332680-8c8vl" Oct 08 22:00:10 crc kubenswrapper[4739]: I1008 22:00:10.015138 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332680-8c8vl"] Oct 08 22:00:10 crc kubenswrapper[4739]: W1008 22:00:10.021853 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f2991a8_b959_40eb_80d1_dc78dbe9767e.slice/crio-0d8ddd3ff1d23c41ac0762455686b265b7f3f471de9d3cfad4c7e0db12909840 WatchSource:0}: Error finding container 0d8ddd3ff1d23c41ac0762455686b265b7f3f471de9d3cfad4c7e0db12909840: Status 404 returned error can't find the container with id 0d8ddd3ff1d23c41ac0762455686b265b7f3f471de9d3cfad4c7e0db12909840 Oct 08 22:00:10 crc kubenswrapper[4739]: I1008 22:00:10.470312 4739 generic.go:334] "Generic (PLEG): container finished" podID="0f2991a8-b959-40eb-80d1-dc78dbe9767e" containerID="c1a7959ac37280608c279723f6e84c36ad9d97c6d581dc068563647139d4a20a" exitCode=0 Oct 08 22:00:10 crc kubenswrapper[4739]: I1008 22:00:10.470414 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29332680-8c8vl" event={"ID":"0f2991a8-b959-40eb-80d1-dc78dbe9767e","Type":"ContainerDied","Data":"c1a7959ac37280608c279723f6e84c36ad9d97c6d581dc068563647139d4a20a"} Oct 08 22:00:10 crc kubenswrapper[4739]: I1008 22:00:10.470443 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332680-8c8vl" event={"ID":"0f2991a8-b959-40eb-80d1-dc78dbe9767e","Type":"ContainerStarted","Data":"0d8ddd3ff1d23c41ac0762455686b265b7f3f471de9d3cfad4c7e0db12909840"} Oct 08 22:00:10 crc kubenswrapper[4739]: I1008 22:00:10.473464 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd" event={"ID":"80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2","Type":"ContainerStarted","Data":"dbff9fe8d39d9aa96df048563e9455aab1e3e7ddaff4da86a4b1f8b6cbd7ecf2"} Oct 08 22:00:10 crc kubenswrapper[4739]: I1008 22:00:10.476354 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs" event={"ID":"711de91c-2cc4-4161-ac30-0de8e68283d5","Type":"ContainerStarted","Data":"3ab28f3b00c030aed039447d423695046f8cda879805639ddd94b3ee1951f91c"} Oct 08 22:00:10 crc kubenswrapper[4739]: I1008 22:00:10.477747 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-6kpbp" event={"ID":"4c13b744-b744-49f3-8ba5-241ab69fdab9","Type":"ContainerStarted","Data":"5a7d176b700c1f7ec7fc9e02ea71573fff71ea103201591ce9cdf8732a87f8f1"} Oct 08 22:00:10 crc kubenswrapper[4739]: I1008 22:00:10.477910 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-6kpbp" Oct 08 22:00:10 crc kubenswrapper[4739]: I1008 22:00:10.479420 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operators/observability-operator-cc5f78dfc-6kpbp" Oct 08 22:00:10 crc kubenswrapper[4739]: I1008 22:00:10.479756 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-7fkfg" event={"ID":"bbafdc6e-b606-4274-aebb-eb1d38bf693e","Type":"ContainerStarted","Data":"4a9839404bfd839a6ce8682a575ae398e09fab871ea52d5bc82a302c547b4ee8"} Oct 08 22:00:10 crc kubenswrapper[4739]: I1008 22:00:10.480822 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-bpxcp" event={"ID":"5c19c761-fcc4-474d-9d87-7c2e07755190","Type":"ContainerStarted","Data":"23cc229ad9d39c70f0f3e518fd68e33d35aeb47a31a19052b44c35c7e04f9154"} Oct 08 22:00:10 crc kubenswrapper[4739]: I1008 22:00:10.481497 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-bpxcp" Oct 08 22:00:10 crc kubenswrapper[4739]: I1008 22:00:10.502808 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-7fkfg" podStartSLOduration=29.945331262 podStartE2EDuration="49.502792149s" podCreationTimestamp="2025-10-08 21:59:21 +0000 UTC" firstStartedPulling="2025-10-08 21:59:50.262632843 +0000 UTC m=+690.088018593" lastFinishedPulling="2025-10-08 22:00:09.82009372 +0000 UTC m=+709.645479480" observedRunningTime="2025-10-08 22:00:10.502495292 +0000 UTC m=+710.327881062" watchObservedRunningTime="2025-10-08 22:00:10.502792149 +0000 UTC m=+710.328177899" Oct 08 22:00:10 crc kubenswrapper[4739]: I1008 22:00:10.526594 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-6kpbp" podStartSLOduration=31.75528543 podStartE2EDuration="48.526577939s" podCreationTimestamp="2025-10-08 21:59:22 +0000 UTC" firstStartedPulling="2025-10-08 21:59:53.112131425 +0000 UTC m=+692.937517175" lastFinishedPulling="2025-10-08 
22:00:09.883423934 +0000 UTC m=+709.708809684" observedRunningTime="2025-10-08 22:00:10.524914168 +0000 UTC m=+710.350299918" watchObservedRunningTime="2025-10-08 22:00:10.526577939 +0000 UTC m=+710.351963689" Oct 08 22:00:10 crc kubenswrapper[4739]: I1008 22:00:10.540974 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs" podStartSLOduration=30.666519989 podStartE2EDuration="49.54095543s" podCreationTimestamp="2025-10-08 21:59:21 +0000 UTC" firstStartedPulling="2025-10-08 21:59:51.018335522 +0000 UTC m=+690.843721272" lastFinishedPulling="2025-10-08 22:00:09.892770963 +0000 UTC m=+709.718156713" observedRunningTime="2025-10-08 22:00:10.540010507 +0000 UTC m=+710.365396267" watchObservedRunningTime="2025-10-08 22:00:10.54095543 +0000 UTC m=+710.366341180" Oct 08 22:00:10 crc kubenswrapper[4739]: I1008 22:00:10.595213 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-bpxcp" podStartSLOduration=29.859054224 podStartE2EDuration="48.595197982s" podCreationTimestamp="2025-10-08 21:59:22 +0000 UTC" firstStartedPulling="2025-10-08 21:59:51.085637083 +0000 UTC m=+690.911022833" lastFinishedPulling="2025-10-08 22:00:09.821780831 +0000 UTC m=+709.647166591" observedRunningTime="2025-10-08 22:00:10.593184943 +0000 UTC m=+710.418570693" watchObservedRunningTime="2025-10-08 22:00:10.595197982 +0000 UTC m=+710.420583732" Oct 08 22:00:10 crc kubenswrapper[4739]: I1008 22:00:10.595517 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd" podStartSLOduration=31.860406475 podStartE2EDuration="49.59551281s" podCreationTimestamp="2025-10-08 21:59:21 +0000 UTC" firstStartedPulling="2025-10-08 21:59:52.072829438 +0000 UTC m=+691.898215188" lastFinishedPulling="2025-10-08 22:00:09.807935753 +0000 UTC 
m=+709.633321523" observedRunningTime="2025-10-08 22:00:10.572504589 +0000 UTC m=+710.397890349" watchObservedRunningTime="2025-10-08 22:00:10.59551281 +0000 UTC m=+710.420898550" Oct 08 22:00:11 crc kubenswrapper[4739]: I1008 22:00:11.763324 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332680-8c8vl" Oct 08 22:00:11 crc kubenswrapper[4739]: I1008 22:00:11.923863 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f2991a8-b959-40eb-80d1-dc78dbe9767e-secret-volume\") pod \"0f2991a8-b959-40eb-80d1-dc78dbe9767e\" (UID: \"0f2991a8-b959-40eb-80d1-dc78dbe9767e\") " Oct 08 22:00:11 crc kubenswrapper[4739]: I1008 22:00:11.923934 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svr24\" (UniqueName: \"kubernetes.io/projected/0f2991a8-b959-40eb-80d1-dc78dbe9767e-kube-api-access-svr24\") pod \"0f2991a8-b959-40eb-80d1-dc78dbe9767e\" (UID: \"0f2991a8-b959-40eb-80d1-dc78dbe9767e\") " Oct 08 22:00:11 crc kubenswrapper[4739]: I1008 22:00:11.923978 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f2991a8-b959-40eb-80d1-dc78dbe9767e-config-volume\") pod \"0f2991a8-b959-40eb-80d1-dc78dbe9767e\" (UID: \"0f2991a8-b959-40eb-80d1-dc78dbe9767e\") " Oct 08 22:00:11 crc kubenswrapper[4739]: I1008 22:00:11.924618 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f2991a8-b959-40eb-80d1-dc78dbe9767e-config-volume" (OuterVolumeSpecName: "config-volume") pod "0f2991a8-b959-40eb-80d1-dc78dbe9767e" (UID: "0f2991a8-b959-40eb-80d1-dc78dbe9767e"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:00:11 crc kubenswrapper[4739]: I1008 22:00:11.929491 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f2991a8-b959-40eb-80d1-dc78dbe9767e-kube-api-access-svr24" (OuterVolumeSpecName: "kube-api-access-svr24") pod "0f2991a8-b959-40eb-80d1-dc78dbe9767e" (UID: "0f2991a8-b959-40eb-80d1-dc78dbe9767e"). InnerVolumeSpecName "kube-api-access-svr24". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:00:11 crc kubenswrapper[4739]: I1008 22:00:11.929615 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f2991a8-b959-40eb-80d1-dc78dbe9767e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0f2991a8-b959-40eb-80d1-dc78dbe9767e" (UID: "0f2991a8-b959-40eb-80d1-dc78dbe9767e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:00:12 crc kubenswrapper[4739]: I1008 22:00:12.025397 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svr24\" (UniqueName: \"kubernetes.io/projected/0f2991a8-b959-40eb-80d1-dc78dbe9767e-kube-api-access-svr24\") on node \"crc\" DevicePath \"\"" Oct 08 22:00:12 crc kubenswrapper[4739]: I1008 22:00:12.025663 4739 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f2991a8-b959-40eb-80d1-dc78dbe9767e-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 22:00:12 crc kubenswrapper[4739]: I1008 22:00:12.025672 4739 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f2991a8-b959-40eb-80d1-dc78dbe9767e-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 22:00:12 crc kubenswrapper[4739]: I1008 22:00:12.493096 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332680-8c8vl" 
event={"ID":"0f2991a8-b959-40eb-80d1-dc78dbe9767e","Type":"ContainerDied","Data":"0d8ddd3ff1d23c41ac0762455686b265b7f3f471de9d3cfad4c7e0db12909840"} Oct 08 22:00:12 crc kubenswrapper[4739]: I1008 22:00:12.493173 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d8ddd3ff1d23c41ac0762455686b265b7f3f471de9d3cfad4c7e0db12909840" Oct 08 22:00:12 crc kubenswrapper[4739]: I1008 22:00:12.493383 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332680-8c8vl" Oct 08 22:00:19 crc kubenswrapper[4739]: I1008 22:00:19.105963 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-qqgbk"] Oct 08 22:00:19 crc kubenswrapper[4739]: E1008 22:00:19.106484 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f2991a8-b959-40eb-80d1-dc78dbe9767e" containerName="collect-profiles" Oct 08 22:00:19 crc kubenswrapper[4739]: I1008 22:00:19.106497 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f2991a8-b959-40eb-80d1-dc78dbe9767e" containerName="collect-profiles" Oct 08 22:00:19 crc kubenswrapper[4739]: I1008 22:00:19.106600 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f2991a8-b959-40eb-80d1-dc78dbe9767e" containerName="collect-profiles" Oct 08 22:00:19 crc kubenswrapper[4739]: I1008 22:00:19.106947 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-qqgbk" Oct 08 22:00:19 crc kubenswrapper[4739]: I1008 22:00:19.110812 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-69d28"] Oct 08 22:00:19 crc kubenswrapper[4739]: I1008 22:00:19.111483 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-69d28" Oct 08 22:00:19 crc kubenswrapper[4739]: I1008 22:00:19.112003 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 08 22:00:19 crc kubenswrapper[4739]: I1008 22:00:19.112165 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 08 22:00:19 crc kubenswrapper[4739]: I1008 22:00:19.112431 4739 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-hxt85" Oct 08 22:00:19 crc kubenswrapper[4739]: I1008 22:00:19.114673 4739 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-t88pf" Oct 08 22:00:19 crc kubenswrapper[4739]: I1008 22:00:19.122818 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-qqgbk"] Oct 08 22:00:19 crc kubenswrapper[4739]: I1008 22:00:19.131483 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-69d28"] Oct 08 22:00:19 crc kubenswrapper[4739]: I1008 22:00:19.147906 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-hmtjz"] Oct 08 22:00:19 crc kubenswrapper[4739]: I1008 22:00:19.149700 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-hmtjz" Oct 08 22:00:19 crc kubenswrapper[4739]: I1008 22:00:19.153783 4739 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-kh6q7" Oct 08 22:00:19 crc kubenswrapper[4739]: I1008 22:00:19.174837 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-hmtjz"] Oct 08 22:00:19 crc kubenswrapper[4739]: I1008 22:00:19.216986 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7r64\" (UniqueName: \"kubernetes.io/projected/40bc8c1c-ef2f-4374-b80d-f402929336c3-kube-api-access-n7r64\") pod \"cert-manager-cainjector-7f985d654d-qqgbk\" (UID: \"40bc8c1c-ef2f-4374-b80d-f402929336c3\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-qqgbk" Oct 08 22:00:19 crc kubenswrapper[4739]: I1008 22:00:19.217041 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6lr6\" (UniqueName: \"kubernetes.io/projected/1b95f6e0-c1ca-4d45-82df-49d302f081ec-kube-api-access-t6lr6\") pod \"cert-manager-5b446d88c5-69d28\" (UID: \"1b95f6e0-c1ca-4d45-82df-49d302f081ec\") " pod="cert-manager/cert-manager-5b446d88c5-69d28" Oct 08 22:00:19 crc kubenswrapper[4739]: I1008 22:00:19.319226 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7r64\" (UniqueName: \"kubernetes.io/projected/40bc8c1c-ef2f-4374-b80d-f402929336c3-kube-api-access-n7r64\") pod \"cert-manager-cainjector-7f985d654d-qqgbk\" (UID: \"40bc8c1c-ef2f-4374-b80d-f402929336c3\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-qqgbk" Oct 08 22:00:19 crc kubenswrapper[4739]: I1008 22:00:19.319300 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6lr6\" (UniqueName: 
\"kubernetes.io/projected/1b95f6e0-c1ca-4d45-82df-49d302f081ec-kube-api-access-t6lr6\") pod \"cert-manager-5b446d88c5-69d28\" (UID: \"1b95f6e0-c1ca-4d45-82df-49d302f081ec\") " pod="cert-manager/cert-manager-5b446d88c5-69d28" Oct 08 22:00:19 crc kubenswrapper[4739]: I1008 22:00:19.319343 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhmjv\" (UniqueName: \"kubernetes.io/projected/551e5827-2d65-48ab-90a0-6e46341e2292-kube-api-access-lhmjv\") pod \"cert-manager-webhook-5655c58dd6-hmtjz\" (UID: \"551e5827-2d65-48ab-90a0-6e46341e2292\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-hmtjz" Oct 08 22:00:19 crc kubenswrapper[4739]: I1008 22:00:19.338835 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7r64\" (UniqueName: \"kubernetes.io/projected/40bc8c1c-ef2f-4374-b80d-f402929336c3-kube-api-access-n7r64\") pod \"cert-manager-cainjector-7f985d654d-qqgbk\" (UID: \"40bc8c1c-ef2f-4374-b80d-f402929336c3\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-qqgbk" Oct 08 22:00:19 crc kubenswrapper[4739]: I1008 22:00:19.339069 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6lr6\" (UniqueName: \"kubernetes.io/projected/1b95f6e0-c1ca-4d45-82df-49d302f081ec-kube-api-access-t6lr6\") pod \"cert-manager-5b446d88c5-69d28\" (UID: \"1b95f6e0-c1ca-4d45-82df-49d302f081ec\") " pod="cert-manager/cert-manager-5b446d88c5-69d28" Oct 08 22:00:19 crc kubenswrapper[4739]: I1008 22:00:19.420134 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhmjv\" (UniqueName: \"kubernetes.io/projected/551e5827-2d65-48ab-90a0-6e46341e2292-kube-api-access-lhmjv\") pod \"cert-manager-webhook-5655c58dd6-hmtjz\" (UID: \"551e5827-2d65-48ab-90a0-6e46341e2292\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-hmtjz" Oct 08 22:00:19 crc kubenswrapper[4739]: I1008 22:00:19.439865 4739 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhmjv\" (UniqueName: \"kubernetes.io/projected/551e5827-2d65-48ab-90a0-6e46341e2292-kube-api-access-lhmjv\") pod \"cert-manager-webhook-5655c58dd6-hmtjz\" (UID: \"551e5827-2d65-48ab-90a0-6e46341e2292\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-hmtjz" Oct 08 22:00:19 crc kubenswrapper[4739]: I1008 22:00:19.442786 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-qqgbk" Oct 08 22:00:19 crc kubenswrapper[4739]: I1008 22:00:19.457573 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-69d28" Oct 08 22:00:19 crc kubenswrapper[4739]: I1008 22:00:19.477251 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-hmtjz" Oct 08 22:00:19 crc kubenswrapper[4739]: I1008 22:00:19.714350 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-69d28"] Oct 08 22:00:19 crc kubenswrapper[4739]: I1008 22:00:19.965742 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-hmtjz"] Oct 08 22:00:19 crc kubenswrapper[4739]: W1008 22:00:19.967746 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod551e5827_2d65_48ab_90a0_6e46341e2292.slice/crio-a85a986cc27fb317ca5805428d0622ac795e9bfb8250f9b338fe0940a73a5c13 WatchSource:0}: Error finding container a85a986cc27fb317ca5805428d0622ac795e9bfb8250f9b338fe0940a73a5c13: Status 404 returned error can't find the container with id a85a986cc27fb317ca5805428d0622ac795e9bfb8250f9b338fe0940a73a5c13 Oct 08 22:00:19 crc kubenswrapper[4739]: I1008 22:00:19.969449 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-qqgbk"] Oct 
08 22:00:19 crc kubenswrapper[4739]: W1008 22:00:19.972504 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40bc8c1c_ef2f_4374_b80d_f402929336c3.slice/crio-a1736a4c2a98e7614dec0ed03cc40aa72f50302170b188dee5882e21125fbe21 WatchSource:0}: Error finding container a1736a4c2a98e7614dec0ed03cc40aa72f50302170b188dee5882e21125fbe21: Status 404 returned error can't find the container with id a1736a4c2a98e7614dec0ed03cc40aa72f50302170b188dee5882e21125fbe21 Oct 08 22:00:20 crc kubenswrapper[4739]: I1008 22:00:20.554294 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-hmtjz" event={"ID":"551e5827-2d65-48ab-90a0-6e46341e2292","Type":"ContainerStarted","Data":"a85a986cc27fb317ca5805428d0622ac795e9bfb8250f9b338fe0940a73a5c13"} Oct 08 22:00:20 crc kubenswrapper[4739]: I1008 22:00:20.555745 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-69d28" event={"ID":"1b95f6e0-c1ca-4d45-82df-49d302f081ec","Type":"ContainerStarted","Data":"dd4e2bcf462ebaebd753ad1b1f400265e5773b66928e27cf406034377aee9ca3"} Oct 08 22:00:20 crc kubenswrapper[4739]: I1008 22:00:20.557086 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-qqgbk" event={"ID":"40bc8c1c-ef2f-4374-b80d-f402929336c3","Type":"ContainerStarted","Data":"a1736a4c2a98e7614dec0ed03cc40aa72f50302170b188dee5882e21125fbe21"} Oct 08 22:00:22 crc kubenswrapper[4739]: I1008 22:00:22.575712 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-bpxcp" Oct 08 22:00:23 crc kubenswrapper[4739]: I1008 22:00:23.606851 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-69d28" 
event={"ID":"1b95f6e0-c1ca-4d45-82df-49d302f081ec","Type":"ContainerStarted","Data":"6a2c7c2b25cf3ccc6c2b2c7f18b1e14d0384ab5e48c767d51a6a5576b460c486"} Oct 08 22:00:23 crc kubenswrapper[4739]: I1008 22:00:23.608116 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-qqgbk" event={"ID":"40bc8c1c-ef2f-4374-b80d-f402929336c3","Type":"ContainerStarted","Data":"586913343b3cb73727706a623f8317554f3fd959bbec80e111debbe090d22601"} Oct 08 22:00:23 crc kubenswrapper[4739]: I1008 22:00:23.609587 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-hmtjz" event={"ID":"551e5827-2d65-48ab-90a0-6e46341e2292","Type":"ContainerStarted","Data":"a00e24b1eff85852b26465045fe33b911edee183c2f867f46db3a72764565704"} Oct 08 22:00:23 crc kubenswrapper[4739]: I1008 22:00:23.609726 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-hmtjz" Oct 08 22:00:23 crc kubenswrapper[4739]: I1008 22:00:23.669525 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-hmtjz" podStartSLOduration=1.8177260990000002 podStartE2EDuration="4.66950967s" podCreationTimestamp="2025-10-08 22:00:19 +0000 UTC" firstStartedPulling="2025-10-08 22:00:19.96940905 +0000 UTC m=+719.794794800" lastFinishedPulling="2025-10-08 22:00:22.821192621 +0000 UTC m=+722.646578371" observedRunningTime="2025-10-08 22:00:23.668638359 +0000 UTC m=+723.494024109" watchObservedRunningTime="2025-10-08 22:00:23.66950967 +0000 UTC m=+723.494895410" Oct 08 22:00:23 crc kubenswrapper[4739]: I1008 22:00:23.670741 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-69d28" podStartSLOduration=2.102574356 podStartE2EDuration="4.67073393s" podCreationTimestamp="2025-10-08 22:00:19 +0000 UTC" firstStartedPulling="2025-10-08 22:00:19.733175628 +0000 
UTC m=+719.558561378" lastFinishedPulling="2025-10-08 22:00:22.301335202 +0000 UTC m=+722.126720952" observedRunningTime="2025-10-08 22:00:23.635620004 +0000 UTC m=+723.461005754" watchObservedRunningTime="2025-10-08 22:00:23.67073393 +0000 UTC m=+723.496119680" Oct 08 22:00:23 crc kubenswrapper[4739]: I1008 22:00:23.697349 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-qqgbk" podStartSLOduration=1.789979462 podStartE2EDuration="4.697332529s" podCreationTimestamp="2025-10-08 22:00:19 +0000 UTC" firstStartedPulling="2025-10-08 22:00:19.974270118 +0000 UTC m=+719.799655868" lastFinishedPulling="2025-10-08 22:00:22.881623185 +0000 UTC m=+722.707008935" observedRunningTime="2025-10-08 22:00:23.695422803 +0000 UTC m=+723.520808553" watchObservedRunningTime="2025-10-08 22:00:23.697332529 +0000 UTC m=+723.522718279" Oct 08 22:00:29 crc kubenswrapper[4739]: I1008 22:00:29.480759 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-hmtjz" Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.191383 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dq28h"] Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.194396 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-dq28h" podUID="bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05" containerName="controller-manager" containerID="cri-o://5f0da66abcc67138ba3c7c34f75b8cd8cf8ea2170040f3f5d1607cc06db8c22a" gracePeriod=30 Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.281588 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8smnx"] Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.282403 4739 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8smnx" podUID="a2b52465-5f11-4296-87b3-9254f036358f" containerName="route-controller-manager" containerID="cri-o://85f487ef36ac6de67dc94a7fa50b74c42a000a4d6e318c4d233116d7c7c795b5" gracePeriod=30 Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.533759 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dq28h" Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.599622 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8smnx" Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.686287 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05-serving-cert\") pod \"bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05\" (UID: \"bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05\") " Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.686599 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8k4dg\" (UniqueName: \"kubernetes.io/projected/bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05-kube-api-access-8k4dg\") pod \"bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05\" (UID: \"bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05\") " Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.686619 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05-proxy-ca-bundles\") pod \"bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05\" (UID: \"bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05\") " Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.686643 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05-client-ca\") pod \"bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05\" (UID: \"bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05\") " Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.686685 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05-config\") pod \"bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05\" (UID: \"bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05\") " Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.687460 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05-client-ca" (OuterVolumeSpecName: "client-ca") pod "bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05" (UID: "bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.687509 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05-config" (OuterVolumeSpecName: "config") pod "bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05" (UID: "bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.688006 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05" (UID: "bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.691939 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05-kube-api-access-8k4dg" (OuterVolumeSpecName: "kube-api-access-8k4dg") pod "bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05" (UID: "bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05"). InnerVolumeSpecName "kube-api-access-8k4dg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.691930 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05" (UID: "bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.787374 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2b52465-5f11-4296-87b3-9254f036358f-client-ca\") pod \"a2b52465-5f11-4296-87b3-9254f036358f\" (UID: \"a2b52465-5f11-4296-87b3-9254f036358f\") " Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.787456 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwl5j\" (UniqueName: \"kubernetes.io/projected/a2b52465-5f11-4296-87b3-9254f036358f-kube-api-access-vwl5j\") pod \"a2b52465-5f11-4296-87b3-9254f036358f\" (UID: \"a2b52465-5f11-4296-87b3-9254f036358f\") " Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.787504 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2b52465-5f11-4296-87b3-9254f036358f-config\") pod \"a2b52465-5f11-4296-87b3-9254f036358f\" (UID: \"a2b52465-5f11-4296-87b3-9254f036358f\") " 
Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.787524 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2b52465-5f11-4296-87b3-9254f036358f-serving-cert\") pod \"a2b52465-5f11-4296-87b3-9254f036358f\" (UID: \"a2b52465-5f11-4296-87b3-9254f036358f\") " Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.787709 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8k4dg\" (UniqueName: \"kubernetes.io/projected/bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05-kube-api-access-8k4dg\") on node \"crc\" DevicePath \"\"" Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.787729 4739 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.787741 4739 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05-client-ca\") on node \"crc\" DevicePath \"\"" Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.787751 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.787759 4739 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.788708 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2b52465-5f11-4296-87b3-9254f036358f-config" (OuterVolumeSpecName: "config") pod "a2b52465-5f11-4296-87b3-9254f036358f" (UID: 
"a2b52465-5f11-4296-87b3-9254f036358f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.788782 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2b52465-5f11-4296-87b3-9254f036358f-client-ca" (OuterVolumeSpecName: "client-ca") pod "a2b52465-5f11-4296-87b3-9254f036358f" (UID: "a2b52465-5f11-4296-87b3-9254f036358f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.790970 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2b52465-5f11-4296-87b3-9254f036358f-kube-api-access-vwl5j" (OuterVolumeSpecName: "kube-api-access-vwl5j") pod "a2b52465-5f11-4296-87b3-9254f036358f" (UID: "a2b52465-5f11-4296-87b3-9254f036358f"). InnerVolumeSpecName "kube-api-access-vwl5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.791226 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2b52465-5f11-4296-87b3-9254f036358f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a2b52465-5f11-4296-87b3-9254f036358f" (UID: "a2b52465-5f11-4296-87b3-9254f036358f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.804979 4739 generic.go:334] "Generic (PLEG): container finished" podID="a2b52465-5f11-4296-87b3-9254f036358f" containerID="85f487ef36ac6de67dc94a7fa50b74c42a000a4d6e318c4d233116d7c7c795b5" exitCode=0 Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.805014 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8smnx" event={"ID":"a2b52465-5f11-4296-87b3-9254f036358f","Type":"ContainerDied","Data":"85f487ef36ac6de67dc94a7fa50b74c42a000a4d6e318c4d233116d7c7c795b5"} Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.805061 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8smnx" event={"ID":"a2b52465-5f11-4296-87b3-9254f036358f","Type":"ContainerDied","Data":"5244171ab73ceca288d5da2f86389cf9bac32400a26ced68b725acc69ec01129"} Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.805029 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8smnx" Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.805083 4739 scope.go:117] "RemoveContainer" containerID="85f487ef36ac6de67dc94a7fa50b74c42a000a4d6e318c4d233116d7c7c795b5" Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.808703 4739 generic.go:334] "Generic (PLEG): container finished" podID="bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05" containerID="5f0da66abcc67138ba3c7c34f75b8cd8cf8ea2170040f3f5d1607cc06db8c22a" exitCode=0 Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.808729 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dq28h" Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.808745 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dq28h" event={"ID":"bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05","Type":"ContainerDied","Data":"5f0da66abcc67138ba3c7c34f75b8cd8cf8ea2170040f3f5d1607cc06db8c22a"} Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.808772 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dq28h" event={"ID":"bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05","Type":"ContainerDied","Data":"c0d5e19729c154f130360d4cbb454eed8134e0db9e0803b11afa08ac5670f2ec"} Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.831331 4739 scope.go:117] "RemoveContainer" containerID="85f487ef36ac6de67dc94a7fa50b74c42a000a4d6e318c4d233116d7c7c795b5" Oct 08 22:00:56 crc kubenswrapper[4739]: E1008 22:00:56.833705 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85f487ef36ac6de67dc94a7fa50b74c42a000a4d6e318c4d233116d7c7c795b5\": container with ID starting with 85f487ef36ac6de67dc94a7fa50b74c42a000a4d6e318c4d233116d7c7c795b5 not found: ID does not exist" containerID="85f487ef36ac6de67dc94a7fa50b74c42a000a4d6e318c4d233116d7c7c795b5" Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.833769 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85f487ef36ac6de67dc94a7fa50b74c42a000a4d6e318c4d233116d7c7c795b5"} err="failed to get container status \"85f487ef36ac6de67dc94a7fa50b74c42a000a4d6e318c4d233116d7c7c795b5\": rpc error: code = NotFound desc = could not find container \"85f487ef36ac6de67dc94a7fa50b74c42a000a4d6e318c4d233116d7c7c795b5\": container with ID starting with 85f487ef36ac6de67dc94a7fa50b74c42a000a4d6e318c4d233116d7c7c795b5 not found: ID does not 
exist" Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.833809 4739 scope.go:117] "RemoveContainer" containerID="5f0da66abcc67138ba3c7c34f75b8cd8cf8ea2170040f3f5d1607cc06db8c22a" Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.849997 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dq28h"] Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.856125 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dq28h"] Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.856133 4739 scope.go:117] "RemoveContainer" containerID="5f0da66abcc67138ba3c7c34f75b8cd8cf8ea2170040f3f5d1607cc06db8c22a" Oct 08 22:00:56 crc kubenswrapper[4739]: E1008 22:00:56.856698 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f0da66abcc67138ba3c7c34f75b8cd8cf8ea2170040f3f5d1607cc06db8c22a\": container with ID starting with 5f0da66abcc67138ba3c7c34f75b8cd8cf8ea2170040f3f5d1607cc06db8c22a not found: ID does not exist" containerID="5f0da66abcc67138ba3c7c34f75b8cd8cf8ea2170040f3f5d1607cc06db8c22a" Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.856737 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f0da66abcc67138ba3c7c34f75b8cd8cf8ea2170040f3f5d1607cc06db8c22a"} err="failed to get container status \"5f0da66abcc67138ba3c7c34f75b8cd8cf8ea2170040f3f5d1607cc06db8c22a\": rpc error: code = NotFound desc = could not find container \"5f0da66abcc67138ba3c7c34f75b8cd8cf8ea2170040f3f5d1607cc06db8c22a\": container with ID starting with 5f0da66abcc67138ba3c7c34f75b8cd8cf8ea2170040f3f5d1607cc06db8c22a not found: ID does not exist" Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.862046 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8smnx"] Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.865752 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8smnx"] Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.888883 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwl5j\" (UniqueName: \"kubernetes.io/projected/a2b52465-5f11-4296-87b3-9254f036358f-kube-api-access-vwl5j\") on node \"crc\" DevicePath \"\"" Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.888963 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2b52465-5f11-4296-87b3-9254f036358f-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.888984 4739 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2b52465-5f11-4296-87b3-9254f036358f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:00:56 crc kubenswrapper[4739]: I1008 22:00:56.889005 4739 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2b52465-5f11-4296-87b3-9254f036358f-client-ca\") on node \"crc\" DevicePath \"\"" Oct 08 22:00:57 crc kubenswrapper[4739]: I1008 22:00:57.829449 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2b52465-5f11-4296-87b3-9254f036358f" path="/var/lib/kubelet/pods/a2b52465-5f11-4296-87b3-9254f036358f/volumes" Oct 08 22:00:57 crc kubenswrapper[4739]: I1008 22:00:57.830345 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05" path="/var/lib/kubelet/pods/bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05/volumes" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.260441 4739 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-56d58df8cf-knfmw"] Oct 08 22:00:58 crc kubenswrapper[4739]: E1008 22:00:58.261012 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05" containerName="controller-manager" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.261029 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05" containerName="controller-manager" Oct 08 22:00:58 crc kubenswrapper[4739]: E1008 22:00:58.261058 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2b52465-5f11-4296-87b3-9254f036358f" containerName="route-controller-manager" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.261066 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2b52465-5f11-4296-87b3-9254f036358f" containerName="route-controller-manager" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.261201 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb6a9ed0-3db4-44cf-aa70-1933ae5f7e05" containerName="controller-manager" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.261222 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2b52465-5f11-4296-87b3-9254f036358f" containerName="route-controller-manager" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.261684 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56d58df8cf-knfmw" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.264493 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.264926 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.264951 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.264951 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.265081 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.265892 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.267762 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-76846fbf8-g6fg4"] Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.268954 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-76846fbf8-g6fg4" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.272259 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.272539 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.275678 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.276958 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.276971 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.278423 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.279115 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76846fbf8-g6fg4"] Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.283635 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56d58df8cf-knfmw"] Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.288220 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.355686 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56d58df8cf-knfmw"] Oct 08 22:00:58 
crc kubenswrapper[4739]: E1008 22:00:58.356046 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-j445w serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-56d58df8cf-knfmw" podUID="a830ecca-088f-4bfe-bd7c-14fa084067ec" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.410128 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a830ecca-088f-4bfe-bd7c-14fa084067ec-serving-cert\") pod \"route-controller-manager-56d58df8cf-knfmw\" (UID: \"a830ecca-088f-4bfe-bd7c-14fa084067ec\") " pod="openshift-route-controller-manager/route-controller-manager-56d58df8cf-knfmw" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.410206 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cb4d5d2-7534-45be-829b-ee9241d7cac6-config\") pod \"controller-manager-76846fbf8-g6fg4\" (UID: \"4cb4d5d2-7534-45be-829b-ee9241d7cac6\") " pod="openshift-controller-manager/controller-manager-76846fbf8-g6fg4" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.410242 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j445w\" (UniqueName: \"kubernetes.io/projected/a830ecca-088f-4bfe-bd7c-14fa084067ec-kube-api-access-j445w\") pod \"route-controller-manager-56d58df8cf-knfmw\" (UID: \"a830ecca-088f-4bfe-bd7c-14fa084067ec\") " pod="openshift-route-controller-manager/route-controller-manager-56d58df8cf-knfmw" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.410338 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/4cb4d5d2-7534-45be-829b-ee9241d7cac6-proxy-ca-bundles\") pod \"controller-manager-76846fbf8-g6fg4\" (UID: \"4cb4d5d2-7534-45be-829b-ee9241d7cac6\") " pod="openshift-controller-manager/controller-manager-76846fbf8-g6fg4" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.410376 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4cb4d5d2-7534-45be-829b-ee9241d7cac6-serving-cert\") pod \"controller-manager-76846fbf8-g6fg4\" (UID: \"4cb4d5d2-7534-45be-829b-ee9241d7cac6\") " pod="openshift-controller-manager/controller-manager-76846fbf8-g6fg4" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.410396 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4cb4d5d2-7534-45be-829b-ee9241d7cac6-client-ca\") pod \"controller-manager-76846fbf8-g6fg4\" (UID: \"4cb4d5d2-7534-45be-829b-ee9241d7cac6\") " pod="openshift-controller-manager/controller-manager-76846fbf8-g6fg4" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.410444 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a830ecca-088f-4bfe-bd7c-14fa084067ec-client-ca\") pod \"route-controller-manager-56d58df8cf-knfmw\" (UID: \"a830ecca-088f-4bfe-bd7c-14fa084067ec\") " pod="openshift-route-controller-manager/route-controller-manager-56d58df8cf-knfmw" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.410506 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a830ecca-088f-4bfe-bd7c-14fa084067ec-config\") pod \"route-controller-manager-56d58df8cf-knfmw\" (UID: \"a830ecca-088f-4bfe-bd7c-14fa084067ec\") " pod="openshift-route-controller-manager/route-controller-manager-56d58df8cf-knfmw" 
Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.410722 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfxgg\" (UniqueName: \"kubernetes.io/projected/4cb4d5d2-7534-45be-829b-ee9241d7cac6-kube-api-access-rfxgg\") pod \"controller-manager-76846fbf8-g6fg4\" (UID: \"4cb4d5d2-7534-45be-829b-ee9241d7cac6\") " pod="openshift-controller-manager/controller-manager-76846fbf8-g6fg4" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.512221 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a830ecca-088f-4bfe-bd7c-14fa084067ec-client-ca\") pod \"route-controller-manager-56d58df8cf-knfmw\" (UID: \"a830ecca-088f-4bfe-bd7c-14fa084067ec\") " pod="openshift-route-controller-manager/route-controller-manager-56d58df8cf-knfmw" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.512264 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a830ecca-088f-4bfe-bd7c-14fa084067ec-config\") pod \"route-controller-manager-56d58df8cf-knfmw\" (UID: \"a830ecca-088f-4bfe-bd7c-14fa084067ec\") " pod="openshift-route-controller-manager/route-controller-manager-56d58df8cf-knfmw" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.512313 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfxgg\" (UniqueName: \"kubernetes.io/projected/4cb4d5d2-7534-45be-829b-ee9241d7cac6-kube-api-access-rfxgg\") pod \"controller-manager-76846fbf8-g6fg4\" (UID: \"4cb4d5d2-7534-45be-829b-ee9241d7cac6\") " pod="openshift-controller-manager/controller-manager-76846fbf8-g6fg4" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.512357 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a830ecca-088f-4bfe-bd7c-14fa084067ec-serving-cert\") pod 
\"route-controller-manager-56d58df8cf-knfmw\" (UID: \"a830ecca-088f-4bfe-bd7c-14fa084067ec\") " pod="openshift-route-controller-manager/route-controller-manager-56d58df8cf-knfmw" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.512376 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cb4d5d2-7534-45be-829b-ee9241d7cac6-config\") pod \"controller-manager-76846fbf8-g6fg4\" (UID: \"4cb4d5d2-7534-45be-829b-ee9241d7cac6\") " pod="openshift-controller-manager/controller-manager-76846fbf8-g6fg4" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.512417 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j445w\" (UniqueName: \"kubernetes.io/projected/a830ecca-088f-4bfe-bd7c-14fa084067ec-kube-api-access-j445w\") pod \"route-controller-manager-56d58df8cf-knfmw\" (UID: \"a830ecca-088f-4bfe-bd7c-14fa084067ec\") " pod="openshift-route-controller-manager/route-controller-manager-56d58df8cf-knfmw" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.512439 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4cb4d5d2-7534-45be-829b-ee9241d7cac6-proxy-ca-bundles\") pod \"controller-manager-76846fbf8-g6fg4\" (UID: \"4cb4d5d2-7534-45be-829b-ee9241d7cac6\") " pod="openshift-controller-manager/controller-manager-76846fbf8-g6fg4" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.512457 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4cb4d5d2-7534-45be-829b-ee9241d7cac6-client-ca\") pod \"controller-manager-76846fbf8-g6fg4\" (UID: \"4cb4d5d2-7534-45be-829b-ee9241d7cac6\") " pod="openshift-controller-manager/controller-manager-76846fbf8-g6fg4" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.512472 4739 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4cb4d5d2-7534-45be-829b-ee9241d7cac6-serving-cert\") pod \"controller-manager-76846fbf8-g6fg4\" (UID: \"4cb4d5d2-7534-45be-829b-ee9241d7cac6\") " pod="openshift-controller-manager/controller-manager-76846fbf8-g6fg4" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.513507 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a830ecca-088f-4bfe-bd7c-14fa084067ec-client-ca\") pod \"route-controller-manager-56d58df8cf-knfmw\" (UID: \"a830ecca-088f-4bfe-bd7c-14fa084067ec\") " pod="openshift-route-controller-manager/route-controller-manager-56d58df8cf-knfmw" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.513926 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a830ecca-088f-4bfe-bd7c-14fa084067ec-config\") pod \"route-controller-manager-56d58df8cf-knfmw\" (UID: \"a830ecca-088f-4bfe-bd7c-14fa084067ec\") " pod="openshift-route-controller-manager/route-controller-manager-56d58df8cf-knfmw" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.514003 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4cb4d5d2-7534-45be-829b-ee9241d7cac6-proxy-ca-bundles\") pod \"controller-manager-76846fbf8-g6fg4\" (UID: \"4cb4d5d2-7534-45be-829b-ee9241d7cac6\") " pod="openshift-controller-manager/controller-manager-76846fbf8-g6fg4" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.514538 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cb4d5d2-7534-45be-829b-ee9241d7cac6-config\") pod \"controller-manager-76846fbf8-g6fg4\" (UID: \"4cb4d5d2-7534-45be-829b-ee9241d7cac6\") " pod="openshift-controller-manager/controller-manager-76846fbf8-g6fg4" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 
22:00:58.514634 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4cb4d5d2-7534-45be-829b-ee9241d7cac6-client-ca\") pod \"controller-manager-76846fbf8-g6fg4\" (UID: \"4cb4d5d2-7534-45be-829b-ee9241d7cac6\") " pod="openshift-controller-manager/controller-manager-76846fbf8-g6fg4" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.519603 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4cb4d5d2-7534-45be-829b-ee9241d7cac6-serving-cert\") pod \"controller-manager-76846fbf8-g6fg4\" (UID: \"4cb4d5d2-7534-45be-829b-ee9241d7cac6\") " pod="openshift-controller-manager/controller-manager-76846fbf8-g6fg4" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.523743 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a830ecca-088f-4bfe-bd7c-14fa084067ec-serving-cert\") pod \"route-controller-manager-56d58df8cf-knfmw\" (UID: \"a830ecca-088f-4bfe-bd7c-14fa084067ec\") " pod="openshift-route-controller-manager/route-controller-manager-56d58df8cf-knfmw" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.528593 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfxgg\" (UniqueName: \"kubernetes.io/projected/4cb4d5d2-7534-45be-829b-ee9241d7cac6-kube-api-access-rfxgg\") pod \"controller-manager-76846fbf8-g6fg4\" (UID: \"4cb4d5d2-7534-45be-829b-ee9241d7cac6\") " pod="openshift-controller-manager/controller-manager-76846fbf8-g6fg4" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.541474 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j445w\" (UniqueName: \"kubernetes.io/projected/a830ecca-088f-4bfe-bd7c-14fa084067ec-kube-api-access-j445w\") pod \"route-controller-manager-56d58df8cf-knfmw\" (UID: \"a830ecca-088f-4bfe-bd7c-14fa084067ec\") " 
pod="openshift-route-controller-manager/route-controller-manager-56d58df8cf-knfmw" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.608602 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76846fbf8-g6fg4" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.816677 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76846fbf8-g6fg4"] Oct 08 22:00:58 crc kubenswrapper[4739]: W1008 22:00:58.823974 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cb4d5d2_7534_45be_829b_ee9241d7cac6.slice/crio-d45af0b63d5130ac86ab8fc73245340332a7a55ac6b48d292bc1b1f90982b580 WatchSource:0}: Error finding container d45af0b63d5130ac86ab8fc73245340332a7a55ac6b48d292bc1b1f90982b580: Status 404 returned error can't find the container with id d45af0b63d5130ac86ab8fc73245340332a7a55ac6b48d292bc1b1f90982b580 Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.826848 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56d58df8cf-knfmw" Oct 08 22:00:58 crc kubenswrapper[4739]: I1008 22:00:58.839519 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56d58df8cf-knfmw" Oct 08 22:00:59 crc kubenswrapper[4739]: I1008 22:00:59.018327 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j445w\" (UniqueName: \"kubernetes.io/projected/a830ecca-088f-4bfe-bd7c-14fa084067ec-kube-api-access-j445w\") pod \"a830ecca-088f-4bfe-bd7c-14fa084067ec\" (UID: \"a830ecca-088f-4bfe-bd7c-14fa084067ec\") " Oct 08 22:00:59 crc kubenswrapper[4739]: I1008 22:00:59.018384 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a830ecca-088f-4bfe-bd7c-14fa084067ec-client-ca\") pod \"a830ecca-088f-4bfe-bd7c-14fa084067ec\" (UID: \"a830ecca-088f-4bfe-bd7c-14fa084067ec\") " Oct 08 22:00:59 crc kubenswrapper[4739]: I1008 22:00:59.018443 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a830ecca-088f-4bfe-bd7c-14fa084067ec-config\") pod \"a830ecca-088f-4bfe-bd7c-14fa084067ec\" (UID: \"a830ecca-088f-4bfe-bd7c-14fa084067ec\") " Oct 08 22:00:59 crc kubenswrapper[4739]: I1008 22:00:59.018478 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a830ecca-088f-4bfe-bd7c-14fa084067ec-serving-cert\") pod \"a830ecca-088f-4bfe-bd7c-14fa084067ec\" (UID: \"a830ecca-088f-4bfe-bd7c-14fa084067ec\") " Oct 08 22:00:59 crc kubenswrapper[4739]: I1008 22:00:59.018898 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a830ecca-088f-4bfe-bd7c-14fa084067ec-client-ca" (OuterVolumeSpecName: "client-ca") pod "a830ecca-088f-4bfe-bd7c-14fa084067ec" (UID: "a830ecca-088f-4bfe-bd7c-14fa084067ec"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:00:59 crc kubenswrapper[4739]: I1008 22:00:59.019084 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a830ecca-088f-4bfe-bd7c-14fa084067ec-config" (OuterVolumeSpecName: "config") pod "a830ecca-088f-4bfe-bd7c-14fa084067ec" (UID: "a830ecca-088f-4bfe-bd7c-14fa084067ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:00:59 crc kubenswrapper[4739]: I1008 22:00:59.023356 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a830ecca-088f-4bfe-bd7c-14fa084067ec-kube-api-access-j445w" (OuterVolumeSpecName: "kube-api-access-j445w") pod "a830ecca-088f-4bfe-bd7c-14fa084067ec" (UID: "a830ecca-088f-4bfe-bd7c-14fa084067ec"). InnerVolumeSpecName "kube-api-access-j445w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:00:59 crc kubenswrapper[4739]: I1008 22:00:59.024515 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a830ecca-088f-4bfe-bd7c-14fa084067ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a830ecca-088f-4bfe-bd7c-14fa084067ec" (UID: "a830ecca-088f-4bfe-bd7c-14fa084067ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:00:59 crc kubenswrapper[4739]: I1008 22:00:59.119520 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a830ecca-088f-4bfe-bd7c-14fa084067ec-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:00:59 crc kubenswrapper[4739]: I1008 22:00:59.119899 4739 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a830ecca-088f-4bfe-bd7c-14fa084067ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:00:59 crc kubenswrapper[4739]: I1008 22:00:59.119913 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j445w\" (UniqueName: \"kubernetes.io/projected/a830ecca-088f-4bfe-bd7c-14fa084067ec-kube-api-access-j445w\") on node \"crc\" DevicePath \"\"" Oct 08 22:00:59 crc kubenswrapper[4739]: I1008 22:00:59.119924 4739 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a830ecca-088f-4bfe-bd7c-14fa084067ec-client-ca\") on node \"crc\" DevicePath \"\"" Oct 08 22:00:59 crc kubenswrapper[4739]: I1008 22:00:59.857261 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56d58df8cf-knfmw" Oct 08 22:00:59 crc kubenswrapper[4739]: I1008 22:00:59.857298 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76846fbf8-g6fg4" event={"ID":"4cb4d5d2-7534-45be-829b-ee9241d7cac6","Type":"ContainerStarted","Data":"4405de1e9ede28f8086d9b92c0a8a3b3c11848f5376b6da9335c2935f4b37057"} Oct 08 22:00:59 crc kubenswrapper[4739]: I1008 22:00:59.858717 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76846fbf8-g6fg4" event={"ID":"4cb4d5d2-7534-45be-829b-ee9241d7cac6","Type":"ContainerStarted","Data":"d45af0b63d5130ac86ab8fc73245340332a7a55ac6b48d292bc1b1f90982b580"} Oct 08 22:00:59 crc kubenswrapper[4739]: I1008 22:00:59.858745 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-76846fbf8-g6fg4" Oct 08 22:00:59 crc kubenswrapper[4739]: I1008 22:00:59.869085 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-76846fbf8-g6fg4" Oct 08 22:00:59 crc kubenswrapper[4739]: I1008 22:00:59.899919 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-76846fbf8-g6fg4" podStartSLOduration=3.899906655 podStartE2EDuration="3.899906655s" podCreationTimestamp="2025-10-08 22:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:00:59.896008811 +0000 UTC m=+759.721394561" watchObservedRunningTime="2025-10-08 22:00:59.899906655 +0000 UTC m=+759.725292405" Oct 08 22:00:59 crc kubenswrapper[4739]: I1008 22:00:59.952190 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ddc48f4cb-tbfs9"] Oct 08 
22:00:59 crc kubenswrapper[4739]: I1008 22:00:59.952898 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ddc48f4cb-tbfs9" Oct 08 22:00:59 crc kubenswrapper[4739]: I1008 22:00:59.961280 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 08 22:00:59 crc kubenswrapper[4739]: I1008 22:00:59.961414 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 08 22:00:59 crc kubenswrapper[4739]: I1008 22:00:59.961499 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 08 22:00:59 crc kubenswrapper[4739]: I1008 22:00:59.961501 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 08 22:00:59 crc kubenswrapper[4739]: I1008 22:00:59.961610 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 08 22:00:59 crc kubenswrapper[4739]: I1008 22:00:59.961660 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 08 22:00:59 crc kubenswrapper[4739]: I1008 22:00:59.964744 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56d58df8cf-knfmw"] Oct 08 22:00:59 crc kubenswrapper[4739]: I1008 22:00:59.972687 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ddc48f4cb-tbfs9"] Oct 08 22:00:59 crc kubenswrapper[4739]: I1008 22:00:59.980136 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56d58df8cf-knfmw"] Oct 08 22:01:00 crc 
kubenswrapper[4739]: I1008 22:01:00.129628 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a89e0b20-76b3-453c-a50a-2dbfaaa213e1-client-ca\") pod \"route-controller-manager-ddc48f4cb-tbfs9\" (UID: \"a89e0b20-76b3-453c-a50a-2dbfaaa213e1\") " pod="openshift-route-controller-manager/route-controller-manager-ddc48f4cb-tbfs9" Oct 08 22:01:00 crc kubenswrapper[4739]: I1008 22:01:00.130465 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a89e0b20-76b3-453c-a50a-2dbfaaa213e1-config\") pod \"route-controller-manager-ddc48f4cb-tbfs9\" (UID: \"a89e0b20-76b3-453c-a50a-2dbfaaa213e1\") " pod="openshift-route-controller-manager/route-controller-manager-ddc48f4cb-tbfs9" Oct 08 22:01:00 crc kubenswrapper[4739]: I1008 22:01:00.130523 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a89e0b20-76b3-453c-a50a-2dbfaaa213e1-serving-cert\") pod \"route-controller-manager-ddc48f4cb-tbfs9\" (UID: \"a89e0b20-76b3-453c-a50a-2dbfaaa213e1\") " pod="openshift-route-controller-manager/route-controller-manager-ddc48f4cb-tbfs9" Oct 08 22:01:00 crc kubenswrapper[4739]: I1008 22:01:00.130552 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2lvc\" (UniqueName: \"kubernetes.io/projected/a89e0b20-76b3-453c-a50a-2dbfaaa213e1-kube-api-access-p2lvc\") pod \"route-controller-manager-ddc48f4cb-tbfs9\" (UID: \"a89e0b20-76b3-453c-a50a-2dbfaaa213e1\") " pod="openshift-route-controller-manager/route-controller-manager-ddc48f4cb-tbfs9" Oct 08 22:01:00 crc kubenswrapper[4739]: I1008 22:01:00.231935 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/a89e0b20-76b3-453c-a50a-2dbfaaa213e1-client-ca\") pod \"route-controller-manager-ddc48f4cb-tbfs9\" (UID: \"a89e0b20-76b3-453c-a50a-2dbfaaa213e1\") " pod="openshift-route-controller-manager/route-controller-manager-ddc48f4cb-tbfs9" Oct 08 22:01:00 crc kubenswrapper[4739]: I1008 22:01:00.231982 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a89e0b20-76b3-453c-a50a-2dbfaaa213e1-config\") pod \"route-controller-manager-ddc48f4cb-tbfs9\" (UID: \"a89e0b20-76b3-453c-a50a-2dbfaaa213e1\") " pod="openshift-route-controller-manager/route-controller-manager-ddc48f4cb-tbfs9" Oct 08 22:01:00 crc kubenswrapper[4739]: I1008 22:01:00.232011 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a89e0b20-76b3-453c-a50a-2dbfaaa213e1-serving-cert\") pod \"route-controller-manager-ddc48f4cb-tbfs9\" (UID: \"a89e0b20-76b3-453c-a50a-2dbfaaa213e1\") " pod="openshift-route-controller-manager/route-controller-manager-ddc48f4cb-tbfs9" Oct 08 22:01:00 crc kubenswrapper[4739]: I1008 22:01:00.232033 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2lvc\" (UniqueName: \"kubernetes.io/projected/a89e0b20-76b3-453c-a50a-2dbfaaa213e1-kube-api-access-p2lvc\") pod \"route-controller-manager-ddc48f4cb-tbfs9\" (UID: \"a89e0b20-76b3-453c-a50a-2dbfaaa213e1\") " pod="openshift-route-controller-manager/route-controller-manager-ddc48f4cb-tbfs9" Oct 08 22:01:00 crc kubenswrapper[4739]: I1008 22:01:00.233347 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a89e0b20-76b3-453c-a50a-2dbfaaa213e1-client-ca\") pod \"route-controller-manager-ddc48f4cb-tbfs9\" (UID: \"a89e0b20-76b3-453c-a50a-2dbfaaa213e1\") " pod="openshift-route-controller-manager/route-controller-manager-ddc48f4cb-tbfs9" Oct 08 22:01:00 
crc kubenswrapper[4739]: I1008 22:01:00.236303 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a89e0b20-76b3-453c-a50a-2dbfaaa213e1-config\") pod \"route-controller-manager-ddc48f4cb-tbfs9\" (UID: \"a89e0b20-76b3-453c-a50a-2dbfaaa213e1\") " pod="openshift-route-controller-manager/route-controller-manager-ddc48f4cb-tbfs9" Oct 08 22:01:00 crc kubenswrapper[4739]: I1008 22:01:00.240021 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a89e0b20-76b3-453c-a50a-2dbfaaa213e1-serving-cert\") pod \"route-controller-manager-ddc48f4cb-tbfs9\" (UID: \"a89e0b20-76b3-453c-a50a-2dbfaaa213e1\") " pod="openshift-route-controller-manager/route-controller-manager-ddc48f4cb-tbfs9" Oct 08 22:01:00 crc kubenswrapper[4739]: I1008 22:01:00.260981 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2lvc\" (UniqueName: \"kubernetes.io/projected/a89e0b20-76b3-453c-a50a-2dbfaaa213e1-kube-api-access-p2lvc\") pod \"route-controller-manager-ddc48f4cb-tbfs9\" (UID: \"a89e0b20-76b3-453c-a50a-2dbfaaa213e1\") " pod="openshift-route-controller-manager/route-controller-manager-ddc48f4cb-tbfs9" Oct 08 22:01:00 crc kubenswrapper[4739]: I1008 22:01:00.269222 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ddc48f4cb-tbfs9" Oct 08 22:01:00 crc kubenswrapper[4739]: I1008 22:01:00.495471 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ddc48f4cb-tbfs9"] Oct 08 22:01:00 crc kubenswrapper[4739]: I1008 22:01:00.863695 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ddc48f4cb-tbfs9" event={"ID":"a89e0b20-76b3-453c-a50a-2dbfaaa213e1","Type":"ContainerStarted","Data":"b36783b4edb5b4367870012c7ea224ccc60cd87de2af4894b9583bd1fa701b94"} Oct 08 22:01:00 crc kubenswrapper[4739]: I1008 22:01:00.864183 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ddc48f4cb-tbfs9" event={"ID":"a89e0b20-76b3-453c-a50a-2dbfaaa213e1","Type":"ContainerStarted","Data":"e09840eb0b701c50225a9539ee8df3025cc0c681cc347e2290ae40f6f07c20d3"} Oct 08 22:01:00 crc kubenswrapper[4739]: I1008 22:01:00.885043 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-ddc48f4cb-tbfs9" podStartSLOduration=2.885025521 podStartE2EDuration="2.885025521s" podCreationTimestamp="2025-10-08 22:00:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:01:00.88250229 +0000 UTC m=+760.707888040" watchObservedRunningTime="2025-10-08 22:01:00.885025521 +0000 UTC m=+760.710411271" Oct 08 22:01:01 crc kubenswrapper[4739]: I1008 22:01:01.829199 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a830ecca-088f-4bfe-bd7c-14fa084067ec" path="/var/lib/kubelet/pods/a830ecca-088f-4bfe-bd7c-14fa084067ec/volumes" Oct 08 22:01:01 crc kubenswrapper[4739]: I1008 22:01:01.870354 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-ddc48f4cb-tbfs9" Oct 08 22:01:01 crc kubenswrapper[4739]: I1008 22:01:01.875955 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-ddc48f4cb-tbfs9" Oct 08 22:01:06 crc kubenswrapper[4739]: I1008 22:01:06.872300 4739 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 08 22:01:10 crc kubenswrapper[4739]: I1008 22:01:10.605450 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2"] Oct 08 22:01:10 crc kubenswrapper[4739]: I1008 22:01:10.606861 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2" Oct 08 22:01:10 crc kubenswrapper[4739]: I1008 22:01:10.616324 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2"] Oct 08 22:01:10 crc kubenswrapper[4739]: I1008 22:01:10.621567 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 08 22:01:10 crc kubenswrapper[4739]: I1008 22:01:10.773747 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/491673c0-7c8a-4f56-95c4-c06e79a87512-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2\" (UID: \"491673c0-7c8a-4f56-95c4-c06e79a87512\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2" Oct 08 22:01:10 crc kubenswrapper[4739]: I1008 22:01:10.773808 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxk4s\" (UniqueName: 
\"kubernetes.io/projected/491673c0-7c8a-4f56-95c4-c06e79a87512-kube-api-access-lxk4s\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2\" (UID: \"491673c0-7c8a-4f56-95c4-c06e79a87512\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2" Oct 08 22:01:10 crc kubenswrapper[4739]: I1008 22:01:10.774089 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/491673c0-7c8a-4f56-95c4-c06e79a87512-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2\" (UID: \"491673c0-7c8a-4f56-95c4-c06e79a87512\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2" Oct 08 22:01:10 crc kubenswrapper[4739]: I1008 22:01:10.875741 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/491673c0-7c8a-4f56-95c4-c06e79a87512-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2\" (UID: \"491673c0-7c8a-4f56-95c4-c06e79a87512\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2" Oct 08 22:01:10 crc kubenswrapper[4739]: I1008 22:01:10.875810 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/491673c0-7c8a-4f56-95c4-c06e79a87512-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2\" (UID: \"491673c0-7c8a-4f56-95c4-c06e79a87512\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2" Oct 08 22:01:10 crc kubenswrapper[4739]: I1008 22:01:10.875834 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxk4s\" (UniqueName: \"kubernetes.io/projected/491673c0-7c8a-4f56-95c4-c06e79a87512-kube-api-access-lxk4s\") pod 
\"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2\" (UID: \"491673c0-7c8a-4f56-95c4-c06e79a87512\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2" Oct 08 22:01:10 crc kubenswrapper[4739]: I1008 22:01:10.876516 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/491673c0-7c8a-4f56-95c4-c06e79a87512-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2\" (UID: \"491673c0-7c8a-4f56-95c4-c06e79a87512\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2" Oct 08 22:01:10 crc kubenswrapper[4739]: I1008 22:01:10.876729 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/491673c0-7c8a-4f56-95c4-c06e79a87512-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2\" (UID: \"491673c0-7c8a-4f56-95c4-c06e79a87512\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2" Oct 08 22:01:10 crc kubenswrapper[4739]: I1008 22:01:10.896307 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxk4s\" (UniqueName: \"kubernetes.io/projected/491673c0-7c8a-4f56-95c4-c06e79a87512-kube-api-access-lxk4s\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2\" (UID: \"491673c0-7c8a-4f56-95c4-c06e79a87512\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2" Oct 08 22:01:10 crc kubenswrapper[4739]: I1008 22:01:10.935168 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2" Oct 08 22:01:11 crc kubenswrapper[4739]: I1008 22:01:11.410379 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2"] Oct 08 22:01:11 crc kubenswrapper[4739]: W1008 22:01:11.423137 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod491673c0_7c8a_4f56_95c4_c06e79a87512.slice/crio-320b0c38997e8ca52c60a2a38fad24b72cefcd88f57c5a7daf66469f956e339a WatchSource:0}: Error finding container 320b0c38997e8ca52c60a2a38fad24b72cefcd88f57c5a7daf66469f956e339a: Status 404 returned error can't find the container with id 320b0c38997e8ca52c60a2a38fad24b72cefcd88f57c5a7daf66469f956e339a Oct 08 22:01:11 crc kubenswrapper[4739]: E1008 22:01:11.647621 4739 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod491673c0_7c8a_4f56_95c4_c06e79a87512.slice/crio-conmon-6c1acb092dae9f9338e04afa5a6bbad93d03d37c14856de019416184b9bbfccf.scope\": RecentStats: unable to find data in memory cache]" Oct 08 22:01:11 crc kubenswrapper[4739]: I1008 22:01:11.938379 4739 generic.go:334] "Generic (PLEG): container finished" podID="491673c0-7c8a-4f56-95c4-c06e79a87512" containerID="6c1acb092dae9f9338e04afa5a6bbad93d03d37c14856de019416184b9bbfccf" exitCode=0 Oct 08 22:01:11 crc kubenswrapper[4739]: I1008 22:01:11.938428 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2" event={"ID":"491673c0-7c8a-4f56-95c4-c06e79a87512","Type":"ContainerDied","Data":"6c1acb092dae9f9338e04afa5a6bbad93d03d37c14856de019416184b9bbfccf"} Oct 08 22:01:11 crc kubenswrapper[4739]: I1008 22:01:11.938458 4739 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2" event={"ID":"491673c0-7c8a-4f56-95c4-c06e79a87512","Type":"ContainerStarted","Data":"320b0c38997e8ca52c60a2a38fad24b72cefcd88f57c5a7daf66469f956e339a"} Oct 08 22:01:12 crc kubenswrapper[4739]: I1008 22:01:12.972872 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9x4km"] Oct 08 22:01:12 crc kubenswrapper[4739]: I1008 22:01:12.975374 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9x4km" Oct 08 22:01:13 crc kubenswrapper[4739]: I1008 22:01:13.033361 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9x4km"] Oct 08 22:01:13 crc kubenswrapper[4739]: I1008 22:01:13.121290 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a98c78af-858f-4d7b-92a3-17f57764c156-utilities\") pod \"redhat-operators-9x4km\" (UID: \"a98c78af-858f-4d7b-92a3-17f57764c156\") " pod="openshift-marketplace/redhat-operators-9x4km" Oct 08 22:01:13 crc kubenswrapper[4739]: I1008 22:01:13.121388 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzfxb\" (UniqueName: \"kubernetes.io/projected/a98c78af-858f-4d7b-92a3-17f57764c156-kube-api-access-fzfxb\") pod \"redhat-operators-9x4km\" (UID: \"a98c78af-858f-4d7b-92a3-17f57764c156\") " pod="openshift-marketplace/redhat-operators-9x4km" Oct 08 22:01:13 crc kubenswrapper[4739]: I1008 22:01:13.121532 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a98c78af-858f-4d7b-92a3-17f57764c156-catalog-content\") pod \"redhat-operators-9x4km\" (UID: \"a98c78af-858f-4d7b-92a3-17f57764c156\") " 
pod="openshift-marketplace/redhat-operators-9x4km" Oct 08 22:01:13 crc kubenswrapper[4739]: I1008 22:01:13.222482 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a98c78af-858f-4d7b-92a3-17f57764c156-utilities\") pod \"redhat-operators-9x4km\" (UID: \"a98c78af-858f-4d7b-92a3-17f57764c156\") " pod="openshift-marketplace/redhat-operators-9x4km" Oct 08 22:01:13 crc kubenswrapper[4739]: I1008 22:01:13.222564 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzfxb\" (UniqueName: \"kubernetes.io/projected/a98c78af-858f-4d7b-92a3-17f57764c156-kube-api-access-fzfxb\") pod \"redhat-operators-9x4km\" (UID: \"a98c78af-858f-4d7b-92a3-17f57764c156\") " pod="openshift-marketplace/redhat-operators-9x4km" Oct 08 22:01:13 crc kubenswrapper[4739]: I1008 22:01:13.222599 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a98c78af-858f-4d7b-92a3-17f57764c156-catalog-content\") pod \"redhat-operators-9x4km\" (UID: \"a98c78af-858f-4d7b-92a3-17f57764c156\") " pod="openshift-marketplace/redhat-operators-9x4km" Oct 08 22:01:13 crc kubenswrapper[4739]: I1008 22:01:13.223084 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a98c78af-858f-4d7b-92a3-17f57764c156-utilities\") pod \"redhat-operators-9x4km\" (UID: \"a98c78af-858f-4d7b-92a3-17f57764c156\") " pod="openshift-marketplace/redhat-operators-9x4km" Oct 08 22:01:13 crc kubenswrapper[4739]: I1008 22:01:13.223173 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a98c78af-858f-4d7b-92a3-17f57764c156-catalog-content\") pod \"redhat-operators-9x4km\" (UID: \"a98c78af-858f-4d7b-92a3-17f57764c156\") " pod="openshift-marketplace/redhat-operators-9x4km" Oct 08 22:01:13 crc 
kubenswrapper[4739]: I1008 22:01:13.244515 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzfxb\" (UniqueName: \"kubernetes.io/projected/a98c78af-858f-4d7b-92a3-17f57764c156-kube-api-access-fzfxb\") pod \"redhat-operators-9x4km\" (UID: \"a98c78af-858f-4d7b-92a3-17f57764c156\") " pod="openshift-marketplace/redhat-operators-9x4km" Oct 08 22:01:13 crc kubenswrapper[4739]: I1008 22:01:13.300124 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9x4km" Oct 08 22:01:13 crc kubenswrapper[4739]: I1008 22:01:13.781722 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9x4km"] Oct 08 22:01:13 crc kubenswrapper[4739]: W1008 22:01:13.791713 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda98c78af_858f_4d7b_92a3_17f57764c156.slice/crio-930fdb2bc704bad98713986a3a6faa0ce4c603dcb80241b41bbf42159b254bc4 WatchSource:0}: Error finding container 930fdb2bc704bad98713986a3a6faa0ce4c603dcb80241b41bbf42159b254bc4: Status 404 returned error can't find the container with id 930fdb2bc704bad98713986a3a6faa0ce4c603dcb80241b41bbf42159b254bc4 Oct 08 22:01:13 crc kubenswrapper[4739]: I1008 22:01:13.950503 4739 generic.go:334] "Generic (PLEG): container finished" podID="491673c0-7c8a-4f56-95c4-c06e79a87512" containerID="f09094b8c5b545fe276dc6e3b125b8cef48c1961b3dd2bd96f0fb4d204859075" exitCode=0 Oct 08 22:01:13 crc kubenswrapper[4739]: I1008 22:01:13.950565 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2" event={"ID":"491673c0-7c8a-4f56-95c4-c06e79a87512","Type":"ContainerDied","Data":"f09094b8c5b545fe276dc6e3b125b8cef48c1961b3dd2bd96f0fb4d204859075"} Oct 08 22:01:13 crc kubenswrapper[4739]: I1008 22:01:13.954124 4739 generic.go:334] "Generic (PLEG): container 
finished" podID="a98c78af-858f-4d7b-92a3-17f57764c156" containerID="20234d3b258cc5c6c5128ce90f17e30079a37151257e78b3b93e57d8ed7e3e8c" exitCode=0 Oct 08 22:01:13 crc kubenswrapper[4739]: I1008 22:01:13.954182 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9x4km" event={"ID":"a98c78af-858f-4d7b-92a3-17f57764c156","Type":"ContainerDied","Data":"20234d3b258cc5c6c5128ce90f17e30079a37151257e78b3b93e57d8ed7e3e8c"} Oct 08 22:01:13 crc kubenswrapper[4739]: I1008 22:01:13.954207 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9x4km" event={"ID":"a98c78af-858f-4d7b-92a3-17f57764c156","Type":"ContainerStarted","Data":"930fdb2bc704bad98713986a3a6faa0ce4c603dcb80241b41bbf42159b254bc4"} Oct 08 22:01:14 crc kubenswrapper[4739]: I1008 22:01:14.965793 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9x4km" event={"ID":"a98c78af-858f-4d7b-92a3-17f57764c156","Type":"ContainerStarted","Data":"fce18f04e3eced98470aa09ceebaec000d21293ae95569d7ca579c8657792695"} Oct 08 22:01:14 crc kubenswrapper[4739]: I1008 22:01:14.969639 4739 generic.go:334] "Generic (PLEG): container finished" podID="491673c0-7c8a-4f56-95c4-c06e79a87512" containerID="38d58556c58184c43157e0a121a92d9ee3ebc7945eb9095cf6176563c23c9593" exitCode=0 Oct 08 22:01:14 crc kubenswrapper[4739]: I1008 22:01:14.969684 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2" event={"ID":"491673c0-7c8a-4f56-95c4-c06e79a87512","Type":"ContainerDied","Data":"38d58556c58184c43157e0a121a92d9ee3ebc7945eb9095cf6176563c23c9593"} Oct 08 22:01:15 crc kubenswrapper[4739]: I1008 22:01:15.976697 4739 generic.go:334] "Generic (PLEG): container finished" podID="a98c78af-858f-4d7b-92a3-17f57764c156" containerID="fce18f04e3eced98470aa09ceebaec000d21293ae95569d7ca579c8657792695" exitCode=0 Oct 08 22:01:15 crc 
kubenswrapper[4739]: I1008 22:01:15.976788 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9x4km" event={"ID":"a98c78af-858f-4d7b-92a3-17f57764c156","Type":"ContainerDied","Data":"fce18f04e3eced98470aa09ceebaec000d21293ae95569d7ca579c8657792695"} Oct 08 22:01:16 crc kubenswrapper[4739]: I1008 22:01:16.380079 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2" Oct 08 22:01:16 crc kubenswrapper[4739]: I1008 22:01:16.566746 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/491673c0-7c8a-4f56-95c4-c06e79a87512-util\") pod \"491673c0-7c8a-4f56-95c4-c06e79a87512\" (UID: \"491673c0-7c8a-4f56-95c4-c06e79a87512\") " Oct 08 22:01:16 crc kubenswrapper[4739]: I1008 22:01:16.566880 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxk4s\" (UniqueName: \"kubernetes.io/projected/491673c0-7c8a-4f56-95c4-c06e79a87512-kube-api-access-lxk4s\") pod \"491673c0-7c8a-4f56-95c4-c06e79a87512\" (UID: \"491673c0-7c8a-4f56-95c4-c06e79a87512\") " Oct 08 22:01:16 crc kubenswrapper[4739]: I1008 22:01:16.566953 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/491673c0-7c8a-4f56-95c4-c06e79a87512-bundle\") pod \"491673c0-7c8a-4f56-95c4-c06e79a87512\" (UID: \"491673c0-7c8a-4f56-95c4-c06e79a87512\") " Oct 08 22:01:16 crc kubenswrapper[4739]: I1008 22:01:16.568955 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/491673c0-7c8a-4f56-95c4-c06e79a87512-bundle" (OuterVolumeSpecName: "bundle") pod "491673c0-7c8a-4f56-95c4-c06e79a87512" (UID: "491673c0-7c8a-4f56-95c4-c06e79a87512"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:01:16 crc kubenswrapper[4739]: I1008 22:01:16.574368 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/491673c0-7c8a-4f56-95c4-c06e79a87512-kube-api-access-lxk4s" (OuterVolumeSpecName: "kube-api-access-lxk4s") pod "491673c0-7c8a-4f56-95c4-c06e79a87512" (UID: "491673c0-7c8a-4f56-95c4-c06e79a87512"). InnerVolumeSpecName "kube-api-access-lxk4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:01:16 crc kubenswrapper[4739]: I1008 22:01:16.582676 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/491673c0-7c8a-4f56-95c4-c06e79a87512-util" (OuterVolumeSpecName: "util") pod "491673c0-7c8a-4f56-95c4-c06e79a87512" (UID: "491673c0-7c8a-4f56-95c4-c06e79a87512"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:01:16 crc kubenswrapper[4739]: I1008 22:01:16.668292 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxk4s\" (UniqueName: \"kubernetes.io/projected/491673c0-7c8a-4f56-95c4-c06e79a87512-kube-api-access-lxk4s\") on node \"crc\" DevicePath \"\"" Oct 08 22:01:16 crc kubenswrapper[4739]: I1008 22:01:16.668319 4739 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/491673c0-7c8a-4f56-95c4-c06e79a87512-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:01:16 crc kubenswrapper[4739]: I1008 22:01:16.668329 4739 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/491673c0-7c8a-4f56-95c4-c06e79a87512-util\") on node \"crc\" DevicePath \"\"" Oct 08 22:01:16 crc kubenswrapper[4739]: I1008 22:01:16.989247 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9x4km" 
event={"ID":"a98c78af-858f-4d7b-92a3-17f57764c156","Type":"ContainerStarted","Data":"04cf24130eccc8f691b40031d0771f23dccc8682582969dc4f2fe786c68db7ec"} Oct 08 22:01:16 crc kubenswrapper[4739]: I1008 22:01:16.992684 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2" event={"ID":"491673c0-7c8a-4f56-95c4-c06e79a87512","Type":"ContainerDied","Data":"320b0c38997e8ca52c60a2a38fad24b72cefcd88f57c5a7daf66469f956e339a"} Oct 08 22:01:16 crc kubenswrapper[4739]: I1008 22:01:16.992753 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="320b0c38997e8ca52c60a2a38fad24b72cefcd88f57c5a7daf66469f956e339a" Oct 08 22:01:16 crc kubenswrapper[4739]: I1008 22:01:16.992901 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2" Oct 08 22:01:17 crc kubenswrapper[4739]: I1008 22:01:17.018068 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9x4km" podStartSLOduration=2.5942515 podStartE2EDuration="5.018046433s" podCreationTimestamp="2025-10-08 22:01:12 +0000 UTC" firstStartedPulling="2025-10-08 22:01:13.964087191 +0000 UTC m=+773.789472941" lastFinishedPulling="2025-10-08 22:01:16.387882124 +0000 UTC m=+776.213267874" observedRunningTime="2025-10-08 22:01:17.013746478 +0000 UTC m=+776.839132238" watchObservedRunningTime="2025-10-08 22:01:17.018046433 +0000 UTC m=+776.843432183" Oct 08 22:01:19 crc kubenswrapper[4739]: I1008 22:01:19.519287 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-mt6zh"] Oct 08 22:01:19 crc kubenswrapper[4739]: E1008 22:01:19.519614 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="491673c0-7c8a-4f56-95c4-c06e79a87512" containerName="util" Oct 08 22:01:19 crc 
kubenswrapper[4739]: I1008 22:01:19.519633 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="491673c0-7c8a-4f56-95c4-c06e79a87512" containerName="util" Oct 08 22:01:19 crc kubenswrapper[4739]: E1008 22:01:19.519647 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="491673c0-7c8a-4f56-95c4-c06e79a87512" containerName="extract" Oct 08 22:01:19 crc kubenswrapper[4739]: I1008 22:01:19.519659 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="491673c0-7c8a-4f56-95c4-c06e79a87512" containerName="extract" Oct 08 22:01:19 crc kubenswrapper[4739]: E1008 22:01:19.519677 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="491673c0-7c8a-4f56-95c4-c06e79a87512" containerName="pull" Oct 08 22:01:19 crc kubenswrapper[4739]: I1008 22:01:19.519688 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="491673c0-7c8a-4f56-95c4-c06e79a87512" containerName="pull" Oct 08 22:01:19 crc kubenswrapper[4739]: I1008 22:01:19.519850 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="491673c0-7c8a-4f56-95c4-c06e79a87512" containerName="extract" Oct 08 22:01:19 crc kubenswrapper[4739]: I1008 22:01:19.520509 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-mt6zh" Oct 08 22:01:19 crc kubenswrapper[4739]: I1008 22:01:19.522066 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-rmsfs" Oct 08 22:01:19 crc kubenswrapper[4739]: I1008 22:01:19.522321 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 08 22:01:19 crc kubenswrapper[4739]: I1008 22:01:19.522689 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 08 22:01:19 crc kubenswrapper[4739]: I1008 22:01:19.530754 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-mt6zh"] Oct 08 22:01:19 crc kubenswrapper[4739]: I1008 22:01:19.715377 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwzp9\" (UniqueName: \"kubernetes.io/projected/d8c93273-ced1-4664-9585-47ce49a29326-kube-api-access-mwzp9\") pod \"nmstate-operator-858ddd8f98-mt6zh\" (UID: \"d8c93273-ced1-4664-9585-47ce49a29326\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-mt6zh" Oct 08 22:01:19 crc kubenswrapper[4739]: I1008 22:01:19.817320 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwzp9\" (UniqueName: \"kubernetes.io/projected/d8c93273-ced1-4664-9585-47ce49a29326-kube-api-access-mwzp9\") pod \"nmstate-operator-858ddd8f98-mt6zh\" (UID: \"d8c93273-ced1-4664-9585-47ce49a29326\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-mt6zh" Oct 08 22:01:19 crc kubenswrapper[4739]: I1008 22:01:19.834947 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwzp9\" (UniqueName: \"kubernetes.io/projected/d8c93273-ced1-4664-9585-47ce49a29326-kube-api-access-mwzp9\") pod \"nmstate-operator-858ddd8f98-mt6zh\" (UID: 
\"d8c93273-ced1-4664-9585-47ce49a29326\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-mt6zh" Oct 08 22:01:19 crc kubenswrapper[4739]: I1008 22:01:19.838438 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-mt6zh" Oct 08 22:01:20 crc kubenswrapper[4739]: I1008 22:01:20.236783 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-mt6zh"] Oct 08 22:01:20 crc kubenswrapper[4739]: W1008 22:01:20.242695 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8c93273_ced1_4664_9585_47ce49a29326.slice/crio-11b618cc2cbbe40ed09b15a34d3dfef46c842542ed42a597e4fd1044a7c97cba WatchSource:0}: Error finding container 11b618cc2cbbe40ed09b15a34d3dfef46c842542ed42a597e4fd1044a7c97cba: Status 404 returned error can't find the container with id 11b618cc2cbbe40ed09b15a34d3dfef46c842542ed42a597e4fd1044a7c97cba Oct 08 22:01:21 crc kubenswrapper[4739]: I1008 22:01:21.025941 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-mt6zh" event={"ID":"d8c93273-ced1-4664-9585-47ce49a29326","Type":"ContainerStarted","Data":"11b618cc2cbbe40ed09b15a34d3dfef46c842542ed42a597e4fd1044a7c97cba"} Oct 08 22:01:21 crc kubenswrapper[4739]: I1008 22:01:21.766579 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:01:21 crc kubenswrapper[4739]: I1008 22:01:21.766820 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:01:23 crc kubenswrapper[4739]: I1008 22:01:23.301070 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9x4km" Oct 08 22:01:23 crc kubenswrapper[4739]: I1008 22:01:23.301391 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9x4km" Oct 08 22:01:23 crc kubenswrapper[4739]: I1008 22:01:23.344983 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9x4km" Oct 08 22:01:24 crc kubenswrapper[4739]: I1008 22:01:24.052891 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-mt6zh" event={"ID":"d8c93273-ced1-4664-9585-47ce49a29326","Type":"ContainerStarted","Data":"0c927737a2013cd67fc578bbe34a712eee20d085d8cbe5c6bf2831b03585c031"} Oct 08 22:01:24 crc kubenswrapper[4739]: I1008 22:01:24.073099 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-mt6zh" podStartSLOduration=1.8472262590000001 podStartE2EDuration="5.073076036s" podCreationTimestamp="2025-10-08 22:01:19 +0000 UTC" firstStartedPulling="2025-10-08 22:01:20.244990534 +0000 UTC m=+780.070376284" lastFinishedPulling="2025-10-08 22:01:23.470840311 +0000 UTC m=+783.296226061" observedRunningTime="2025-10-08 22:01:24.071938388 +0000 UTC m=+783.897324158" watchObservedRunningTime="2025-10-08 22:01:24.073076036 +0000 UTC m=+783.898461806" Oct 08 22:01:24 crc kubenswrapper[4739]: I1008 22:01:24.124127 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9x4km" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.063482 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-pz8z2"] Oct 08 
22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.065663 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-pz8z2" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.067975 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-wl4cn" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.076591 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-pz8z2"] Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.093402 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bccvt\" (UniqueName: \"kubernetes.io/projected/ad883073-96a0-4558-9517-2f59f2e1472e-kube-api-access-bccvt\") pod \"nmstate-metrics-fdff9cb8d-pz8z2\" (UID: \"ad883073-96a0-4558-9517-2f59f2e1472e\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-pz8z2" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.105538 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-pjkfn"] Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.106345 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-pjkfn" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.112675 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.122505 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-lj8qk"] Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.123296 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-lj8qk" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.129322 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-pjkfn"] Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.189817 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-fcg68"] Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.191506 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fcg68" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.194019 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-z8sdt" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.194261 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.194391 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/64c2a4a6-267c-484e-b36f-95d7540531ef-dbus-socket\") pod \"nmstate-handler-lj8qk\" (UID: \"64c2a4a6-267c-484e-b36f-95d7540531ef\") " pod="openshift-nmstate/nmstate-handler-lj8qk" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.194443 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d26121b1-736e-49d5-9241-bd8e8e7706c5-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-fcg68\" (UID: \"d26121b1-736e-49d5-9241-bd8e8e7706c5\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fcg68" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.194466 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: 
\"kubernetes.io/host-path/64c2a4a6-267c-484e-b36f-95d7540531ef-nmstate-lock\") pod \"nmstate-handler-lj8qk\" (UID: \"64c2a4a6-267c-484e-b36f-95d7540531ef\") " pod="openshift-nmstate/nmstate-handler-lj8qk" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.194501 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bccvt\" (UniqueName: \"kubernetes.io/projected/ad883073-96a0-4558-9517-2f59f2e1472e-kube-api-access-bccvt\") pod \"nmstate-metrics-fdff9cb8d-pz8z2\" (UID: \"ad883073-96a0-4558-9517-2f59f2e1472e\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-pz8z2" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.194532 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv65j\" (UniqueName: \"kubernetes.io/projected/d26121b1-736e-49d5-9241-bd8e8e7706c5-kube-api-access-pv65j\") pod \"nmstate-console-plugin-6b874cbd85-fcg68\" (UID: \"d26121b1-736e-49d5-9241-bd8e8e7706c5\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fcg68" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.194551 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/64c2a4a6-267c-484e-b36f-95d7540531ef-ovs-socket\") pod \"nmstate-handler-lj8qk\" (UID: \"64c2a4a6-267c-484e-b36f-95d7540531ef\") " pod="openshift-nmstate/nmstate-handler-lj8qk" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.194570 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d26121b1-736e-49d5-9241-bd8e8e7706c5-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-fcg68\" (UID: \"d26121b1-736e-49d5-9241-bd8e8e7706c5\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fcg68" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.194590 4739 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwtc9\" (UniqueName: \"kubernetes.io/projected/dd0f36c5-926b-4678-b2d0-342a3f2f1d1f-kube-api-access-zwtc9\") pod \"nmstate-webhook-6cdbc54649-pjkfn\" (UID: \"dd0f36c5-926b-4678-b2d0-342a3f2f1d1f\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-pjkfn" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.194612 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/dd0f36c5-926b-4678-b2d0-342a3f2f1d1f-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-pjkfn\" (UID: \"dd0f36c5-926b-4678-b2d0-342a3f2f1d1f\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-pjkfn" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.194635 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vlbl\" (UniqueName: \"kubernetes.io/projected/64c2a4a6-267c-484e-b36f-95d7540531ef-kube-api-access-9vlbl\") pod \"nmstate-handler-lj8qk\" (UID: \"64c2a4a6-267c-484e-b36f-95d7540531ef\") " pod="openshift-nmstate/nmstate-handler-lj8qk" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.196349 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.238786 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-fcg68"] Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.272830 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bccvt\" (UniqueName: \"kubernetes.io/projected/ad883073-96a0-4558-9517-2f59f2e1472e-kube-api-access-bccvt\") pod \"nmstate-metrics-fdff9cb8d-pz8z2\" (UID: \"ad883073-96a0-4558-9517-2f59f2e1472e\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-pz8z2" Oct 08 22:01:25 crc 
kubenswrapper[4739]: I1008 22:01:25.295438 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv65j\" (UniqueName: \"kubernetes.io/projected/d26121b1-736e-49d5-9241-bd8e8e7706c5-kube-api-access-pv65j\") pod \"nmstate-console-plugin-6b874cbd85-fcg68\" (UID: \"d26121b1-736e-49d5-9241-bd8e8e7706c5\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fcg68" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.295497 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/64c2a4a6-267c-484e-b36f-95d7540531ef-ovs-socket\") pod \"nmstate-handler-lj8qk\" (UID: \"64c2a4a6-267c-484e-b36f-95d7540531ef\") " pod="openshift-nmstate/nmstate-handler-lj8qk" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.295516 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d26121b1-736e-49d5-9241-bd8e8e7706c5-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-fcg68\" (UID: \"d26121b1-736e-49d5-9241-bd8e8e7706c5\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fcg68" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.295532 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwtc9\" (UniqueName: \"kubernetes.io/projected/dd0f36c5-926b-4678-b2d0-342a3f2f1d1f-kube-api-access-zwtc9\") pod \"nmstate-webhook-6cdbc54649-pjkfn\" (UID: \"dd0f36c5-926b-4678-b2d0-342a3f2f1d1f\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-pjkfn" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.295559 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/dd0f36c5-926b-4678-b2d0-342a3f2f1d1f-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-pjkfn\" (UID: \"dd0f36c5-926b-4678-b2d0-342a3f2f1d1f\") " 
pod="openshift-nmstate/nmstate-webhook-6cdbc54649-pjkfn" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.295590 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vlbl\" (UniqueName: \"kubernetes.io/projected/64c2a4a6-267c-484e-b36f-95d7540531ef-kube-api-access-9vlbl\") pod \"nmstate-handler-lj8qk\" (UID: \"64c2a4a6-267c-484e-b36f-95d7540531ef\") " pod="openshift-nmstate/nmstate-handler-lj8qk" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.295606 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/64c2a4a6-267c-484e-b36f-95d7540531ef-dbus-socket\") pod \"nmstate-handler-lj8qk\" (UID: \"64c2a4a6-267c-484e-b36f-95d7540531ef\") " pod="openshift-nmstate/nmstate-handler-lj8qk" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.295630 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d26121b1-736e-49d5-9241-bd8e8e7706c5-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-fcg68\" (UID: \"d26121b1-736e-49d5-9241-bd8e8e7706c5\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fcg68" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.295654 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/64c2a4a6-267c-484e-b36f-95d7540531ef-nmstate-lock\") pod \"nmstate-handler-lj8qk\" (UID: \"64c2a4a6-267c-484e-b36f-95d7540531ef\") " pod="openshift-nmstate/nmstate-handler-lj8qk" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.295750 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/64c2a4a6-267c-484e-b36f-95d7540531ef-nmstate-lock\") pod \"nmstate-handler-lj8qk\" (UID: \"64c2a4a6-267c-484e-b36f-95d7540531ef\") " pod="openshift-nmstate/nmstate-handler-lj8qk" Oct 08 
22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.296581 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/64c2a4a6-267c-484e-b36f-95d7540531ef-ovs-socket\") pod \"nmstate-handler-lj8qk\" (UID: \"64c2a4a6-267c-484e-b36f-95d7540531ef\") " pod="openshift-nmstate/nmstate-handler-lj8qk" Oct 08 22:01:25 crc kubenswrapper[4739]: E1008 22:01:25.296650 4739 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Oct 08 22:01:25 crc kubenswrapper[4739]: E1008 22:01:25.296700 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d26121b1-736e-49d5-9241-bd8e8e7706c5-plugin-serving-cert podName:d26121b1-736e-49d5-9241-bd8e8e7706c5 nodeName:}" failed. No retries permitted until 2025-10-08 22:01:25.796681696 +0000 UTC m=+785.622067446 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/d26121b1-736e-49d5-9241-bd8e8e7706c5-plugin-serving-cert") pod "nmstate-console-plugin-6b874cbd85-fcg68" (UID: "d26121b1-736e-49d5-9241-bd8e8e7706c5") : secret "plugin-serving-cert" not found Oct 08 22:01:25 crc kubenswrapper[4739]: E1008 22:01:25.297003 4739 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Oct 08 22:01:25 crc kubenswrapper[4739]: E1008 22:01:25.297038 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd0f36c5-926b-4678-b2d0-342a3f2f1d1f-tls-key-pair podName:dd0f36c5-926b-4678-b2d0-342a3f2f1d1f nodeName:}" failed. No retries permitted until 2025-10-08 22:01:25.797028734 +0000 UTC m=+785.622414484 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/dd0f36c5-926b-4678-b2d0-342a3f2f1d1f-tls-key-pair") pod "nmstate-webhook-6cdbc54649-pjkfn" (UID: "dd0f36c5-926b-4678-b2d0-342a3f2f1d1f") : secret "openshift-nmstate-webhook" not found Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.298379 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d26121b1-736e-49d5-9241-bd8e8e7706c5-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-fcg68\" (UID: \"d26121b1-736e-49d5-9241-bd8e8e7706c5\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fcg68" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.299625 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/64c2a4a6-267c-484e-b36f-95d7540531ef-dbus-socket\") pod \"nmstate-handler-lj8qk\" (UID: \"64c2a4a6-267c-484e-b36f-95d7540531ef\") " pod="openshift-nmstate/nmstate-handler-lj8qk" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.317067 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vlbl\" (UniqueName: \"kubernetes.io/projected/64c2a4a6-267c-484e-b36f-95d7540531ef-kube-api-access-9vlbl\") pod \"nmstate-handler-lj8qk\" (UID: \"64c2a4a6-267c-484e-b36f-95d7540531ef\") " pod="openshift-nmstate/nmstate-handler-lj8qk" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.321851 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwtc9\" (UniqueName: \"kubernetes.io/projected/dd0f36c5-926b-4678-b2d0-342a3f2f1d1f-kube-api-access-zwtc9\") pod \"nmstate-webhook-6cdbc54649-pjkfn\" (UID: \"dd0f36c5-926b-4678-b2d0-342a3f2f1d1f\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-pjkfn" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.321941 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-pv65j\" (UniqueName: \"kubernetes.io/projected/d26121b1-736e-49d5-9241-bd8e8e7706c5-kube-api-access-pv65j\") pod \"nmstate-console-plugin-6b874cbd85-fcg68\" (UID: \"d26121b1-736e-49d5-9241-bd8e8e7706c5\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fcg68" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.380975 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-bb8cc4cf6-w4mnx"] Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.381693 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bb8cc4cf6-w4mnx" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.383284 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-pz8z2" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.438736 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bb8cc4cf6-w4mnx"] Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.440981 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-lj8qk" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.498307 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/432f4bd5-b473-438c-98b7-cd5dd5a15491-service-ca\") pod \"console-bb8cc4cf6-w4mnx\" (UID: \"432f4bd5-b473-438c-98b7-cd5dd5a15491\") " pod="openshift-console/console-bb8cc4cf6-w4mnx" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.498586 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwcmr\" (UniqueName: \"kubernetes.io/projected/432f4bd5-b473-438c-98b7-cd5dd5a15491-kube-api-access-vwcmr\") pod \"console-bb8cc4cf6-w4mnx\" (UID: \"432f4bd5-b473-438c-98b7-cd5dd5a15491\") " pod="openshift-console/console-bb8cc4cf6-w4mnx" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.498620 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/432f4bd5-b473-438c-98b7-cd5dd5a15491-oauth-serving-cert\") pod \"console-bb8cc4cf6-w4mnx\" (UID: \"432f4bd5-b473-438c-98b7-cd5dd5a15491\") " pod="openshift-console/console-bb8cc4cf6-w4mnx" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.498636 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/432f4bd5-b473-438c-98b7-cd5dd5a15491-console-config\") pod \"console-bb8cc4cf6-w4mnx\" (UID: \"432f4bd5-b473-438c-98b7-cd5dd5a15491\") " pod="openshift-console/console-bb8cc4cf6-w4mnx" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.498666 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/432f4bd5-b473-438c-98b7-cd5dd5a15491-console-serving-cert\") pod 
\"console-bb8cc4cf6-w4mnx\" (UID: \"432f4bd5-b473-438c-98b7-cd5dd5a15491\") " pod="openshift-console/console-bb8cc4cf6-w4mnx" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.498686 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/432f4bd5-b473-438c-98b7-cd5dd5a15491-trusted-ca-bundle\") pod \"console-bb8cc4cf6-w4mnx\" (UID: \"432f4bd5-b473-438c-98b7-cd5dd5a15491\") " pod="openshift-console/console-bb8cc4cf6-w4mnx" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.498726 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/432f4bd5-b473-438c-98b7-cd5dd5a15491-console-oauth-config\") pod \"console-bb8cc4cf6-w4mnx\" (UID: \"432f4bd5-b473-438c-98b7-cd5dd5a15491\") " pod="openshift-console/console-bb8cc4cf6-w4mnx" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.599459 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/432f4bd5-b473-438c-98b7-cd5dd5a15491-service-ca\") pod \"console-bb8cc4cf6-w4mnx\" (UID: \"432f4bd5-b473-438c-98b7-cd5dd5a15491\") " pod="openshift-console/console-bb8cc4cf6-w4mnx" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.599495 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwcmr\" (UniqueName: \"kubernetes.io/projected/432f4bd5-b473-438c-98b7-cd5dd5a15491-kube-api-access-vwcmr\") pod \"console-bb8cc4cf6-w4mnx\" (UID: \"432f4bd5-b473-438c-98b7-cd5dd5a15491\") " pod="openshift-console/console-bb8cc4cf6-w4mnx" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.599525 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/432f4bd5-b473-438c-98b7-cd5dd5a15491-oauth-serving-cert\") 
pod \"console-bb8cc4cf6-w4mnx\" (UID: \"432f4bd5-b473-438c-98b7-cd5dd5a15491\") " pod="openshift-console/console-bb8cc4cf6-w4mnx" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.599541 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/432f4bd5-b473-438c-98b7-cd5dd5a15491-console-config\") pod \"console-bb8cc4cf6-w4mnx\" (UID: \"432f4bd5-b473-438c-98b7-cd5dd5a15491\") " pod="openshift-console/console-bb8cc4cf6-w4mnx" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.599567 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/432f4bd5-b473-438c-98b7-cd5dd5a15491-console-serving-cert\") pod \"console-bb8cc4cf6-w4mnx\" (UID: \"432f4bd5-b473-438c-98b7-cd5dd5a15491\") " pod="openshift-console/console-bb8cc4cf6-w4mnx" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.599586 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/432f4bd5-b473-438c-98b7-cd5dd5a15491-trusted-ca-bundle\") pod \"console-bb8cc4cf6-w4mnx\" (UID: \"432f4bd5-b473-438c-98b7-cd5dd5a15491\") " pod="openshift-console/console-bb8cc4cf6-w4mnx" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.599617 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/432f4bd5-b473-438c-98b7-cd5dd5a15491-console-oauth-config\") pod \"console-bb8cc4cf6-w4mnx\" (UID: \"432f4bd5-b473-438c-98b7-cd5dd5a15491\") " pod="openshift-console/console-bb8cc4cf6-w4mnx" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.600376 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/432f4bd5-b473-438c-98b7-cd5dd5a15491-service-ca\") pod \"console-bb8cc4cf6-w4mnx\" (UID: 
\"432f4bd5-b473-438c-98b7-cd5dd5a15491\") " pod="openshift-console/console-bb8cc4cf6-w4mnx" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.600596 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/432f4bd5-b473-438c-98b7-cd5dd5a15491-console-config\") pod \"console-bb8cc4cf6-w4mnx\" (UID: \"432f4bd5-b473-438c-98b7-cd5dd5a15491\") " pod="openshift-console/console-bb8cc4cf6-w4mnx" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.600665 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/432f4bd5-b473-438c-98b7-cd5dd5a15491-oauth-serving-cert\") pod \"console-bb8cc4cf6-w4mnx\" (UID: \"432f4bd5-b473-438c-98b7-cd5dd5a15491\") " pod="openshift-console/console-bb8cc4cf6-w4mnx" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.601628 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/432f4bd5-b473-438c-98b7-cd5dd5a15491-trusted-ca-bundle\") pod \"console-bb8cc4cf6-w4mnx\" (UID: \"432f4bd5-b473-438c-98b7-cd5dd5a15491\") " pod="openshift-console/console-bb8cc4cf6-w4mnx" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.604981 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/432f4bd5-b473-438c-98b7-cd5dd5a15491-console-oauth-config\") pod \"console-bb8cc4cf6-w4mnx\" (UID: \"432f4bd5-b473-438c-98b7-cd5dd5a15491\") " pod="openshift-console/console-bb8cc4cf6-w4mnx" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.605599 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/432f4bd5-b473-438c-98b7-cd5dd5a15491-console-serving-cert\") pod \"console-bb8cc4cf6-w4mnx\" (UID: \"432f4bd5-b473-438c-98b7-cd5dd5a15491\") " 
pod="openshift-console/console-bb8cc4cf6-w4mnx" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.616843 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwcmr\" (UniqueName: \"kubernetes.io/projected/432f4bd5-b473-438c-98b7-cd5dd5a15491-kube-api-access-vwcmr\") pod \"console-bb8cc4cf6-w4mnx\" (UID: \"432f4bd5-b473-438c-98b7-cd5dd5a15491\") " pod="openshift-console/console-bb8cc4cf6-w4mnx" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.701324 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bb8cc4cf6-w4mnx" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.760920 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9x4km"] Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.802680 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d26121b1-736e-49d5-9241-bd8e8e7706c5-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-fcg68\" (UID: \"d26121b1-736e-49d5-9241-bd8e8e7706c5\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fcg68" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.802763 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/dd0f36c5-926b-4678-b2d0-342a3f2f1d1f-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-pjkfn\" (UID: \"dd0f36c5-926b-4678-b2d0-342a3f2f1d1f\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-pjkfn" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.812051 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d26121b1-736e-49d5-9241-bd8e8e7706c5-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-fcg68\" (UID: \"d26121b1-736e-49d5-9241-bd8e8e7706c5\") " 
pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fcg68" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.812859 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/dd0f36c5-926b-4678-b2d0-342a3f2f1d1f-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-pjkfn\" (UID: \"dd0f36c5-926b-4678-b2d0-342a3f2f1d1f\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-pjkfn" Oct 08 22:01:25 crc kubenswrapper[4739]: I1008 22:01:25.816450 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-pz8z2"] Oct 08 22:01:25 crc kubenswrapper[4739]: W1008 22:01:25.821543 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad883073_96a0_4558_9517_2f59f2e1472e.slice/crio-3e881bd73ef352bd3e6537ffa67780df7394817ca7763dc4b176c5a4c26ad5c6 WatchSource:0}: Error finding container 3e881bd73ef352bd3e6537ffa67780df7394817ca7763dc4b176c5a4c26ad5c6: Status 404 returned error can't find the container with id 3e881bd73ef352bd3e6537ffa67780df7394817ca7763dc4b176c5a4c26ad5c6 Oct 08 22:01:26 crc kubenswrapper[4739]: I1008 22:01:26.021076 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-pjkfn" Oct 08 22:01:26 crc kubenswrapper[4739]: I1008 22:01:26.070355 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-pz8z2" event={"ID":"ad883073-96a0-4558-9517-2f59f2e1472e","Type":"ContainerStarted","Data":"3e881bd73ef352bd3e6537ffa67780df7394817ca7763dc4b176c5a4c26ad5c6"} Oct 08 22:01:26 crc kubenswrapper[4739]: I1008 22:01:26.073661 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-lj8qk" event={"ID":"64c2a4a6-267c-484e-b36f-95d7540531ef","Type":"ContainerStarted","Data":"dde5964ed4f0e0f7e17c15c639fa6b557a952b0669170ee24afc35ba528b4cc0"} Oct 08 22:01:26 crc kubenswrapper[4739]: I1008 22:01:26.074032 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9x4km" podUID="a98c78af-858f-4d7b-92a3-17f57764c156" containerName="registry-server" containerID="cri-o://04cf24130eccc8f691b40031d0771f23dccc8682582969dc4f2fe786c68db7ec" gracePeriod=2 Oct 08 22:01:26 crc kubenswrapper[4739]: I1008 22:01:26.104067 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fcg68" Oct 08 22:01:26 crc kubenswrapper[4739]: I1008 22:01:26.160476 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bb8cc4cf6-w4mnx"] Oct 08 22:01:26 crc kubenswrapper[4739]: I1008 22:01:26.500741 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-pjkfn"] Oct 08 22:01:26 crc kubenswrapper[4739]: W1008 22:01:26.508970 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd0f36c5_926b_4678_b2d0_342a3f2f1d1f.slice/crio-c389c70cf25531821ea451b3fc7e59d419167225d833855db9da6f5c16b11ebb WatchSource:0}: Error finding container c389c70cf25531821ea451b3fc7e59d419167225d833855db9da6f5c16b11ebb: Status 404 returned error can't find the container with id c389c70cf25531821ea451b3fc7e59d419167225d833855db9da6f5c16b11ebb Oct 08 22:01:26 crc kubenswrapper[4739]: I1008 22:01:26.568105 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9x4km" Oct 08 22:01:26 crc kubenswrapper[4739]: I1008 22:01:26.643042 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-fcg68"] Oct 08 22:01:26 crc kubenswrapper[4739]: W1008 22:01:26.646497 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd26121b1_736e_49d5_9241_bd8e8e7706c5.slice/crio-7d515ef6cd63a4238595f8a1c26eae6f0c03ab1b95de35fc18d8f02cc628524c WatchSource:0}: Error finding container 7d515ef6cd63a4238595f8a1c26eae6f0c03ab1b95de35fc18d8f02cc628524c: Status 404 returned error can't find the container with id 7d515ef6cd63a4238595f8a1c26eae6f0c03ab1b95de35fc18d8f02cc628524c Oct 08 22:01:26 crc kubenswrapper[4739]: I1008 22:01:26.716470 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a98c78af-858f-4d7b-92a3-17f57764c156-catalog-content\") pod \"a98c78af-858f-4d7b-92a3-17f57764c156\" (UID: \"a98c78af-858f-4d7b-92a3-17f57764c156\") " Oct 08 22:01:26 crc kubenswrapper[4739]: I1008 22:01:26.716643 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a98c78af-858f-4d7b-92a3-17f57764c156-utilities\") pod \"a98c78af-858f-4d7b-92a3-17f57764c156\" (UID: \"a98c78af-858f-4d7b-92a3-17f57764c156\") " Oct 08 22:01:26 crc kubenswrapper[4739]: I1008 22:01:26.716681 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzfxb\" (UniqueName: \"kubernetes.io/projected/a98c78af-858f-4d7b-92a3-17f57764c156-kube-api-access-fzfxb\") pod \"a98c78af-858f-4d7b-92a3-17f57764c156\" (UID: \"a98c78af-858f-4d7b-92a3-17f57764c156\") " Oct 08 22:01:26 crc kubenswrapper[4739]: I1008 22:01:26.717667 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/a98c78af-858f-4d7b-92a3-17f57764c156-utilities" (OuterVolumeSpecName: "utilities") pod "a98c78af-858f-4d7b-92a3-17f57764c156" (UID: "a98c78af-858f-4d7b-92a3-17f57764c156"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:01:26 crc kubenswrapper[4739]: I1008 22:01:26.722941 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a98c78af-858f-4d7b-92a3-17f57764c156-kube-api-access-fzfxb" (OuterVolumeSpecName: "kube-api-access-fzfxb") pod "a98c78af-858f-4d7b-92a3-17f57764c156" (UID: "a98c78af-858f-4d7b-92a3-17f57764c156"). InnerVolumeSpecName "kube-api-access-fzfxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:01:26 crc kubenswrapper[4739]: I1008 22:01:26.793702 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a98c78af-858f-4d7b-92a3-17f57764c156-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a98c78af-858f-4d7b-92a3-17f57764c156" (UID: "a98c78af-858f-4d7b-92a3-17f57764c156"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:01:26 crc kubenswrapper[4739]: I1008 22:01:26.817396 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a98c78af-858f-4d7b-92a3-17f57764c156-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:01:26 crc kubenswrapper[4739]: I1008 22:01:26.817419 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a98c78af-858f-4d7b-92a3-17f57764c156-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:01:26 crc kubenswrapper[4739]: I1008 22:01:26.817429 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzfxb\" (UniqueName: \"kubernetes.io/projected/a98c78af-858f-4d7b-92a3-17f57764c156-kube-api-access-fzfxb\") on node \"crc\" DevicePath \"\"" Oct 08 22:01:27 crc kubenswrapper[4739]: I1008 22:01:27.083348 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bb8cc4cf6-w4mnx" event={"ID":"432f4bd5-b473-438c-98b7-cd5dd5a15491","Type":"ContainerStarted","Data":"42885bb68cd26634aab1f50f7a6ce4aa05f5bc0501fe281ff8e135ee85029163"} Oct 08 22:01:27 crc kubenswrapper[4739]: I1008 22:01:27.083401 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bb8cc4cf6-w4mnx" event={"ID":"432f4bd5-b473-438c-98b7-cd5dd5a15491","Type":"ContainerStarted","Data":"120e60819486925414bfd8f4491277b3c5ce377f8c40a0ce19228bbc395f0020"} Oct 08 22:01:27 crc kubenswrapper[4739]: I1008 22:01:27.085289 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-pjkfn" event={"ID":"dd0f36c5-926b-4678-b2d0-342a3f2f1d1f","Type":"ContainerStarted","Data":"c389c70cf25531821ea451b3fc7e59d419167225d833855db9da6f5c16b11ebb"} Oct 08 22:01:27 crc kubenswrapper[4739]: I1008 22:01:27.089137 4739 generic.go:334] "Generic (PLEG): container finished" podID="a98c78af-858f-4d7b-92a3-17f57764c156" 
containerID="04cf24130eccc8f691b40031d0771f23dccc8682582969dc4f2fe786c68db7ec" exitCode=0 Oct 08 22:01:27 crc kubenswrapper[4739]: I1008 22:01:27.089180 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9x4km" Oct 08 22:01:27 crc kubenswrapper[4739]: I1008 22:01:27.089187 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9x4km" event={"ID":"a98c78af-858f-4d7b-92a3-17f57764c156","Type":"ContainerDied","Data":"04cf24130eccc8f691b40031d0771f23dccc8682582969dc4f2fe786c68db7ec"} Oct 08 22:01:27 crc kubenswrapper[4739]: I1008 22:01:27.089384 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9x4km" event={"ID":"a98c78af-858f-4d7b-92a3-17f57764c156","Type":"ContainerDied","Data":"930fdb2bc704bad98713986a3a6faa0ce4c603dcb80241b41bbf42159b254bc4"} Oct 08 22:01:27 crc kubenswrapper[4739]: I1008 22:01:27.089407 4739 scope.go:117] "RemoveContainer" containerID="04cf24130eccc8f691b40031d0771f23dccc8682582969dc4f2fe786c68db7ec" Oct 08 22:01:27 crc kubenswrapper[4739]: I1008 22:01:27.090416 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fcg68" event={"ID":"d26121b1-736e-49d5-9241-bd8e8e7706c5","Type":"ContainerStarted","Data":"7d515ef6cd63a4238595f8a1c26eae6f0c03ab1b95de35fc18d8f02cc628524c"} Oct 08 22:01:27 crc kubenswrapper[4739]: I1008 22:01:27.109430 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-bb8cc4cf6-w4mnx" podStartSLOduration=2.10940829 podStartE2EDuration="2.10940829s" podCreationTimestamp="2025-10-08 22:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:01:27.097904818 +0000 UTC m=+786.923290588" watchObservedRunningTime="2025-10-08 22:01:27.10940829 +0000 UTC 
m=+786.934794040" Oct 08 22:01:27 crc kubenswrapper[4739]: I1008 22:01:27.119068 4739 scope.go:117] "RemoveContainer" containerID="fce18f04e3eced98470aa09ceebaec000d21293ae95569d7ca579c8657792695" Oct 08 22:01:27 crc kubenswrapper[4739]: I1008 22:01:27.120113 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9x4km"] Oct 08 22:01:27 crc kubenswrapper[4739]: I1008 22:01:27.122473 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9x4km"] Oct 08 22:01:27 crc kubenswrapper[4739]: I1008 22:01:27.132641 4739 scope.go:117] "RemoveContainer" containerID="20234d3b258cc5c6c5128ce90f17e30079a37151257e78b3b93e57d8ed7e3e8c" Oct 08 22:01:27 crc kubenswrapper[4739]: I1008 22:01:27.146610 4739 scope.go:117] "RemoveContainer" containerID="04cf24130eccc8f691b40031d0771f23dccc8682582969dc4f2fe786c68db7ec" Oct 08 22:01:27 crc kubenswrapper[4739]: E1008 22:01:27.147554 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04cf24130eccc8f691b40031d0771f23dccc8682582969dc4f2fe786c68db7ec\": container with ID starting with 04cf24130eccc8f691b40031d0771f23dccc8682582969dc4f2fe786c68db7ec not found: ID does not exist" containerID="04cf24130eccc8f691b40031d0771f23dccc8682582969dc4f2fe786c68db7ec" Oct 08 22:01:27 crc kubenswrapper[4739]: I1008 22:01:27.147651 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04cf24130eccc8f691b40031d0771f23dccc8682582969dc4f2fe786c68db7ec"} err="failed to get container status \"04cf24130eccc8f691b40031d0771f23dccc8682582969dc4f2fe786c68db7ec\": rpc error: code = NotFound desc = could not find container \"04cf24130eccc8f691b40031d0771f23dccc8682582969dc4f2fe786c68db7ec\": container with ID starting with 04cf24130eccc8f691b40031d0771f23dccc8682582969dc4f2fe786c68db7ec not found: ID does not exist" Oct 08 22:01:27 crc kubenswrapper[4739]: I1008 
22:01:27.147735 4739 scope.go:117] "RemoveContainer" containerID="fce18f04e3eced98470aa09ceebaec000d21293ae95569d7ca579c8657792695" Oct 08 22:01:27 crc kubenswrapper[4739]: E1008 22:01:27.148056 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fce18f04e3eced98470aa09ceebaec000d21293ae95569d7ca579c8657792695\": container with ID starting with fce18f04e3eced98470aa09ceebaec000d21293ae95569d7ca579c8657792695 not found: ID does not exist" containerID="fce18f04e3eced98470aa09ceebaec000d21293ae95569d7ca579c8657792695" Oct 08 22:01:27 crc kubenswrapper[4739]: I1008 22:01:27.148086 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fce18f04e3eced98470aa09ceebaec000d21293ae95569d7ca579c8657792695"} err="failed to get container status \"fce18f04e3eced98470aa09ceebaec000d21293ae95569d7ca579c8657792695\": rpc error: code = NotFound desc = could not find container \"fce18f04e3eced98470aa09ceebaec000d21293ae95569d7ca579c8657792695\": container with ID starting with fce18f04e3eced98470aa09ceebaec000d21293ae95569d7ca579c8657792695 not found: ID does not exist" Oct 08 22:01:27 crc kubenswrapper[4739]: I1008 22:01:27.148109 4739 scope.go:117] "RemoveContainer" containerID="20234d3b258cc5c6c5128ce90f17e30079a37151257e78b3b93e57d8ed7e3e8c" Oct 08 22:01:27 crc kubenswrapper[4739]: E1008 22:01:27.148334 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20234d3b258cc5c6c5128ce90f17e30079a37151257e78b3b93e57d8ed7e3e8c\": container with ID starting with 20234d3b258cc5c6c5128ce90f17e30079a37151257e78b3b93e57d8ed7e3e8c not found: ID does not exist" containerID="20234d3b258cc5c6c5128ce90f17e30079a37151257e78b3b93e57d8ed7e3e8c" Oct 08 22:01:27 crc kubenswrapper[4739]: I1008 22:01:27.148372 4739 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"20234d3b258cc5c6c5128ce90f17e30079a37151257e78b3b93e57d8ed7e3e8c"} err="failed to get container status \"20234d3b258cc5c6c5128ce90f17e30079a37151257e78b3b93e57d8ed7e3e8c\": rpc error: code = NotFound desc = could not find container \"20234d3b258cc5c6c5128ce90f17e30079a37151257e78b3b93e57d8ed7e3e8c\": container with ID starting with 20234d3b258cc5c6c5128ce90f17e30079a37151257e78b3b93e57d8ed7e3e8c not found: ID does not exist" Oct 08 22:01:27 crc kubenswrapper[4739]: I1008 22:01:27.834559 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a98c78af-858f-4d7b-92a3-17f57764c156" path="/var/lib/kubelet/pods/a98c78af-858f-4d7b-92a3-17f57764c156/volumes" Oct 08 22:01:29 crc kubenswrapper[4739]: I1008 22:01:29.112908 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fcg68" event={"ID":"d26121b1-736e-49d5-9241-bd8e8e7706c5","Type":"ContainerStarted","Data":"0dc372e523264cab26e0052d17b742b61efdcf5f9161d82277d052f63ef06f9e"} Oct 08 22:01:29 crc kubenswrapper[4739]: I1008 22:01:29.115827 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-pz8z2" event={"ID":"ad883073-96a0-4558-9517-2f59f2e1472e","Type":"ContainerStarted","Data":"cf965a7473d63407b9c3454946f06055854df948d88eb2d0a99f96eba2a850fe"} Oct 08 22:01:29 crc kubenswrapper[4739]: I1008 22:01:29.117419 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-lj8qk" event={"ID":"64c2a4a6-267c-484e-b36f-95d7540531ef","Type":"ContainerStarted","Data":"66501f77879170a61d85944c826240c975090dcc93eefe41a90c10c1e5a6a8d0"} Oct 08 22:01:29 crc kubenswrapper[4739]: I1008 22:01:29.117554 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-lj8qk" Oct 08 22:01:29 crc kubenswrapper[4739]: I1008 22:01:29.118691 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-webhook-6cdbc54649-pjkfn" event={"ID":"dd0f36c5-926b-4678-b2d0-342a3f2f1d1f","Type":"ContainerStarted","Data":"66ccea2ca3e3b8f4d6df9cd333c84126c683d38e776184001ee182b51446d509"} Oct 08 22:01:29 crc kubenswrapper[4739]: I1008 22:01:29.118855 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-pjkfn" Oct 08 22:01:29 crc kubenswrapper[4739]: I1008 22:01:29.130834 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fcg68" podStartSLOduration=1.855173511 podStartE2EDuration="4.130808057s" podCreationTimestamp="2025-10-08 22:01:25 +0000 UTC" firstStartedPulling="2025-10-08 22:01:26.648889736 +0000 UTC m=+786.474275486" lastFinishedPulling="2025-10-08 22:01:28.924524282 +0000 UTC m=+788.749910032" observedRunningTime="2025-10-08 22:01:29.127387632 +0000 UTC m=+788.952773392" watchObservedRunningTime="2025-10-08 22:01:29.130808057 +0000 UTC m=+788.956193807" Oct 08 22:01:29 crc kubenswrapper[4739]: I1008 22:01:29.151089 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-pjkfn" podStartSLOduration=1.7481144180000001 podStartE2EDuration="4.151069183s" podCreationTimestamp="2025-10-08 22:01:25 +0000 UTC" firstStartedPulling="2025-10-08 22:01:26.511755437 +0000 UTC m=+786.337141187" lastFinishedPulling="2025-10-08 22:01:28.914710202 +0000 UTC m=+788.740095952" observedRunningTime="2025-10-08 22:01:29.14480295 +0000 UTC m=+788.970188700" watchObservedRunningTime="2025-10-08 22:01:29.151069183 +0000 UTC m=+788.976454933" Oct 08 22:01:29 crc kubenswrapper[4739]: I1008 22:01:29.157457 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-lj8qk" podStartSLOduration=1.536603406 podStartE2EDuration="4.157435949s" podCreationTimestamp="2025-10-08 22:01:25 +0000 UTC" 
firstStartedPulling="2025-10-08 22:01:25.461528645 +0000 UTC m=+785.286914395" lastFinishedPulling="2025-10-08 22:01:28.082361188 +0000 UTC m=+787.907746938" observedRunningTime="2025-10-08 22:01:29.156087366 +0000 UTC m=+788.981473116" watchObservedRunningTime="2025-10-08 22:01:29.157435949 +0000 UTC m=+788.982821699" Oct 08 22:01:32 crc kubenswrapper[4739]: I1008 22:01:32.135716 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-pz8z2" event={"ID":"ad883073-96a0-4558-9517-2f59f2e1472e","Type":"ContainerStarted","Data":"8fe89a756adfc908c71a26f0e81717cf67798d16b9e14361f128ea1dfa454066"} Oct 08 22:01:32 crc kubenswrapper[4739]: I1008 22:01:32.157124 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-pz8z2" podStartSLOduration=1.7185344809999998 podStartE2EDuration="7.157109434s" podCreationTimestamp="2025-10-08 22:01:25 +0000 UTC" firstStartedPulling="2025-10-08 22:01:25.823299408 +0000 UTC m=+785.648685158" lastFinishedPulling="2025-10-08 22:01:31.261874351 +0000 UTC m=+791.087260111" observedRunningTime="2025-10-08 22:01:32.155133876 +0000 UTC m=+791.980519626" watchObservedRunningTime="2025-10-08 22:01:32.157109434 +0000 UTC m=+791.982495174" Oct 08 22:01:35 crc kubenswrapper[4739]: I1008 22:01:35.469629 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-lj8qk" Oct 08 22:01:35 crc kubenswrapper[4739]: I1008 22:01:35.701930 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-bb8cc4cf6-w4mnx" Oct 08 22:01:35 crc kubenswrapper[4739]: I1008 22:01:35.702012 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-bb8cc4cf6-w4mnx" Oct 08 22:01:35 crc kubenswrapper[4739]: I1008 22:01:35.710039 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-console/console-bb8cc4cf6-w4mnx" Oct 08 22:01:36 crc kubenswrapper[4739]: I1008 22:01:36.169557 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-bb8cc4cf6-w4mnx" Oct 08 22:01:36 crc kubenswrapper[4739]: I1008 22:01:36.247189 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-n8lrt"] Oct 08 22:01:46 crc kubenswrapper[4739]: I1008 22:01:46.030375 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-pjkfn" Oct 08 22:01:51 crc kubenswrapper[4739]: I1008 22:01:51.766640 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:01:51 crc kubenswrapper[4739]: I1008 22:01:51.767090 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:01:59 crc kubenswrapper[4739]: I1008 22:01:59.353695 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tpg9r"] Oct 08 22:01:59 crc kubenswrapper[4739]: E1008 22:01:59.354826 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a98c78af-858f-4d7b-92a3-17f57764c156" containerName="extract-utilities" Oct 08 22:01:59 crc kubenswrapper[4739]: I1008 22:01:59.354849 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="a98c78af-858f-4d7b-92a3-17f57764c156" containerName="extract-utilities" Oct 08 22:01:59 crc kubenswrapper[4739]: E1008 22:01:59.354892 4739 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a98c78af-858f-4d7b-92a3-17f57764c156" containerName="extract-content" Oct 08 22:01:59 crc kubenswrapper[4739]: I1008 22:01:59.354928 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="a98c78af-858f-4d7b-92a3-17f57764c156" containerName="extract-content" Oct 08 22:01:59 crc kubenswrapper[4739]: E1008 22:01:59.354947 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a98c78af-858f-4d7b-92a3-17f57764c156" containerName="registry-server" Oct 08 22:01:59 crc kubenswrapper[4739]: I1008 22:01:59.354961 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="a98c78af-858f-4d7b-92a3-17f57764c156" containerName="registry-server" Oct 08 22:01:59 crc kubenswrapper[4739]: I1008 22:01:59.355255 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="a98c78af-858f-4d7b-92a3-17f57764c156" containerName="registry-server" Oct 08 22:01:59 crc kubenswrapper[4739]: I1008 22:01:59.356805 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tpg9r"] Oct 08 22:01:59 crc kubenswrapper[4739]: I1008 22:01:59.356964 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tpg9r" Oct 08 22:01:59 crc kubenswrapper[4739]: I1008 22:01:59.470857 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfbd1436-a8a5-40a7-bc43-abebb26fdfa6-utilities\") pod \"certified-operators-tpg9r\" (UID: \"bfbd1436-a8a5-40a7-bc43-abebb26fdfa6\") " pod="openshift-marketplace/certified-operators-tpg9r" Oct 08 22:01:59 crc kubenswrapper[4739]: I1008 22:01:59.470915 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlstv\" (UniqueName: \"kubernetes.io/projected/bfbd1436-a8a5-40a7-bc43-abebb26fdfa6-kube-api-access-qlstv\") pod \"certified-operators-tpg9r\" (UID: \"bfbd1436-a8a5-40a7-bc43-abebb26fdfa6\") " pod="openshift-marketplace/certified-operators-tpg9r" Oct 08 22:01:59 crc kubenswrapper[4739]: I1008 22:01:59.470949 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfbd1436-a8a5-40a7-bc43-abebb26fdfa6-catalog-content\") pod \"certified-operators-tpg9r\" (UID: \"bfbd1436-a8a5-40a7-bc43-abebb26fdfa6\") " pod="openshift-marketplace/certified-operators-tpg9r" Oct 08 22:01:59 crc kubenswrapper[4739]: I1008 22:01:59.572293 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfbd1436-a8a5-40a7-bc43-abebb26fdfa6-utilities\") pod \"certified-operators-tpg9r\" (UID: \"bfbd1436-a8a5-40a7-bc43-abebb26fdfa6\") " pod="openshift-marketplace/certified-operators-tpg9r" Oct 08 22:01:59 crc kubenswrapper[4739]: I1008 22:01:59.572477 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlstv\" (UniqueName: \"kubernetes.io/projected/bfbd1436-a8a5-40a7-bc43-abebb26fdfa6-kube-api-access-qlstv\") pod 
\"certified-operators-tpg9r\" (UID: \"bfbd1436-a8a5-40a7-bc43-abebb26fdfa6\") " pod="openshift-marketplace/certified-operators-tpg9r" Oct 08 22:01:59 crc kubenswrapper[4739]: I1008 22:01:59.572594 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfbd1436-a8a5-40a7-bc43-abebb26fdfa6-catalog-content\") pod \"certified-operators-tpg9r\" (UID: \"bfbd1436-a8a5-40a7-bc43-abebb26fdfa6\") " pod="openshift-marketplace/certified-operators-tpg9r" Oct 08 22:01:59 crc kubenswrapper[4739]: I1008 22:01:59.573396 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfbd1436-a8a5-40a7-bc43-abebb26fdfa6-utilities\") pod \"certified-operators-tpg9r\" (UID: \"bfbd1436-a8a5-40a7-bc43-abebb26fdfa6\") " pod="openshift-marketplace/certified-operators-tpg9r" Oct 08 22:01:59 crc kubenswrapper[4739]: I1008 22:01:59.573660 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfbd1436-a8a5-40a7-bc43-abebb26fdfa6-catalog-content\") pod \"certified-operators-tpg9r\" (UID: \"bfbd1436-a8a5-40a7-bc43-abebb26fdfa6\") " pod="openshift-marketplace/certified-operators-tpg9r" Oct 08 22:01:59 crc kubenswrapper[4739]: I1008 22:01:59.592556 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlstv\" (UniqueName: \"kubernetes.io/projected/bfbd1436-a8a5-40a7-bc43-abebb26fdfa6-kube-api-access-qlstv\") pod \"certified-operators-tpg9r\" (UID: \"bfbd1436-a8a5-40a7-bc43-abebb26fdfa6\") " pod="openshift-marketplace/certified-operators-tpg9r" Oct 08 22:01:59 crc kubenswrapper[4739]: I1008 22:01:59.692134 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tpg9r" Oct 08 22:02:00 crc kubenswrapper[4739]: I1008 22:02:00.153516 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tpg9r"] Oct 08 22:02:00 crc kubenswrapper[4739]: I1008 22:02:00.329006 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tpg9r" event={"ID":"bfbd1436-a8a5-40a7-bc43-abebb26fdfa6","Type":"ContainerStarted","Data":"b7af86fb14723d6e35326448a78113122e02f35d14da78b1ddcaa76c2e9fae99"} Oct 08 22:02:01 crc kubenswrapper[4739]: I1008 22:02:01.292432 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-n8lrt" podUID="e3d665b0-ab57-47b7-9a58-9c6c150d6105" containerName="console" containerID="cri-o://9944988e55a1de20cf7da1af5838a1cb5af1f738fbd17aa34eec263e0958729f" gracePeriod=15 Oct 08 22:02:01 crc kubenswrapper[4739]: I1008 22:02:01.337343 4739 generic.go:334] "Generic (PLEG): container finished" podID="bfbd1436-a8a5-40a7-bc43-abebb26fdfa6" containerID="fc20e2c9b0c1d839b9b987179807b229509158e2735840d49e13df40f9a87cbe" exitCode=0 Oct 08 22:02:01 crc kubenswrapper[4739]: I1008 22:02:01.337430 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tpg9r" event={"ID":"bfbd1436-a8a5-40a7-bc43-abebb26fdfa6","Type":"ContainerDied","Data":"fc20e2c9b0c1d839b9b987179807b229509158e2735840d49e13df40f9a87cbe"} Oct 08 22:02:01 crc kubenswrapper[4739]: I1008 22:02:01.712347 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-n8lrt_e3d665b0-ab57-47b7-9a58-9c6c150d6105/console/0.log" Oct 08 22:02:01 crc kubenswrapper[4739]: I1008 22:02:01.713265 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-n8lrt" Oct 08 22:02:01 crc kubenswrapper[4739]: I1008 22:02:01.802271 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e3d665b0-ab57-47b7-9a58-9c6c150d6105-service-ca\") pod \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\" (UID: \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\") " Oct 08 22:02:01 crc kubenswrapper[4739]: I1008 22:02:01.802360 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3d665b0-ab57-47b7-9a58-9c6c150d6105-trusted-ca-bundle\") pod \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\" (UID: \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\") " Oct 08 22:02:01 crc kubenswrapper[4739]: I1008 22:02:01.802386 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e3d665b0-ab57-47b7-9a58-9c6c150d6105-console-config\") pod \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\" (UID: \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\") " Oct 08 22:02:01 crc kubenswrapper[4739]: I1008 22:02:01.802403 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e3d665b0-ab57-47b7-9a58-9c6c150d6105-console-serving-cert\") pod \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\" (UID: \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\") " Oct 08 22:02:01 crc kubenswrapper[4739]: I1008 22:02:01.802445 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4w28\" (UniqueName: \"kubernetes.io/projected/e3d665b0-ab57-47b7-9a58-9c6c150d6105-kube-api-access-w4w28\") pod \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\" (UID: \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\") " Oct 08 22:02:01 crc kubenswrapper[4739]: I1008 22:02:01.802486 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e3d665b0-ab57-47b7-9a58-9c6c150d6105-console-oauth-config\") pod \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\" (UID: \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\") " Oct 08 22:02:01 crc kubenswrapper[4739]: I1008 22:02:01.802512 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e3d665b0-ab57-47b7-9a58-9c6c150d6105-oauth-serving-cert\") pod \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\" (UID: \"e3d665b0-ab57-47b7-9a58-9c6c150d6105\") " Oct 08 22:02:01 crc kubenswrapper[4739]: I1008 22:02:01.803539 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3d665b0-ab57-47b7-9a58-9c6c150d6105-console-config" (OuterVolumeSpecName: "console-config") pod "e3d665b0-ab57-47b7-9a58-9c6c150d6105" (UID: "e3d665b0-ab57-47b7-9a58-9c6c150d6105"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:02:01 crc kubenswrapper[4739]: I1008 22:02:01.803561 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3d665b0-ab57-47b7-9a58-9c6c150d6105-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e3d665b0-ab57-47b7-9a58-9c6c150d6105" (UID: "e3d665b0-ab57-47b7-9a58-9c6c150d6105"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:02:01 crc kubenswrapper[4739]: I1008 22:02:01.804653 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3d665b0-ab57-47b7-9a58-9c6c150d6105-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e3d665b0-ab57-47b7-9a58-9c6c150d6105" (UID: "e3d665b0-ab57-47b7-9a58-9c6c150d6105"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:02:01 crc kubenswrapper[4739]: I1008 22:02:01.804752 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3d665b0-ab57-47b7-9a58-9c6c150d6105-service-ca" (OuterVolumeSpecName: "service-ca") pod "e3d665b0-ab57-47b7-9a58-9c6c150d6105" (UID: "e3d665b0-ab57-47b7-9a58-9c6c150d6105"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:02:01 crc kubenswrapper[4739]: I1008 22:02:01.808977 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3d665b0-ab57-47b7-9a58-9c6c150d6105-kube-api-access-w4w28" (OuterVolumeSpecName: "kube-api-access-w4w28") pod "e3d665b0-ab57-47b7-9a58-9c6c150d6105" (UID: "e3d665b0-ab57-47b7-9a58-9c6c150d6105"). InnerVolumeSpecName "kube-api-access-w4w28". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:02:01 crc kubenswrapper[4739]: I1008 22:02:01.809199 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3d665b0-ab57-47b7-9a58-9c6c150d6105-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e3d665b0-ab57-47b7-9a58-9c6c150d6105" (UID: "e3d665b0-ab57-47b7-9a58-9c6c150d6105"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:02:01 crc kubenswrapper[4739]: I1008 22:02:01.809616 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3d665b0-ab57-47b7-9a58-9c6c150d6105-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e3d665b0-ab57-47b7-9a58-9c6c150d6105" (UID: "e3d665b0-ab57-47b7-9a58-9c6c150d6105"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:02:01 crc kubenswrapper[4739]: I1008 22:02:01.904377 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4w28\" (UniqueName: \"kubernetes.io/projected/e3d665b0-ab57-47b7-9a58-9c6c150d6105-kube-api-access-w4w28\") on node \"crc\" DevicePath \"\"" Oct 08 22:02:01 crc kubenswrapper[4739]: I1008 22:02:01.904427 4739 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e3d665b0-ab57-47b7-9a58-9c6c150d6105-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:02:01 crc kubenswrapper[4739]: I1008 22:02:01.904446 4739 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e3d665b0-ab57-47b7-9a58-9c6c150d6105-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:02:01 crc kubenswrapper[4739]: I1008 22:02:01.904463 4739 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e3d665b0-ab57-47b7-9a58-9c6c150d6105-service-ca\") on node \"crc\" DevicePath \"\"" Oct 08 22:02:01 crc kubenswrapper[4739]: I1008 22:02:01.904479 4739 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3d665b0-ab57-47b7-9a58-9c6c150d6105-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:02:01 crc kubenswrapper[4739]: I1008 22:02:01.904501 4739 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e3d665b0-ab57-47b7-9a58-9c6c150d6105-console-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:02:01 crc kubenswrapper[4739]: I1008 22:02:01.904516 4739 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e3d665b0-ab57-47b7-9a58-9c6c150d6105-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:02:02 crc 
kubenswrapper[4739]: I1008 22:02:02.348345 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-n8lrt_e3d665b0-ab57-47b7-9a58-9c6c150d6105/console/0.log" Oct 08 22:02:02 crc kubenswrapper[4739]: I1008 22:02:02.348393 4739 generic.go:334] "Generic (PLEG): container finished" podID="e3d665b0-ab57-47b7-9a58-9c6c150d6105" containerID="9944988e55a1de20cf7da1af5838a1cb5af1f738fbd17aa34eec263e0958729f" exitCode=2 Oct 08 22:02:02 crc kubenswrapper[4739]: I1008 22:02:02.348425 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-n8lrt" event={"ID":"e3d665b0-ab57-47b7-9a58-9c6c150d6105","Type":"ContainerDied","Data":"9944988e55a1de20cf7da1af5838a1cb5af1f738fbd17aa34eec263e0958729f"} Oct 08 22:02:02 crc kubenswrapper[4739]: I1008 22:02:02.348454 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-n8lrt" event={"ID":"e3d665b0-ab57-47b7-9a58-9c6c150d6105","Type":"ContainerDied","Data":"34780b843b9b80baf695a65c9e62cfa5c19dfd149662c3f9b85021f5a17d2d8c"} Oct 08 22:02:02 crc kubenswrapper[4739]: I1008 22:02:02.348472 4739 scope.go:117] "RemoveContainer" containerID="9944988e55a1de20cf7da1af5838a1cb5af1f738fbd17aa34eec263e0958729f" Oct 08 22:02:02 crc kubenswrapper[4739]: I1008 22:02:02.348602 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-n8lrt" Oct 08 22:02:02 crc kubenswrapper[4739]: I1008 22:02:02.371298 4739 scope.go:117] "RemoveContainer" containerID="9944988e55a1de20cf7da1af5838a1cb5af1f738fbd17aa34eec263e0958729f" Oct 08 22:02:02 crc kubenswrapper[4739]: I1008 22:02:02.372313 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-n8lrt"] Oct 08 22:02:02 crc kubenswrapper[4739]: E1008 22:02:02.373230 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9944988e55a1de20cf7da1af5838a1cb5af1f738fbd17aa34eec263e0958729f\": container with ID starting with 9944988e55a1de20cf7da1af5838a1cb5af1f738fbd17aa34eec263e0958729f not found: ID does not exist" containerID="9944988e55a1de20cf7da1af5838a1cb5af1f738fbd17aa34eec263e0958729f" Oct 08 22:02:02 crc kubenswrapper[4739]: I1008 22:02:02.373425 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9944988e55a1de20cf7da1af5838a1cb5af1f738fbd17aa34eec263e0958729f"} err="failed to get container status \"9944988e55a1de20cf7da1af5838a1cb5af1f738fbd17aa34eec263e0958729f\": rpc error: code = NotFound desc = could not find container \"9944988e55a1de20cf7da1af5838a1cb5af1f738fbd17aa34eec263e0958729f\": container with ID starting with 9944988e55a1de20cf7da1af5838a1cb5af1f738fbd17aa34eec263e0958729f not found: ID does not exist" Oct 08 22:02:02 crc kubenswrapper[4739]: I1008 22:02:02.376296 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-n8lrt"] Oct 08 22:02:02 crc kubenswrapper[4739]: I1008 22:02:02.975226 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8"] Oct 08 22:02:02 crc kubenswrapper[4739]: E1008 22:02:02.975572 4739 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e3d665b0-ab57-47b7-9a58-9c6c150d6105" containerName="console" Oct 08 22:02:02 crc kubenswrapper[4739]: I1008 22:02:02.975595 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3d665b0-ab57-47b7-9a58-9c6c150d6105" containerName="console" Oct 08 22:02:02 crc kubenswrapper[4739]: I1008 22:02:02.975782 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3d665b0-ab57-47b7-9a58-9c6c150d6105" containerName="console" Oct 08 22:02:02 crc kubenswrapper[4739]: I1008 22:02:02.977172 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8" Oct 08 22:02:02 crc kubenswrapper[4739]: I1008 22:02:02.978969 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 08 22:02:02 crc kubenswrapper[4739]: I1008 22:02:02.984124 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8"] Oct 08 22:02:03 crc kubenswrapper[4739]: I1008 22:02:03.120357 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khjh2\" (UniqueName: \"kubernetes.io/projected/ca4c31bf-494f-4278-97be-ef83f58c5c1b-kube-api-access-khjh2\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8\" (UID: \"ca4c31bf-494f-4278-97be-ef83f58c5c1b\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8" Oct 08 22:02:03 crc kubenswrapper[4739]: I1008 22:02:03.120504 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ca4c31bf-494f-4278-97be-ef83f58c5c1b-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8\" (UID: \"ca4c31bf-494f-4278-97be-ef83f58c5c1b\") " 
pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8" Oct 08 22:02:03 crc kubenswrapper[4739]: I1008 22:02:03.120733 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ca4c31bf-494f-4278-97be-ef83f58c5c1b-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8\" (UID: \"ca4c31bf-494f-4278-97be-ef83f58c5c1b\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8" Oct 08 22:02:03 crc kubenswrapper[4739]: I1008 22:02:03.222012 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ca4c31bf-494f-4278-97be-ef83f58c5c1b-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8\" (UID: \"ca4c31bf-494f-4278-97be-ef83f58c5c1b\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8" Oct 08 22:02:03 crc kubenswrapper[4739]: I1008 22:02:03.222090 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khjh2\" (UniqueName: \"kubernetes.io/projected/ca4c31bf-494f-4278-97be-ef83f58c5c1b-kube-api-access-khjh2\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8\" (UID: \"ca4c31bf-494f-4278-97be-ef83f58c5c1b\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8" Oct 08 22:02:03 crc kubenswrapper[4739]: I1008 22:02:03.222124 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ca4c31bf-494f-4278-97be-ef83f58c5c1b-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8\" (UID: \"ca4c31bf-494f-4278-97be-ef83f58c5c1b\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8" Oct 08 22:02:03 crc kubenswrapper[4739]: 
I1008 22:02:03.222490 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ca4c31bf-494f-4278-97be-ef83f58c5c1b-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8\" (UID: \"ca4c31bf-494f-4278-97be-ef83f58c5c1b\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8" Oct 08 22:02:03 crc kubenswrapper[4739]: I1008 22:02:03.222521 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ca4c31bf-494f-4278-97be-ef83f58c5c1b-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8\" (UID: \"ca4c31bf-494f-4278-97be-ef83f58c5c1b\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8" Oct 08 22:02:03 crc kubenswrapper[4739]: I1008 22:02:03.245404 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khjh2\" (UniqueName: \"kubernetes.io/projected/ca4c31bf-494f-4278-97be-ef83f58c5c1b-kube-api-access-khjh2\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8\" (UID: \"ca4c31bf-494f-4278-97be-ef83f58c5c1b\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8" Oct 08 22:02:03 crc kubenswrapper[4739]: I1008 22:02:03.294623 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8" Oct 08 22:02:03 crc kubenswrapper[4739]: I1008 22:02:03.358683 4739 generic.go:334] "Generic (PLEG): container finished" podID="bfbd1436-a8a5-40a7-bc43-abebb26fdfa6" containerID="20cd739184591bd72c0c63232412a30f01914d393cf04cff2980dfea0d746cbf" exitCode=0 Oct 08 22:02:03 crc kubenswrapper[4739]: I1008 22:02:03.358785 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tpg9r" event={"ID":"bfbd1436-a8a5-40a7-bc43-abebb26fdfa6","Type":"ContainerDied","Data":"20cd739184591bd72c0c63232412a30f01914d393cf04cff2980dfea0d746cbf"} Oct 08 22:02:03 crc kubenswrapper[4739]: I1008 22:02:03.700092 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8"] Oct 08 22:02:03 crc kubenswrapper[4739]: W1008 22:02:03.709280 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca4c31bf_494f_4278_97be_ef83f58c5c1b.slice/crio-bae6d9f8e36fcdd5f4205bd2351f02ea239512aa3ad3ae71d3ee016d057ac8f6 WatchSource:0}: Error finding container bae6d9f8e36fcdd5f4205bd2351f02ea239512aa3ad3ae71d3ee016d057ac8f6: Status 404 returned error can't find the container with id bae6d9f8e36fcdd5f4205bd2351f02ea239512aa3ad3ae71d3ee016d057ac8f6 Oct 08 22:02:03 crc kubenswrapper[4739]: I1008 22:02:03.828962 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3d665b0-ab57-47b7-9a58-9c6c150d6105" path="/var/lib/kubelet/pods/e3d665b0-ab57-47b7-9a58-9c6c150d6105/volumes" Oct 08 22:02:04 crc kubenswrapper[4739]: I1008 22:02:04.370402 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tpg9r" 
event={"ID":"bfbd1436-a8a5-40a7-bc43-abebb26fdfa6","Type":"ContainerStarted","Data":"d205a89cf7738ae22ddb26135264ba7dc6c0dc3ce9ca8770ec6f4e074ee2b304"} Oct 08 22:02:04 crc kubenswrapper[4739]: I1008 22:02:04.372991 4739 generic.go:334] "Generic (PLEG): container finished" podID="ca4c31bf-494f-4278-97be-ef83f58c5c1b" containerID="202e94550f07c2a043cefa212ca344d3dbda67439290e0acdf5834bf9087a297" exitCode=0 Oct 08 22:02:04 crc kubenswrapper[4739]: I1008 22:02:04.373041 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8" event={"ID":"ca4c31bf-494f-4278-97be-ef83f58c5c1b","Type":"ContainerDied","Data":"202e94550f07c2a043cefa212ca344d3dbda67439290e0acdf5834bf9087a297"} Oct 08 22:02:04 crc kubenswrapper[4739]: I1008 22:02:04.373241 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8" event={"ID":"ca4c31bf-494f-4278-97be-ef83f58c5c1b","Type":"ContainerStarted","Data":"bae6d9f8e36fcdd5f4205bd2351f02ea239512aa3ad3ae71d3ee016d057ac8f6"} Oct 08 22:02:04 crc kubenswrapper[4739]: I1008 22:02:04.393353 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tpg9r" podStartSLOduration=2.8357584510000002 podStartE2EDuration="5.393318754s" podCreationTimestamp="2025-10-08 22:01:59 +0000 UTC" firstStartedPulling="2025-10-08 22:02:01.339194625 +0000 UTC m=+821.164580385" lastFinishedPulling="2025-10-08 22:02:03.896754948 +0000 UTC m=+823.722140688" observedRunningTime="2025-10-08 22:02:04.392048053 +0000 UTC m=+824.217433813" watchObservedRunningTime="2025-10-08 22:02:04.393318754 +0000 UTC m=+824.218704594" Oct 08 22:02:06 crc kubenswrapper[4739]: I1008 22:02:06.388321 4739 generic.go:334] "Generic (PLEG): container finished" podID="ca4c31bf-494f-4278-97be-ef83f58c5c1b" 
containerID="fe62bf5b88215020a2f37faaa4adf3fa3827f75a748d85fa471eeb4c76d56b7a" exitCode=0 Oct 08 22:02:06 crc kubenswrapper[4739]: I1008 22:02:06.388386 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8" event={"ID":"ca4c31bf-494f-4278-97be-ef83f58c5c1b","Type":"ContainerDied","Data":"fe62bf5b88215020a2f37faaa4adf3fa3827f75a748d85fa471eeb4c76d56b7a"} Oct 08 22:02:07 crc kubenswrapper[4739]: I1008 22:02:07.397232 4739 generic.go:334] "Generic (PLEG): container finished" podID="ca4c31bf-494f-4278-97be-ef83f58c5c1b" containerID="fd216061d7ffea667b2e686d5001a35b9c9090a6f09539e98fd98b2d292b66a9" exitCode=0 Oct 08 22:02:07 crc kubenswrapper[4739]: I1008 22:02:07.397279 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8" event={"ID":"ca4c31bf-494f-4278-97be-ef83f58c5c1b","Type":"ContainerDied","Data":"fd216061d7ffea667b2e686d5001a35b9c9090a6f09539e98fd98b2d292b66a9"} Oct 08 22:02:08 crc kubenswrapper[4739]: I1008 22:02:08.690468 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8" Oct 08 22:02:08 crc kubenswrapper[4739]: I1008 22:02:08.800243 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ca4c31bf-494f-4278-97be-ef83f58c5c1b-util\") pod \"ca4c31bf-494f-4278-97be-ef83f58c5c1b\" (UID: \"ca4c31bf-494f-4278-97be-ef83f58c5c1b\") " Oct 08 22:02:08 crc kubenswrapper[4739]: I1008 22:02:08.801473 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khjh2\" (UniqueName: \"kubernetes.io/projected/ca4c31bf-494f-4278-97be-ef83f58c5c1b-kube-api-access-khjh2\") pod \"ca4c31bf-494f-4278-97be-ef83f58c5c1b\" (UID: \"ca4c31bf-494f-4278-97be-ef83f58c5c1b\") " Oct 08 22:02:08 crc kubenswrapper[4739]: I1008 22:02:08.801787 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ca4c31bf-494f-4278-97be-ef83f58c5c1b-bundle\") pod \"ca4c31bf-494f-4278-97be-ef83f58c5c1b\" (UID: \"ca4c31bf-494f-4278-97be-ef83f58c5c1b\") " Oct 08 22:02:08 crc kubenswrapper[4739]: I1008 22:02:08.802951 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca4c31bf-494f-4278-97be-ef83f58c5c1b-bundle" (OuterVolumeSpecName: "bundle") pod "ca4c31bf-494f-4278-97be-ef83f58c5c1b" (UID: "ca4c31bf-494f-4278-97be-ef83f58c5c1b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:02:08 crc kubenswrapper[4739]: I1008 22:02:08.806752 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca4c31bf-494f-4278-97be-ef83f58c5c1b-kube-api-access-khjh2" (OuterVolumeSpecName: "kube-api-access-khjh2") pod "ca4c31bf-494f-4278-97be-ef83f58c5c1b" (UID: "ca4c31bf-494f-4278-97be-ef83f58c5c1b"). InnerVolumeSpecName "kube-api-access-khjh2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:02:08 crc kubenswrapper[4739]: I1008 22:02:08.831782 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca4c31bf-494f-4278-97be-ef83f58c5c1b-util" (OuterVolumeSpecName: "util") pod "ca4c31bf-494f-4278-97be-ef83f58c5c1b" (UID: "ca4c31bf-494f-4278-97be-ef83f58c5c1b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:02:08 crc kubenswrapper[4739]: I1008 22:02:08.902418 4739 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ca4c31bf-494f-4278-97be-ef83f58c5c1b-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:02:08 crc kubenswrapper[4739]: I1008 22:02:08.902453 4739 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ca4c31bf-494f-4278-97be-ef83f58c5c1b-util\") on node \"crc\" DevicePath \"\"" Oct 08 22:02:08 crc kubenswrapper[4739]: I1008 22:02:08.902463 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khjh2\" (UniqueName: \"kubernetes.io/projected/ca4c31bf-494f-4278-97be-ef83f58c5c1b-kube-api-access-khjh2\") on node \"crc\" DevicePath \"\"" Oct 08 22:02:09 crc kubenswrapper[4739]: I1008 22:02:09.416079 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8" event={"ID":"ca4c31bf-494f-4278-97be-ef83f58c5c1b","Type":"ContainerDied","Data":"bae6d9f8e36fcdd5f4205bd2351f02ea239512aa3ad3ae71d3ee016d057ac8f6"} Oct 08 22:02:09 crc kubenswrapper[4739]: I1008 22:02:09.416130 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bae6d9f8e36fcdd5f4205bd2351f02ea239512aa3ad3ae71d3ee016d057ac8f6" Oct 08 22:02:09 crc kubenswrapper[4739]: I1008 22:02:09.416133 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8" Oct 08 22:02:09 crc kubenswrapper[4739]: I1008 22:02:09.693303 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tpg9r" Oct 08 22:02:09 crc kubenswrapper[4739]: I1008 22:02:09.693357 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tpg9r" Oct 08 22:02:09 crc kubenswrapper[4739]: I1008 22:02:09.750816 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tpg9r" Oct 08 22:02:10 crc kubenswrapper[4739]: I1008 22:02:10.471118 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tpg9r" Oct 08 22:02:12 crc kubenswrapper[4739]: I1008 22:02:12.120722 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tpg9r"] Oct 08 22:02:12 crc kubenswrapper[4739]: I1008 22:02:12.434361 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tpg9r" podUID="bfbd1436-a8a5-40a7-bc43-abebb26fdfa6" containerName="registry-server" containerID="cri-o://d205a89cf7738ae22ddb26135264ba7dc6c0dc3ce9ca8770ec6f4e074ee2b304" gracePeriod=2 Oct 08 22:02:12 crc kubenswrapper[4739]: I1008 22:02:12.829536 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tpg9r" Oct 08 22:02:12 crc kubenswrapper[4739]: I1008 22:02:12.861650 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfbd1436-a8a5-40a7-bc43-abebb26fdfa6-utilities\") pod \"bfbd1436-a8a5-40a7-bc43-abebb26fdfa6\" (UID: \"bfbd1436-a8a5-40a7-bc43-abebb26fdfa6\") " Oct 08 22:02:12 crc kubenswrapper[4739]: I1008 22:02:12.861701 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlstv\" (UniqueName: \"kubernetes.io/projected/bfbd1436-a8a5-40a7-bc43-abebb26fdfa6-kube-api-access-qlstv\") pod \"bfbd1436-a8a5-40a7-bc43-abebb26fdfa6\" (UID: \"bfbd1436-a8a5-40a7-bc43-abebb26fdfa6\") " Oct 08 22:02:12 crc kubenswrapper[4739]: I1008 22:02:12.861734 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfbd1436-a8a5-40a7-bc43-abebb26fdfa6-catalog-content\") pod \"bfbd1436-a8a5-40a7-bc43-abebb26fdfa6\" (UID: \"bfbd1436-a8a5-40a7-bc43-abebb26fdfa6\") " Oct 08 22:02:12 crc kubenswrapper[4739]: I1008 22:02:12.862448 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfbd1436-a8a5-40a7-bc43-abebb26fdfa6-utilities" (OuterVolumeSpecName: "utilities") pod "bfbd1436-a8a5-40a7-bc43-abebb26fdfa6" (UID: "bfbd1436-a8a5-40a7-bc43-abebb26fdfa6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:02:12 crc kubenswrapper[4739]: I1008 22:02:12.879343 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfbd1436-a8a5-40a7-bc43-abebb26fdfa6-kube-api-access-qlstv" (OuterVolumeSpecName: "kube-api-access-qlstv") pod "bfbd1436-a8a5-40a7-bc43-abebb26fdfa6" (UID: "bfbd1436-a8a5-40a7-bc43-abebb26fdfa6"). InnerVolumeSpecName "kube-api-access-qlstv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:02:12 crc kubenswrapper[4739]: I1008 22:02:12.963381 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfbd1436-a8a5-40a7-bc43-abebb26fdfa6-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:02:12 crc kubenswrapper[4739]: I1008 22:02:12.963419 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlstv\" (UniqueName: \"kubernetes.io/projected/bfbd1436-a8a5-40a7-bc43-abebb26fdfa6-kube-api-access-qlstv\") on node \"crc\" DevicePath \"\"" Oct 08 22:02:13 crc kubenswrapper[4739]: I1008 22:02:13.370844 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfbd1436-a8a5-40a7-bc43-abebb26fdfa6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bfbd1436-a8a5-40a7-bc43-abebb26fdfa6" (UID: "bfbd1436-a8a5-40a7-bc43-abebb26fdfa6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:02:13 crc kubenswrapper[4739]: I1008 22:02:13.441877 4739 generic.go:334] "Generic (PLEG): container finished" podID="bfbd1436-a8a5-40a7-bc43-abebb26fdfa6" containerID="d205a89cf7738ae22ddb26135264ba7dc6c0dc3ce9ca8770ec6f4e074ee2b304" exitCode=0 Oct 08 22:02:13 crc kubenswrapper[4739]: I1008 22:02:13.441968 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tpg9r" Oct 08 22:02:13 crc kubenswrapper[4739]: I1008 22:02:13.441963 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tpg9r" event={"ID":"bfbd1436-a8a5-40a7-bc43-abebb26fdfa6","Type":"ContainerDied","Data":"d205a89cf7738ae22ddb26135264ba7dc6c0dc3ce9ca8770ec6f4e074ee2b304"} Oct 08 22:02:13 crc kubenswrapper[4739]: I1008 22:02:13.442075 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tpg9r" event={"ID":"bfbd1436-a8a5-40a7-bc43-abebb26fdfa6","Type":"ContainerDied","Data":"b7af86fb14723d6e35326448a78113122e02f35d14da78b1ddcaa76c2e9fae99"} Oct 08 22:02:13 crc kubenswrapper[4739]: I1008 22:02:13.442106 4739 scope.go:117] "RemoveContainer" containerID="d205a89cf7738ae22ddb26135264ba7dc6c0dc3ce9ca8770ec6f4e074ee2b304" Oct 08 22:02:13 crc kubenswrapper[4739]: I1008 22:02:13.461998 4739 scope.go:117] "RemoveContainer" containerID="20cd739184591bd72c0c63232412a30f01914d393cf04cff2980dfea0d746cbf" Oct 08 22:02:13 crc kubenswrapper[4739]: I1008 22:02:13.469626 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfbd1436-a8a5-40a7-bc43-abebb26fdfa6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:02:13 crc kubenswrapper[4739]: I1008 22:02:13.473610 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tpg9r"] Oct 08 22:02:13 crc kubenswrapper[4739]: I1008 22:02:13.483498 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tpg9r"] Oct 08 22:02:13 crc kubenswrapper[4739]: I1008 22:02:13.486052 4739 scope.go:117] "RemoveContainer" containerID="fc20e2c9b0c1d839b9b987179807b229509158e2735840d49e13df40f9a87cbe" Oct 08 22:02:13 crc kubenswrapper[4739]: I1008 22:02:13.509048 4739 scope.go:117] "RemoveContainer" 
containerID="d205a89cf7738ae22ddb26135264ba7dc6c0dc3ce9ca8770ec6f4e074ee2b304" Oct 08 22:02:13 crc kubenswrapper[4739]: E1008 22:02:13.509588 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d205a89cf7738ae22ddb26135264ba7dc6c0dc3ce9ca8770ec6f4e074ee2b304\": container with ID starting with d205a89cf7738ae22ddb26135264ba7dc6c0dc3ce9ca8770ec6f4e074ee2b304 not found: ID does not exist" containerID="d205a89cf7738ae22ddb26135264ba7dc6c0dc3ce9ca8770ec6f4e074ee2b304" Oct 08 22:02:13 crc kubenswrapper[4739]: I1008 22:02:13.509620 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d205a89cf7738ae22ddb26135264ba7dc6c0dc3ce9ca8770ec6f4e074ee2b304"} err="failed to get container status \"d205a89cf7738ae22ddb26135264ba7dc6c0dc3ce9ca8770ec6f4e074ee2b304\": rpc error: code = NotFound desc = could not find container \"d205a89cf7738ae22ddb26135264ba7dc6c0dc3ce9ca8770ec6f4e074ee2b304\": container with ID starting with d205a89cf7738ae22ddb26135264ba7dc6c0dc3ce9ca8770ec6f4e074ee2b304 not found: ID does not exist" Oct 08 22:02:13 crc kubenswrapper[4739]: I1008 22:02:13.509648 4739 scope.go:117] "RemoveContainer" containerID="20cd739184591bd72c0c63232412a30f01914d393cf04cff2980dfea0d746cbf" Oct 08 22:02:13 crc kubenswrapper[4739]: E1008 22:02:13.510039 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20cd739184591bd72c0c63232412a30f01914d393cf04cff2980dfea0d746cbf\": container with ID starting with 20cd739184591bd72c0c63232412a30f01914d393cf04cff2980dfea0d746cbf not found: ID does not exist" containerID="20cd739184591bd72c0c63232412a30f01914d393cf04cff2980dfea0d746cbf" Oct 08 22:02:13 crc kubenswrapper[4739]: I1008 22:02:13.510060 4739 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"20cd739184591bd72c0c63232412a30f01914d393cf04cff2980dfea0d746cbf"} err="failed to get container status \"20cd739184591bd72c0c63232412a30f01914d393cf04cff2980dfea0d746cbf\": rpc error: code = NotFound desc = could not find container \"20cd739184591bd72c0c63232412a30f01914d393cf04cff2980dfea0d746cbf\": container with ID starting with 20cd739184591bd72c0c63232412a30f01914d393cf04cff2980dfea0d746cbf not found: ID does not exist" Oct 08 22:02:13 crc kubenswrapper[4739]: I1008 22:02:13.510073 4739 scope.go:117] "RemoveContainer" containerID="fc20e2c9b0c1d839b9b987179807b229509158e2735840d49e13df40f9a87cbe" Oct 08 22:02:13 crc kubenswrapper[4739]: E1008 22:02:13.510419 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc20e2c9b0c1d839b9b987179807b229509158e2735840d49e13df40f9a87cbe\": container with ID starting with fc20e2c9b0c1d839b9b987179807b229509158e2735840d49e13df40f9a87cbe not found: ID does not exist" containerID="fc20e2c9b0c1d839b9b987179807b229509158e2735840d49e13df40f9a87cbe" Oct 08 22:02:13 crc kubenswrapper[4739]: I1008 22:02:13.510472 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc20e2c9b0c1d839b9b987179807b229509158e2735840d49e13df40f9a87cbe"} err="failed to get container status \"fc20e2c9b0c1d839b9b987179807b229509158e2735840d49e13df40f9a87cbe\": rpc error: code = NotFound desc = could not find container \"fc20e2c9b0c1d839b9b987179807b229509158e2735840d49e13df40f9a87cbe\": container with ID starting with fc20e2c9b0c1d839b9b987179807b229509158e2735840d49e13df40f9a87cbe not found: ID does not exist" Oct 08 22:02:13 crc kubenswrapper[4739]: I1008 22:02:13.832080 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfbd1436-a8a5-40a7-bc43-abebb26fdfa6" path="/var/lib/kubelet/pods/bfbd1436-a8a5-40a7-bc43-abebb26fdfa6/volumes" Oct 08 22:02:16 crc kubenswrapper[4739]: I1008 
22:02:16.753635 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6cb897566c-v8wrp"] Oct 08 22:02:16 crc kubenswrapper[4739]: E1008 22:02:16.753840 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfbd1436-a8a5-40a7-bc43-abebb26fdfa6" containerName="extract-utilities" Oct 08 22:02:16 crc kubenswrapper[4739]: I1008 22:02:16.753851 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfbd1436-a8a5-40a7-bc43-abebb26fdfa6" containerName="extract-utilities" Oct 08 22:02:16 crc kubenswrapper[4739]: E1008 22:02:16.753862 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca4c31bf-494f-4278-97be-ef83f58c5c1b" containerName="extract" Oct 08 22:02:16 crc kubenswrapper[4739]: I1008 22:02:16.753869 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca4c31bf-494f-4278-97be-ef83f58c5c1b" containerName="extract" Oct 08 22:02:16 crc kubenswrapper[4739]: E1008 22:02:16.753877 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfbd1436-a8a5-40a7-bc43-abebb26fdfa6" containerName="registry-server" Oct 08 22:02:16 crc kubenswrapper[4739]: I1008 22:02:16.753883 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfbd1436-a8a5-40a7-bc43-abebb26fdfa6" containerName="registry-server" Oct 08 22:02:16 crc kubenswrapper[4739]: E1008 22:02:16.753892 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfbd1436-a8a5-40a7-bc43-abebb26fdfa6" containerName="extract-content" Oct 08 22:02:16 crc kubenswrapper[4739]: I1008 22:02:16.753898 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfbd1436-a8a5-40a7-bc43-abebb26fdfa6" containerName="extract-content" Oct 08 22:02:16 crc kubenswrapper[4739]: E1008 22:02:16.753906 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca4c31bf-494f-4278-97be-ef83f58c5c1b" containerName="pull" Oct 08 22:02:16 crc kubenswrapper[4739]: I1008 22:02:16.753911 4739 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="ca4c31bf-494f-4278-97be-ef83f58c5c1b" containerName="pull" Oct 08 22:02:16 crc kubenswrapper[4739]: E1008 22:02:16.753925 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca4c31bf-494f-4278-97be-ef83f58c5c1b" containerName="util" Oct 08 22:02:16 crc kubenswrapper[4739]: I1008 22:02:16.753930 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca4c31bf-494f-4278-97be-ef83f58c5c1b" containerName="util" Oct 08 22:02:16 crc kubenswrapper[4739]: I1008 22:02:16.754019 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca4c31bf-494f-4278-97be-ef83f58c5c1b" containerName="extract" Oct 08 22:02:16 crc kubenswrapper[4739]: I1008 22:02:16.754034 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfbd1436-a8a5-40a7-bc43-abebb26fdfa6" containerName="registry-server" Oct 08 22:02:16 crc kubenswrapper[4739]: I1008 22:02:16.754409 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6cb897566c-v8wrp" Oct 08 22:02:16 crc kubenswrapper[4739]: I1008 22:02:16.758851 4739 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 08 22:02:16 crc kubenswrapper[4739]: I1008 22:02:16.758926 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 08 22:02:16 crc kubenswrapper[4739]: I1008 22:02:16.761637 4739 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 08 22:02:16 crc kubenswrapper[4739]: I1008 22:02:16.761719 4739 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-tbvrp" Oct 08 22:02:16 crc kubenswrapper[4739]: I1008 22:02:16.761777 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 08 
22:02:16 crc kubenswrapper[4739]: I1008 22:02:16.786565 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6cb897566c-v8wrp"] Oct 08 22:02:16 crc kubenswrapper[4739]: I1008 22:02:16.815999 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/091a7a04-1c08-4327-8d95-e63d3b526055-apiservice-cert\") pod \"metallb-operator-controller-manager-6cb897566c-v8wrp\" (UID: \"091a7a04-1c08-4327-8d95-e63d3b526055\") " pod="metallb-system/metallb-operator-controller-manager-6cb897566c-v8wrp" Oct 08 22:02:16 crc kubenswrapper[4739]: I1008 22:02:16.816261 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/091a7a04-1c08-4327-8d95-e63d3b526055-webhook-cert\") pod \"metallb-operator-controller-manager-6cb897566c-v8wrp\" (UID: \"091a7a04-1c08-4327-8d95-e63d3b526055\") " pod="metallb-system/metallb-operator-controller-manager-6cb897566c-v8wrp" Oct 08 22:02:16 crc kubenswrapper[4739]: I1008 22:02:16.816413 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7242l\" (UniqueName: \"kubernetes.io/projected/091a7a04-1c08-4327-8d95-e63d3b526055-kube-api-access-7242l\") pod \"metallb-operator-controller-manager-6cb897566c-v8wrp\" (UID: \"091a7a04-1c08-4327-8d95-e63d3b526055\") " pod="metallb-system/metallb-operator-controller-manager-6cb897566c-v8wrp" Oct 08 22:02:16 crc kubenswrapper[4739]: I1008 22:02:16.917465 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/091a7a04-1c08-4327-8d95-e63d3b526055-apiservice-cert\") pod \"metallb-operator-controller-manager-6cb897566c-v8wrp\" (UID: \"091a7a04-1c08-4327-8d95-e63d3b526055\") " 
pod="metallb-system/metallb-operator-controller-manager-6cb897566c-v8wrp" Oct 08 22:02:16 crc kubenswrapper[4739]: I1008 22:02:16.917503 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/091a7a04-1c08-4327-8d95-e63d3b526055-webhook-cert\") pod \"metallb-operator-controller-manager-6cb897566c-v8wrp\" (UID: \"091a7a04-1c08-4327-8d95-e63d3b526055\") " pod="metallb-system/metallb-operator-controller-manager-6cb897566c-v8wrp" Oct 08 22:02:16 crc kubenswrapper[4739]: I1008 22:02:16.917542 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7242l\" (UniqueName: \"kubernetes.io/projected/091a7a04-1c08-4327-8d95-e63d3b526055-kube-api-access-7242l\") pod \"metallb-operator-controller-manager-6cb897566c-v8wrp\" (UID: \"091a7a04-1c08-4327-8d95-e63d3b526055\") " pod="metallb-system/metallb-operator-controller-manager-6cb897566c-v8wrp" Oct 08 22:02:16 crc kubenswrapper[4739]: I1008 22:02:16.923343 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/091a7a04-1c08-4327-8d95-e63d3b526055-apiservice-cert\") pod \"metallb-operator-controller-manager-6cb897566c-v8wrp\" (UID: \"091a7a04-1c08-4327-8d95-e63d3b526055\") " pod="metallb-system/metallb-operator-controller-manager-6cb897566c-v8wrp" Oct 08 22:02:16 crc kubenswrapper[4739]: I1008 22:02:16.923917 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/091a7a04-1c08-4327-8d95-e63d3b526055-webhook-cert\") pod \"metallb-operator-controller-manager-6cb897566c-v8wrp\" (UID: \"091a7a04-1c08-4327-8d95-e63d3b526055\") " pod="metallb-system/metallb-operator-controller-manager-6cb897566c-v8wrp" Oct 08 22:02:16 crc kubenswrapper[4739]: I1008 22:02:16.934914 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7242l\" (UniqueName: 
\"kubernetes.io/projected/091a7a04-1c08-4327-8d95-e63d3b526055-kube-api-access-7242l\") pod \"metallb-operator-controller-manager-6cb897566c-v8wrp\" (UID: \"091a7a04-1c08-4327-8d95-e63d3b526055\") " pod="metallb-system/metallb-operator-controller-manager-6cb897566c-v8wrp" Oct 08 22:02:17 crc kubenswrapper[4739]: I1008 22:02:17.071058 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6cb897566c-v8wrp" Oct 08 22:02:17 crc kubenswrapper[4739]: I1008 22:02:17.079123 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5944674dc5-rrrsh"] Oct 08 22:02:17 crc kubenswrapper[4739]: I1008 22:02:17.079796 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5944674dc5-rrrsh" Oct 08 22:02:17 crc kubenswrapper[4739]: I1008 22:02:17.089748 4739 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 08 22:02:17 crc kubenswrapper[4739]: I1008 22:02:17.090017 4739 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 08 22:02:17 crc kubenswrapper[4739]: I1008 22:02:17.090428 4739 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-j4r8h" Oct 08 22:02:17 crc kubenswrapper[4739]: I1008 22:02:17.096537 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5944674dc5-rrrsh"] Oct 08 22:02:17 crc kubenswrapper[4739]: I1008 22:02:17.122650 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1b3a65cd-e578-4a5b-acfe-47ec21816d80-apiservice-cert\") pod \"metallb-operator-webhook-server-5944674dc5-rrrsh\" (UID: \"1b3a65cd-e578-4a5b-acfe-47ec21816d80\") " 
pod="metallb-system/metallb-operator-webhook-server-5944674dc5-rrrsh" Oct 08 22:02:17 crc kubenswrapper[4739]: I1008 22:02:17.122704 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88jnt\" (UniqueName: \"kubernetes.io/projected/1b3a65cd-e578-4a5b-acfe-47ec21816d80-kube-api-access-88jnt\") pod \"metallb-operator-webhook-server-5944674dc5-rrrsh\" (UID: \"1b3a65cd-e578-4a5b-acfe-47ec21816d80\") " pod="metallb-system/metallb-operator-webhook-server-5944674dc5-rrrsh" Oct 08 22:02:17 crc kubenswrapper[4739]: I1008 22:02:17.122741 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1b3a65cd-e578-4a5b-acfe-47ec21816d80-webhook-cert\") pod \"metallb-operator-webhook-server-5944674dc5-rrrsh\" (UID: \"1b3a65cd-e578-4a5b-acfe-47ec21816d80\") " pod="metallb-system/metallb-operator-webhook-server-5944674dc5-rrrsh" Oct 08 22:02:17 crc kubenswrapper[4739]: I1008 22:02:17.223802 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1b3a65cd-e578-4a5b-acfe-47ec21816d80-apiservice-cert\") pod \"metallb-operator-webhook-server-5944674dc5-rrrsh\" (UID: \"1b3a65cd-e578-4a5b-acfe-47ec21816d80\") " pod="metallb-system/metallb-operator-webhook-server-5944674dc5-rrrsh" Oct 08 22:02:17 crc kubenswrapper[4739]: I1008 22:02:17.224165 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88jnt\" (UniqueName: \"kubernetes.io/projected/1b3a65cd-e578-4a5b-acfe-47ec21816d80-kube-api-access-88jnt\") pod \"metallb-operator-webhook-server-5944674dc5-rrrsh\" (UID: \"1b3a65cd-e578-4a5b-acfe-47ec21816d80\") " pod="metallb-system/metallb-operator-webhook-server-5944674dc5-rrrsh" Oct 08 22:02:17 crc kubenswrapper[4739]: I1008 22:02:17.224205 4739 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1b3a65cd-e578-4a5b-acfe-47ec21816d80-webhook-cert\") pod \"metallb-operator-webhook-server-5944674dc5-rrrsh\" (UID: \"1b3a65cd-e578-4a5b-acfe-47ec21816d80\") " pod="metallb-system/metallb-operator-webhook-server-5944674dc5-rrrsh" Oct 08 22:02:17 crc kubenswrapper[4739]: I1008 22:02:17.231178 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1b3a65cd-e578-4a5b-acfe-47ec21816d80-apiservice-cert\") pod \"metallb-operator-webhook-server-5944674dc5-rrrsh\" (UID: \"1b3a65cd-e578-4a5b-acfe-47ec21816d80\") " pod="metallb-system/metallb-operator-webhook-server-5944674dc5-rrrsh" Oct 08 22:02:17 crc kubenswrapper[4739]: I1008 22:02:17.247463 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1b3a65cd-e578-4a5b-acfe-47ec21816d80-webhook-cert\") pod \"metallb-operator-webhook-server-5944674dc5-rrrsh\" (UID: \"1b3a65cd-e578-4a5b-acfe-47ec21816d80\") " pod="metallb-system/metallb-operator-webhook-server-5944674dc5-rrrsh" Oct 08 22:02:17 crc kubenswrapper[4739]: I1008 22:02:17.255105 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88jnt\" (UniqueName: \"kubernetes.io/projected/1b3a65cd-e578-4a5b-acfe-47ec21816d80-kube-api-access-88jnt\") pod \"metallb-operator-webhook-server-5944674dc5-rrrsh\" (UID: \"1b3a65cd-e578-4a5b-acfe-47ec21816d80\") " pod="metallb-system/metallb-operator-webhook-server-5944674dc5-rrrsh" Oct 08 22:02:17 crc kubenswrapper[4739]: I1008 22:02:17.429213 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5944674dc5-rrrsh" Oct 08 22:02:17 crc kubenswrapper[4739]: I1008 22:02:17.578481 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6cb897566c-v8wrp"] Oct 08 22:02:17 crc kubenswrapper[4739]: I1008 22:02:17.928753 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5944674dc5-rrrsh"] Oct 08 22:02:17 crc kubenswrapper[4739]: W1008 22:02:17.936199 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b3a65cd_e578_4a5b_acfe_47ec21816d80.slice/crio-68b7616461c5e8199d8c05d82b0d27011d49f415c9c62413806ae51f30b980c0 WatchSource:0}: Error finding container 68b7616461c5e8199d8c05d82b0d27011d49f415c9c62413806ae51f30b980c0: Status 404 returned error can't find the container with id 68b7616461c5e8199d8c05d82b0d27011d49f415c9c62413806ae51f30b980c0 Oct 08 22:02:18 crc kubenswrapper[4739]: I1008 22:02:18.480725 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6cb897566c-v8wrp" event={"ID":"091a7a04-1c08-4327-8d95-e63d3b526055","Type":"ContainerStarted","Data":"01f85d87cfe39b239eb482e4fe4c8c5e6a854132b6c781f8607bdfb46fa3219a"} Oct 08 22:02:18 crc kubenswrapper[4739]: I1008 22:02:18.482836 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5944674dc5-rrrsh" event={"ID":"1b3a65cd-e578-4a5b-acfe-47ec21816d80","Type":"ContainerStarted","Data":"68b7616461c5e8199d8c05d82b0d27011d49f415c9c62413806ae51f30b980c0"} Oct 08 22:02:21 crc kubenswrapper[4739]: I1008 22:02:21.767715 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:02:21 crc kubenswrapper[4739]: I1008 22:02:21.768863 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:02:21 crc kubenswrapper[4739]: I1008 22:02:21.768917 4739 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" Oct 08 22:02:21 crc kubenswrapper[4739]: I1008 22:02:21.771065 4739 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c9993989722d5a6736d9a76651861a3541ac4d181be8e64c84d138a4526b99c8"} pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 22:02:21 crc kubenswrapper[4739]: I1008 22:02:21.771569 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" containerID="cri-o://c9993989722d5a6736d9a76651861a3541ac4d181be8e64c84d138a4526b99c8" gracePeriod=600 Oct 08 22:02:22 crc kubenswrapper[4739]: I1008 22:02:22.520537 4739 generic.go:334] "Generic (PLEG): container finished" podID="9707b708-016c-4e06-86db-0332e2ca37db" containerID="c9993989722d5a6736d9a76651861a3541ac4d181be8e64c84d138a4526b99c8" exitCode=0 Oct 08 22:02:22 crc kubenswrapper[4739]: I1008 22:02:22.520587 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" 
event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerDied","Data":"c9993989722d5a6736d9a76651861a3541ac4d181be8e64c84d138a4526b99c8"} Oct 08 22:02:22 crc kubenswrapper[4739]: I1008 22:02:22.520637 4739 scope.go:117] "RemoveContainer" containerID="ef2588b0bb234b34c79c8ff569837da6abbcb63d53a58f2ae5f4cde5f6ddd2c2" Oct 08 22:02:23 crc kubenswrapper[4739]: I1008 22:02:23.527865 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerStarted","Data":"d72a751240c9e64050ad684bd757cf43d33579885b0db0ae42dad5cf5bb4da84"} Oct 08 22:02:23 crc kubenswrapper[4739]: I1008 22:02:23.529728 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5944674dc5-rrrsh" event={"ID":"1b3a65cd-e578-4a5b-acfe-47ec21816d80","Type":"ContainerStarted","Data":"1aa92fde8decf1eee0a26e950ea1afd634c8fc184daef636f4fe8e41b9869374"} Oct 08 22:02:23 crc kubenswrapper[4739]: I1008 22:02:23.530056 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5944674dc5-rrrsh" Oct 08 22:02:23 crc kubenswrapper[4739]: I1008 22:02:23.531322 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6cb897566c-v8wrp" event={"ID":"091a7a04-1c08-4327-8d95-e63d3b526055","Type":"ContainerStarted","Data":"be4e389fc1a27033ca43f149e0a57afc5d15471460b21b87247d80f1c76abf99"} Oct 08 22:02:23 crc kubenswrapper[4739]: I1008 22:02:23.531656 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6cb897566c-v8wrp" Oct 08 22:02:23 crc kubenswrapper[4739]: I1008 22:02:23.563464 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6cb897566c-v8wrp" podStartSLOduration=2.05264695 
podStartE2EDuration="7.5634485s" podCreationTimestamp="2025-10-08 22:02:16 +0000 UTC" firstStartedPulling="2025-10-08 22:02:17.610757444 +0000 UTC m=+837.436143194" lastFinishedPulling="2025-10-08 22:02:23.121559004 +0000 UTC m=+842.946944744" observedRunningTime="2025-10-08 22:02:23.562807575 +0000 UTC m=+843.388193335" watchObservedRunningTime="2025-10-08 22:02:23.5634485 +0000 UTC m=+843.388834250" Oct 08 22:02:23 crc kubenswrapper[4739]: I1008 22:02:23.592825 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5944674dc5-rrrsh" podStartSLOduration=1.372746864 podStartE2EDuration="6.5928072s" podCreationTimestamp="2025-10-08 22:02:17 +0000 UTC" firstStartedPulling="2025-10-08 22:02:17.938956675 +0000 UTC m=+837.764342435" lastFinishedPulling="2025-10-08 22:02:23.159017021 +0000 UTC m=+842.984402771" observedRunningTime="2025-10-08 22:02:23.591375155 +0000 UTC m=+843.416760915" watchObservedRunningTime="2025-10-08 22:02:23.5928072 +0000 UTC m=+843.418192950" Oct 08 22:02:37 crc kubenswrapper[4739]: I1008 22:02:37.433993 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5944674dc5-rrrsh" Oct 08 22:02:50 crc kubenswrapper[4739]: I1008 22:02:50.630867 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hpgd5"] Oct 08 22:02:50 crc kubenswrapper[4739]: I1008 22:02:50.634202 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hpgd5" Oct 08 22:02:50 crc kubenswrapper[4739]: I1008 22:02:50.653544 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hpgd5"] Oct 08 22:02:50 crc kubenswrapper[4739]: I1008 22:02:50.684288 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbd6ae49-2147-4189-b8de-dc3a8e24e297-catalog-content\") pod \"redhat-marketplace-hpgd5\" (UID: \"fbd6ae49-2147-4189-b8de-dc3a8e24e297\") " pod="openshift-marketplace/redhat-marketplace-hpgd5" Oct 08 22:02:50 crc kubenswrapper[4739]: I1008 22:02:50.684451 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kjct\" (UniqueName: \"kubernetes.io/projected/fbd6ae49-2147-4189-b8de-dc3a8e24e297-kube-api-access-4kjct\") pod \"redhat-marketplace-hpgd5\" (UID: \"fbd6ae49-2147-4189-b8de-dc3a8e24e297\") " pod="openshift-marketplace/redhat-marketplace-hpgd5" Oct 08 22:02:50 crc kubenswrapper[4739]: I1008 22:02:50.684638 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbd6ae49-2147-4189-b8de-dc3a8e24e297-utilities\") pod \"redhat-marketplace-hpgd5\" (UID: \"fbd6ae49-2147-4189-b8de-dc3a8e24e297\") " pod="openshift-marketplace/redhat-marketplace-hpgd5" Oct 08 22:02:50 crc kubenswrapper[4739]: I1008 22:02:50.786752 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbd6ae49-2147-4189-b8de-dc3a8e24e297-catalog-content\") pod \"redhat-marketplace-hpgd5\" (UID: \"fbd6ae49-2147-4189-b8de-dc3a8e24e297\") " pod="openshift-marketplace/redhat-marketplace-hpgd5" Oct 08 22:02:50 crc kubenswrapper[4739]: I1008 22:02:50.786881 4739 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4kjct\" (UniqueName: \"kubernetes.io/projected/fbd6ae49-2147-4189-b8de-dc3a8e24e297-kube-api-access-4kjct\") pod \"redhat-marketplace-hpgd5\" (UID: \"fbd6ae49-2147-4189-b8de-dc3a8e24e297\") " pod="openshift-marketplace/redhat-marketplace-hpgd5" Oct 08 22:02:50 crc kubenswrapper[4739]: I1008 22:02:50.786967 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbd6ae49-2147-4189-b8de-dc3a8e24e297-utilities\") pod \"redhat-marketplace-hpgd5\" (UID: \"fbd6ae49-2147-4189-b8de-dc3a8e24e297\") " pod="openshift-marketplace/redhat-marketplace-hpgd5" Oct 08 22:02:50 crc kubenswrapper[4739]: I1008 22:02:50.787684 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbd6ae49-2147-4189-b8de-dc3a8e24e297-utilities\") pod \"redhat-marketplace-hpgd5\" (UID: \"fbd6ae49-2147-4189-b8de-dc3a8e24e297\") " pod="openshift-marketplace/redhat-marketplace-hpgd5" Oct 08 22:02:50 crc kubenswrapper[4739]: I1008 22:02:50.787677 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbd6ae49-2147-4189-b8de-dc3a8e24e297-catalog-content\") pod \"redhat-marketplace-hpgd5\" (UID: \"fbd6ae49-2147-4189-b8de-dc3a8e24e297\") " pod="openshift-marketplace/redhat-marketplace-hpgd5" Oct 08 22:02:50 crc kubenswrapper[4739]: I1008 22:02:50.811200 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kjct\" (UniqueName: \"kubernetes.io/projected/fbd6ae49-2147-4189-b8de-dc3a8e24e297-kube-api-access-4kjct\") pod \"redhat-marketplace-hpgd5\" (UID: \"fbd6ae49-2147-4189-b8de-dc3a8e24e297\") " pod="openshift-marketplace/redhat-marketplace-hpgd5" Oct 08 22:02:50 crc kubenswrapper[4739]: I1008 22:02:50.968839 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hpgd5" Oct 08 22:02:51 crc kubenswrapper[4739]: I1008 22:02:51.391038 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hpgd5"] Oct 08 22:02:51 crc kubenswrapper[4739]: I1008 22:02:51.707975 4739 generic.go:334] "Generic (PLEG): container finished" podID="fbd6ae49-2147-4189-b8de-dc3a8e24e297" containerID="eafb58f9167dc3c59d0ec93380a1991f32b9bdd5b8e9e87a1d4329db465198cb" exitCode=0 Oct 08 22:02:51 crc kubenswrapper[4739]: I1008 22:02:51.708019 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hpgd5" event={"ID":"fbd6ae49-2147-4189-b8de-dc3a8e24e297","Type":"ContainerDied","Data":"eafb58f9167dc3c59d0ec93380a1991f32b9bdd5b8e9e87a1d4329db465198cb"} Oct 08 22:02:51 crc kubenswrapper[4739]: I1008 22:02:51.708050 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hpgd5" event={"ID":"fbd6ae49-2147-4189-b8de-dc3a8e24e297","Type":"ContainerStarted","Data":"2a400c8c87ecf5cf4a3f6ddbbacd3fe33dcf1f45602ae743bd5e27b9062533b2"} Oct 08 22:02:52 crc kubenswrapper[4739]: I1008 22:02:52.716034 4739 generic.go:334] "Generic (PLEG): container finished" podID="fbd6ae49-2147-4189-b8de-dc3a8e24e297" containerID="9bf22bb7974431388421a708b722a463b152602e6debecfaaf6a9608bd5b9fce" exitCode=0 Oct 08 22:02:52 crc kubenswrapper[4739]: I1008 22:02:52.716070 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hpgd5" event={"ID":"fbd6ae49-2147-4189-b8de-dc3a8e24e297","Type":"ContainerDied","Data":"9bf22bb7974431388421a708b722a463b152602e6debecfaaf6a9608bd5b9fce"} Oct 08 22:02:53 crc kubenswrapper[4739]: I1008 22:02:53.723435 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hpgd5" 
event={"ID":"fbd6ae49-2147-4189-b8de-dc3a8e24e297","Type":"ContainerStarted","Data":"f4f0f8c45001fb3e9b45f2a3fd8904b0ad3d6f91a95223cfd1862aa74c535f00"} Oct 08 22:02:55 crc kubenswrapper[4739]: I1008 22:02:55.998254 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hpgd5" podStartSLOduration=4.382354645 podStartE2EDuration="5.998235375s" podCreationTimestamp="2025-10-08 22:02:50 +0000 UTC" firstStartedPulling="2025-10-08 22:02:51.709818525 +0000 UTC m=+871.535204295" lastFinishedPulling="2025-10-08 22:02:53.325699275 +0000 UTC m=+873.151085025" observedRunningTime="2025-10-08 22:02:53.742251241 +0000 UTC m=+873.567637011" watchObservedRunningTime="2025-10-08 22:02:55.998235375 +0000 UTC m=+875.823621125" Oct 08 22:02:56 crc kubenswrapper[4739]: I1008 22:02:56.000029 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8vhpw"] Oct 08 22:02:56 crc kubenswrapper[4739]: I1008 22:02:56.005804 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8vhpw" Oct 08 22:02:56 crc kubenswrapper[4739]: I1008 22:02:56.024669 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8vhpw"] Oct 08 22:02:56 crc kubenswrapper[4739]: I1008 22:02:56.071301 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8b2c098-ed87-4e2c-bf6f-bc95a8228f70-catalog-content\") pod \"community-operators-8vhpw\" (UID: \"b8b2c098-ed87-4e2c-bf6f-bc95a8228f70\") " pod="openshift-marketplace/community-operators-8vhpw" Oct 08 22:02:56 crc kubenswrapper[4739]: I1008 22:02:56.071360 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpmxl\" (UniqueName: \"kubernetes.io/projected/b8b2c098-ed87-4e2c-bf6f-bc95a8228f70-kube-api-access-mpmxl\") pod \"community-operators-8vhpw\" (UID: \"b8b2c098-ed87-4e2c-bf6f-bc95a8228f70\") " pod="openshift-marketplace/community-operators-8vhpw" Oct 08 22:02:56 crc kubenswrapper[4739]: I1008 22:02:56.071399 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8b2c098-ed87-4e2c-bf6f-bc95a8228f70-utilities\") pod \"community-operators-8vhpw\" (UID: \"b8b2c098-ed87-4e2c-bf6f-bc95a8228f70\") " pod="openshift-marketplace/community-operators-8vhpw" Oct 08 22:02:56 crc kubenswrapper[4739]: I1008 22:02:56.172593 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpmxl\" (UniqueName: \"kubernetes.io/projected/b8b2c098-ed87-4e2c-bf6f-bc95a8228f70-kube-api-access-mpmxl\") pod \"community-operators-8vhpw\" (UID: \"b8b2c098-ed87-4e2c-bf6f-bc95a8228f70\") " pod="openshift-marketplace/community-operators-8vhpw" Oct 08 22:02:56 crc kubenswrapper[4739]: I1008 22:02:56.172647 4739 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8b2c098-ed87-4e2c-bf6f-bc95a8228f70-utilities\") pod \"community-operators-8vhpw\" (UID: \"b8b2c098-ed87-4e2c-bf6f-bc95a8228f70\") " pod="openshift-marketplace/community-operators-8vhpw" Oct 08 22:02:56 crc kubenswrapper[4739]: I1008 22:02:56.172719 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8b2c098-ed87-4e2c-bf6f-bc95a8228f70-catalog-content\") pod \"community-operators-8vhpw\" (UID: \"b8b2c098-ed87-4e2c-bf6f-bc95a8228f70\") " pod="openshift-marketplace/community-operators-8vhpw" Oct 08 22:02:56 crc kubenswrapper[4739]: I1008 22:02:56.173244 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8b2c098-ed87-4e2c-bf6f-bc95a8228f70-catalog-content\") pod \"community-operators-8vhpw\" (UID: \"b8b2c098-ed87-4e2c-bf6f-bc95a8228f70\") " pod="openshift-marketplace/community-operators-8vhpw" Oct 08 22:02:56 crc kubenswrapper[4739]: I1008 22:02:56.173318 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8b2c098-ed87-4e2c-bf6f-bc95a8228f70-utilities\") pod \"community-operators-8vhpw\" (UID: \"b8b2c098-ed87-4e2c-bf6f-bc95a8228f70\") " pod="openshift-marketplace/community-operators-8vhpw" Oct 08 22:02:56 crc kubenswrapper[4739]: I1008 22:02:56.191754 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpmxl\" (UniqueName: \"kubernetes.io/projected/b8b2c098-ed87-4e2c-bf6f-bc95a8228f70-kube-api-access-mpmxl\") pod \"community-operators-8vhpw\" (UID: \"b8b2c098-ed87-4e2c-bf6f-bc95a8228f70\") " pod="openshift-marketplace/community-operators-8vhpw" Oct 08 22:02:56 crc kubenswrapper[4739]: I1008 22:02:56.372256 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8vhpw" Oct 08 22:02:56 crc kubenswrapper[4739]: I1008 22:02:56.803444 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8vhpw"] Oct 08 22:02:56 crc kubenswrapper[4739]: W1008 22:02:56.807120 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8b2c098_ed87_4e2c_bf6f_bc95a8228f70.slice/crio-b10fdb93da42ec2f7587935b8e76fc59b0f2177218c996f7b5ed4a2982cb7f0c WatchSource:0}: Error finding container b10fdb93da42ec2f7587935b8e76fc59b0f2177218c996f7b5ed4a2982cb7f0c: Status 404 returned error can't find the container with id b10fdb93da42ec2f7587935b8e76fc59b0f2177218c996f7b5ed4a2982cb7f0c Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.074614 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6cb897566c-v8wrp" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.746770 4739 generic.go:334] "Generic (PLEG): container finished" podID="b8b2c098-ed87-4e2c-bf6f-bc95a8228f70" containerID="9e27e864eca043c1e9064692bba8faaec960940150d6df2b8bfdc100daec9323" exitCode=0 Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.746973 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vhpw" event={"ID":"b8b2c098-ed87-4e2c-bf6f-bc95a8228f70","Type":"ContainerDied","Data":"9e27e864eca043c1e9064692bba8faaec960940150d6df2b8bfdc100daec9323"} Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.747092 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vhpw" event={"ID":"b8b2c098-ed87-4e2c-bf6f-bc95a8228f70","Type":"ContainerStarted","Data":"b10fdb93da42ec2f7587935b8e76fc59b0f2177218c996f7b5ed4a2982cb7f0c"} Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.846630 4739 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["metallb-system/frr-k8s-phdnx"] Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.849500 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-phdnx" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.851723 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-rmj6d"] Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.852654 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-rmj6d" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.853707 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.853880 4739 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.853995 4739 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-25p5l" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.854766 4739 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.863040 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-rmj6d"] Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.896584 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zqlp\" (UniqueName: \"kubernetes.io/projected/545c9b85-f531-4665-ba7c-8997de325d62-kube-api-access-4zqlp\") pod \"frr-k8s-phdnx\" (UID: \"545c9b85-f531-4665-ba7c-8997de325d62\") " pod="metallb-system/frr-k8s-phdnx" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.896629 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrvbf\" (UniqueName: \"kubernetes.io/projected/d4c4cac2-1e41-4504-8620-7ccda1212854-kube-api-access-qrvbf\") pod \"frr-k8s-webhook-server-64bf5d555-rmj6d\" (UID: \"d4c4cac2-1e41-4504-8620-7ccda1212854\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-rmj6d" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.896650 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/545c9b85-f531-4665-ba7c-8997de325d62-metrics\") pod \"frr-k8s-phdnx\" (UID: \"545c9b85-f531-4665-ba7c-8997de325d62\") " pod="metallb-system/frr-k8s-phdnx" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.896683 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/545c9b85-f531-4665-ba7c-8997de325d62-frr-startup\") pod \"frr-k8s-phdnx\" (UID: \"545c9b85-f531-4665-ba7c-8997de325d62\") " pod="metallb-system/frr-k8s-phdnx" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.896709 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/545c9b85-f531-4665-ba7c-8997de325d62-metrics-certs\") pod \"frr-k8s-phdnx\" (UID: \"545c9b85-f531-4665-ba7c-8997de325d62\") " pod="metallb-system/frr-k8s-phdnx" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.896747 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/545c9b85-f531-4665-ba7c-8997de325d62-reloader\") pod \"frr-k8s-phdnx\" (UID: \"545c9b85-f531-4665-ba7c-8997de325d62\") " pod="metallb-system/frr-k8s-phdnx" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.896779 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/545c9b85-f531-4665-ba7c-8997de325d62-frr-sockets\") pod \"frr-k8s-phdnx\" (UID: \"545c9b85-f531-4665-ba7c-8997de325d62\") " pod="metallb-system/frr-k8s-phdnx" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.896971 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/545c9b85-f531-4665-ba7c-8997de325d62-frr-conf\") pod \"frr-k8s-phdnx\" (UID: \"545c9b85-f531-4665-ba7c-8997de325d62\") " pod="metallb-system/frr-k8s-phdnx" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.897075 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4c4cac2-1e41-4504-8620-7ccda1212854-cert\") pod \"frr-k8s-webhook-server-64bf5d555-rmj6d\" (UID: \"d4c4cac2-1e41-4504-8620-7ccda1212854\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-rmj6d" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.938028 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-kn5nb"] Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.939209 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-kn5nb" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.942695 4739 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.942966 4739 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.943020 4739 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-kd7ks" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.943339 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.959842 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-ps6c2"] Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.961020 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-ps6c2" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.965232 4739 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.969911 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-ps6c2"] Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.998227 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/545c9b85-f531-4665-ba7c-8997de325d62-reloader\") pod \"frr-k8s-phdnx\" (UID: \"545c9b85-f531-4665-ba7c-8997de325d62\") " pod="metallb-system/frr-k8s-phdnx" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.998329 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/545c9b85-f531-4665-ba7c-8997de325d62-frr-sockets\") pod \"frr-k8s-phdnx\" (UID: \"545c9b85-f531-4665-ba7c-8997de325d62\") " pod="metallb-system/frr-k8s-phdnx" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.998368 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/714c1b10-3e7c-4a8a-a346-8e37f9f476e6-cert\") pod \"controller-68d546b9d8-ps6c2\" (UID: \"714c1b10-3e7c-4a8a-a346-8e37f9f476e6\") " pod="metallb-system/controller-68d546b9d8-ps6c2" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.998395 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/545c9b85-f531-4665-ba7c-8997de325d62-frr-conf\") pod \"frr-k8s-phdnx\" (UID: \"545c9b85-f531-4665-ba7c-8997de325d62\") " pod="metallb-system/frr-k8s-phdnx" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.998421 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w72mx\" (UniqueName: \"kubernetes.io/projected/714c1b10-3e7c-4a8a-a346-8e37f9f476e6-kube-api-access-w72mx\") pod \"controller-68d546b9d8-ps6c2\" (UID: \"714c1b10-3e7c-4a8a-a346-8e37f9f476e6\") " pod="metallb-system/controller-68d546b9d8-ps6c2" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.998444 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/714c1b10-3e7c-4a8a-a346-8e37f9f476e6-metrics-certs\") pod \"controller-68d546b9d8-ps6c2\" (UID: \"714c1b10-3e7c-4a8a-a346-8e37f9f476e6\") " pod="metallb-system/controller-68d546b9d8-ps6c2" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.998470 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4c4cac2-1e41-4504-8620-7ccda1212854-cert\") pod \"frr-k8s-webhook-server-64bf5d555-rmj6d\" (UID: \"d4c4cac2-1e41-4504-8620-7ccda1212854\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-rmj6d" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.998492 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6f0c8acb-ceae-4aea-861e-396755963f03-metallb-excludel2\") pod \"speaker-kn5nb\" (UID: \"6f0c8acb-ceae-4aea-861e-396755963f03\") " pod="metallb-system/speaker-kn5nb" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.998530 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6f0c8acb-ceae-4aea-861e-396755963f03-memberlist\") pod \"speaker-kn5nb\" (UID: \"6f0c8acb-ceae-4aea-861e-396755963f03\") " pod="metallb-system/speaker-kn5nb" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.998584 4739 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4zqlp\" (UniqueName: \"kubernetes.io/projected/545c9b85-f531-4665-ba7c-8997de325d62-kube-api-access-4zqlp\") pod \"frr-k8s-phdnx\" (UID: \"545c9b85-f531-4665-ba7c-8997de325d62\") " pod="metallb-system/frr-k8s-phdnx" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.998607 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrvbf\" (UniqueName: \"kubernetes.io/projected/d4c4cac2-1e41-4504-8620-7ccda1212854-kube-api-access-qrvbf\") pod \"frr-k8s-webhook-server-64bf5d555-rmj6d\" (UID: \"d4c4cac2-1e41-4504-8620-7ccda1212854\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-rmj6d" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.998628 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/545c9b85-f531-4665-ba7c-8997de325d62-metrics\") pod \"frr-k8s-phdnx\" (UID: \"545c9b85-f531-4665-ba7c-8997de325d62\") " pod="metallb-system/frr-k8s-phdnx" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.998650 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f0c8acb-ceae-4aea-861e-396755963f03-metrics-certs\") pod \"speaker-kn5nb\" (UID: \"6f0c8acb-ceae-4aea-861e-396755963f03\") " pod="metallb-system/speaker-kn5nb" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.998678 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/545c9b85-f531-4665-ba7c-8997de325d62-frr-startup\") pod \"frr-k8s-phdnx\" (UID: \"545c9b85-f531-4665-ba7c-8997de325d62\") " pod="metallb-system/frr-k8s-phdnx" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.998703 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/545c9b85-f531-4665-ba7c-8997de325d62-metrics-certs\") pod \"frr-k8s-phdnx\" (UID: \"545c9b85-f531-4665-ba7c-8997de325d62\") " pod="metallb-system/frr-k8s-phdnx" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.998730 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl9wj\" (UniqueName: \"kubernetes.io/projected/6f0c8acb-ceae-4aea-861e-396755963f03-kube-api-access-pl9wj\") pod \"speaker-kn5nb\" (UID: \"6f0c8acb-ceae-4aea-861e-396755963f03\") " pod="metallb-system/speaker-kn5nb" Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.999661 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/545c9b85-f531-4665-ba7c-8997de325d62-reloader\") pod \"frr-k8s-phdnx\" (UID: \"545c9b85-f531-4665-ba7c-8997de325d62\") " pod="metallb-system/frr-k8s-phdnx" Oct 08 22:02:57 crc kubenswrapper[4739]: E1008 22:02:57.999754 4739 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Oct 08 22:02:57 crc kubenswrapper[4739]: E1008 22:02:57.999871 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4c4cac2-1e41-4504-8620-7ccda1212854-cert podName:d4c4cac2-1e41-4504-8620-7ccda1212854 nodeName:}" failed. No retries permitted until 2025-10-08 22:02:58.499848516 +0000 UTC m=+878.325234346 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d4c4cac2-1e41-4504-8620-7ccda1212854-cert") pod "frr-k8s-webhook-server-64bf5d555-rmj6d" (UID: "d4c4cac2-1e41-4504-8620-7ccda1212854") : secret "frr-k8s-webhook-server-cert" not found Oct 08 22:02:57 crc kubenswrapper[4739]: I1008 22:02:57.999812 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/545c9b85-f531-4665-ba7c-8997de325d62-frr-sockets\") pod \"frr-k8s-phdnx\" (UID: \"545c9b85-f531-4665-ba7c-8997de325d62\") " pod="metallb-system/frr-k8s-phdnx" Oct 08 22:02:58 crc kubenswrapper[4739]: I1008 22:02:57.999983 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/545c9b85-f531-4665-ba7c-8997de325d62-metrics\") pod \"frr-k8s-phdnx\" (UID: \"545c9b85-f531-4665-ba7c-8997de325d62\") " pod="metallb-system/frr-k8s-phdnx" Oct 08 22:02:58 crc kubenswrapper[4739]: I1008 22:02:57.999767 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/545c9b85-f531-4665-ba7c-8997de325d62-frr-conf\") pod \"frr-k8s-phdnx\" (UID: \"545c9b85-f531-4665-ba7c-8997de325d62\") " pod="metallb-system/frr-k8s-phdnx" Oct 08 22:02:58 crc kubenswrapper[4739]: I1008 22:02:58.000937 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/545c9b85-f531-4665-ba7c-8997de325d62-frr-startup\") pod \"frr-k8s-phdnx\" (UID: \"545c9b85-f531-4665-ba7c-8997de325d62\") " pod="metallb-system/frr-k8s-phdnx" Oct 08 22:02:58 crc kubenswrapper[4739]: I1008 22:02:58.005216 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/545c9b85-f531-4665-ba7c-8997de325d62-metrics-certs\") pod \"frr-k8s-phdnx\" (UID: \"545c9b85-f531-4665-ba7c-8997de325d62\") " 
pod="metallb-system/frr-k8s-phdnx" Oct 08 22:02:58 crc kubenswrapper[4739]: I1008 22:02:58.017290 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrvbf\" (UniqueName: \"kubernetes.io/projected/d4c4cac2-1e41-4504-8620-7ccda1212854-kube-api-access-qrvbf\") pod \"frr-k8s-webhook-server-64bf5d555-rmj6d\" (UID: \"d4c4cac2-1e41-4504-8620-7ccda1212854\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-rmj6d" Oct 08 22:02:58 crc kubenswrapper[4739]: I1008 22:02:58.018407 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zqlp\" (UniqueName: \"kubernetes.io/projected/545c9b85-f531-4665-ba7c-8997de325d62-kube-api-access-4zqlp\") pod \"frr-k8s-phdnx\" (UID: \"545c9b85-f531-4665-ba7c-8997de325d62\") " pod="metallb-system/frr-k8s-phdnx" Oct 08 22:02:58 crc kubenswrapper[4739]: I1008 22:02:58.099516 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/714c1b10-3e7c-4a8a-a346-8e37f9f476e6-cert\") pod \"controller-68d546b9d8-ps6c2\" (UID: \"714c1b10-3e7c-4a8a-a346-8e37f9f476e6\") " pod="metallb-system/controller-68d546b9d8-ps6c2" Oct 08 22:02:58 crc kubenswrapper[4739]: I1008 22:02:58.099568 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w72mx\" (UniqueName: \"kubernetes.io/projected/714c1b10-3e7c-4a8a-a346-8e37f9f476e6-kube-api-access-w72mx\") pod \"controller-68d546b9d8-ps6c2\" (UID: \"714c1b10-3e7c-4a8a-a346-8e37f9f476e6\") " pod="metallb-system/controller-68d546b9d8-ps6c2" Oct 08 22:02:58 crc kubenswrapper[4739]: I1008 22:02:58.099588 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/714c1b10-3e7c-4a8a-a346-8e37f9f476e6-metrics-certs\") pod \"controller-68d546b9d8-ps6c2\" (UID: \"714c1b10-3e7c-4a8a-a346-8e37f9f476e6\") " pod="metallb-system/controller-68d546b9d8-ps6c2" Oct 08 
22:02:58 crc kubenswrapper[4739]: I1008 22:02:58.099619 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6f0c8acb-ceae-4aea-861e-396755963f03-metallb-excludel2\") pod \"speaker-kn5nb\" (UID: \"6f0c8acb-ceae-4aea-861e-396755963f03\") " pod="metallb-system/speaker-kn5nb" Oct 08 22:02:58 crc kubenswrapper[4739]: I1008 22:02:58.099654 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6f0c8acb-ceae-4aea-861e-396755963f03-memberlist\") pod \"speaker-kn5nb\" (UID: \"6f0c8acb-ceae-4aea-861e-396755963f03\") " pod="metallb-system/speaker-kn5nb" Oct 08 22:02:58 crc kubenswrapper[4739]: I1008 22:02:58.099704 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f0c8acb-ceae-4aea-861e-396755963f03-metrics-certs\") pod \"speaker-kn5nb\" (UID: \"6f0c8acb-ceae-4aea-861e-396755963f03\") " pod="metallb-system/speaker-kn5nb" Oct 08 22:02:58 crc kubenswrapper[4739]: I1008 22:02:58.099734 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl9wj\" (UniqueName: \"kubernetes.io/projected/6f0c8acb-ceae-4aea-861e-396755963f03-kube-api-access-pl9wj\") pod \"speaker-kn5nb\" (UID: \"6f0c8acb-ceae-4aea-861e-396755963f03\") " pod="metallb-system/speaker-kn5nb" Oct 08 22:02:58 crc kubenswrapper[4739]: E1008 22:02:58.099879 4739 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Oct 08 22:02:58 crc kubenswrapper[4739]: E1008 22:02:58.099942 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f0c8acb-ceae-4aea-861e-396755963f03-metrics-certs podName:6f0c8acb-ceae-4aea-861e-396755963f03 nodeName:}" failed. 
No retries permitted until 2025-10-08 22:02:58.599923819 +0000 UTC m=+878.425309579 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6f0c8acb-ceae-4aea-861e-396755963f03-metrics-certs") pod "speaker-kn5nb" (UID: "6f0c8acb-ceae-4aea-861e-396755963f03") : secret "speaker-certs-secret" not found Oct 08 22:02:58 crc kubenswrapper[4739]: E1008 22:02:58.100293 4739 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 08 22:02:58 crc kubenswrapper[4739]: E1008 22:02:58.100350 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f0c8acb-ceae-4aea-861e-396755963f03-memberlist podName:6f0c8acb-ceae-4aea-861e-396755963f03 nodeName:}" failed. No retries permitted until 2025-10-08 22:02:58.600338819 +0000 UTC m=+878.425724589 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6f0c8acb-ceae-4aea-861e-396755963f03-memberlist") pod "speaker-kn5nb" (UID: "6f0c8acb-ceae-4aea-861e-396755963f03") : secret "metallb-memberlist" not found Oct 08 22:02:58 crc kubenswrapper[4739]: I1008 22:02:58.100562 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6f0c8acb-ceae-4aea-861e-396755963f03-metallb-excludel2\") pod \"speaker-kn5nb\" (UID: \"6f0c8acb-ceae-4aea-861e-396755963f03\") " pod="metallb-system/speaker-kn5nb" Oct 08 22:02:58 crc kubenswrapper[4739]: I1008 22:02:58.101107 4739 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 08 22:02:58 crc kubenswrapper[4739]: I1008 22:02:58.104077 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/714c1b10-3e7c-4a8a-a346-8e37f9f476e6-metrics-certs\") pod \"controller-68d546b9d8-ps6c2\" (UID: 
\"714c1b10-3e7c-4a8a-a346-8e37f9f476e6\") " pod="metallb-system/controller-68d546b9d8-ps6c2" Oct 08 22:02:58 crc kubenswrapper[4739]: I1008 22:02:58.116815 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/714c1b10-3e7c-4a8a-a346-8e37f9f476e6-cert\") pod \"controller-68d546b9d8-ps6c2\" (UID: \"714c1b10-3e7c-4a8a-a346-8e37f9f476e6\") " pod="metallb-system/controller-68d546b9d8-ps6c2" Oct 08 22:02:58 crc kubenswrapper[4739]: I1008 22:02:58.123279 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w72mx\" (UniqueName: \"kubernetes.io/projected/714c1b10-3e7c-4a8a-a346-8e37f9f476e6-kube-api-access-w72mx\") pod \"controller-68d546b9d8-ps6c2\" (UID: \"714c1b10-3e7c-4a8a-a346-8e37f9f476e6\") " pod="metallb-system/controller-68d546b9d8-ps6c2" Oct 08 22:02:58 crc kubenswrapper[4739]: I1008 22:02:58.124805 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl9wj\" (UniqueName: \"kubernetes.io/projected/6f0c8acb-ceae-4aea-861e-396755963f03-kube-api-access-pl9wj\") pod \"speaker-kn5nb\" (UID: \"6f0c8acb-ceae-4aea-861e-396755963f03\") " pod="metallb-system/speaker-kn5nb" Oct 08 22:02:58 crc kubenswrapper[4739]: I1008 22:02:58.178397 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-phdnx" Oct 08 22:02:58 crc kubenswrapper[4739]: I1008 22:02:58.290041 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-ps6c2" Oct 08 22:02:58 crc kubenswrapper[4739]: I1008 22:02:58.507521 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4c4cac2-1e41-4504-8620-7ccda1212854-cert\") pod \"frr-k8s-webhook-server-64bf5d555-rmj6d\" (UID: \"d4c4cac2-1e41-4504-8620-7ccda1212854\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-rmj6d" Oct 08 22:02:58 crc kubenswrapper[4739]: I1008 22:02:58.512338 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4c4cac2-1e41-4504-8620-7ccda1212854-cert\") pod \"frr-k8s-webhook-server-64bf5d555-rmj6d\" (UID: \"d4c4cac2-1e41-4504-8620-7ccda1212854\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-rmj6d" Oct 08 22:02:58 crc kubenswrapper[4739]: I1008 22:02:58.609311 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6f0c8acb-ceae-4aea-861e-396755963f03-memberlist\") pod \"speaker-kn5nb\" (UID: \"6f0c8acb-ceae-4aea-861e-396755963f03\") " pod="metallb-system/speaker-kn5nb" Oct 08 22:02:58 crc kubenswrapper[4739]: I1008 22:02:58.609395 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f0c8acb-ceae-4aea-861e-396755963f03-metrics-certs\") pod \"speaker-kn5nb\" (UID: \"6f0c8acb-ceae-4aea-861e-396755963f03\") " pod="metallb-system/speaker-kn5nb" Oct 08 22:02:58 crc kubenswrapper[4739]: E1008 22:02:58.609610 4739 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 08 22:02:58 crc kubenswrapper[4739]: E1008 22:02:58.609656 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f0c8acb-ceae-4aea-861e-396755963f03-memberlist podName:6f0c8acb-ceae-4aea-861e-396755963f03 nodeName:}" failed. 
No retries permitted until 2025-10-08 22:02:59.609642197 +0000 UTC m=+879.435027947 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6f0c8acb-ceae-4aea-861e-396755963f03-memberlist") pod "speaker-kn5nb" (UID: "6f0c8acb-ceae-4aea-861e-396755963f03") : secret "metallb-memberlist" not found Oct 08 22:02:58 crc kubenswrapper[4739]: I1008 22:02:58.614663 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f0c8acb-ceae-4aea-861e-396755963f03-metrics-certs\") pod \"speaker-kn5nb\" (UID: \"6f0c8acb-ceae-4aea-861e-396755963f03\") " pod="metallb-system/speaker-kn5nb" Oct 08 22:02:58 crc kubenswrapper[4739]: I1008 22:02:58.739089 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-ps6c2"] Oct 08 22:02:58 crc kubenswrapper[4739]: W1008 22:02:58.752258 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod714c1b10_3e7c_4a8a_a346_8e37f9f476e6.slice/crio-c3f18c22be1cc7e9da509c5ec08fa31de1d32540c5f2aab80294df2e1bbd3fc3 WatchSource:0}: Error finding container c3f18c22be1cc7e9da509c5ec08fa31de1d32540c5f2aab80294df2e1bbd3fc3: Status 404 returned error can't find the container with id c3f18c22be1cc7e9da509c5ec08fa31de1d32540c5f2aab80294df2e1bbd3fc3 Oct 08 22:02:58 crc kubenswrapper[4739]: I1008 22:02:58.762123 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vhpw" event={"ID":"b8b2c098-ed87-4e2c-bf6f-bc95a8228f70","Type":"ContainerStarted","Data":"ea0fcd6f0954a5bbc485f6a19d877a846a1e3660f97706a794e250547634f73e"} Oct 08 22:02:58 crc kubenswrapper[4739]: I1008 22:02:58.780199 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-phdnx" 
event={"ID":"545c9b85-f531-4665-ba7c-8997de325d62","Type":"ContainerStarted","Data":"5c568313ca697aa84ad6445a556e6eee7cc2979ea5aa64b6e8d9ce63c48fc7a2"} Oct 08 22:02:58 crc kubenswrapper[4739]: I1008 22:02:58.791196 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-rmj6d" Oct 08 22:02:59 crc kubenswrapper[4739]: I1008 22:02:59.196092 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-rmj6d"] Oct 08 22:02:59 crc kubenswrapper[4739]: I1008 22:02:59.620606 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6f0c8acb-ceae-4aea-861e-396755963f03-memberlist\") pod \"speaker-kn5nb\" (UID: \"6f0c8acb-ceae-4aea-861e-396755963f03\") " pod="metallb-system/speaker-kn5nb" Oct 08 22:02:59 crc kubenswrapper[4739]: I1008 22:02:59.628886 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6f0c8acb-ceae-4aea-861e-396755963f03-memberlist\") pod \"speaker-kn5nb\" (UID: \"6f0c8acb-ceae-4aea-861e-396755963f03\") " pod="metallb-system/speaker-kn5nb" Oct 08 22:02:59 crc kubenswrapper[4739]: I1008 22:02:59.759489 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-kn5nb" Oct 08 22:02:59 crc kubenswrapper[4739]: I1008 22:02:59.787339 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-ps6c2" event={"ID":"714c1b10-3e7c-4a8a-a346-8e37f9f476e6","Type":"ContainerStarted","Data":"2ba27e665f0fb31bd482725ce6edba7ade0c7615df305b5144e40ee48476c695"} Oct 08 22:02:59 crc kubenswrapper[4739]: I1008 22:02:59.787382 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-ps6c2" event={"ID":"714c1b10-3e7c-4a8a-a346-8e37f9f476e6","Type":"ContainerStarted","Data":"c9469a98867de04d7e73aa7c1cf1f39c5d358b4318059355fa449d006458dce3"} Oct 08 22:02:59 crc kubenswrapper[4739]: I1008 22:02:59.787396 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-ps6c2" event={"ID":"714c1b10-3e7c-4a8a-a346-8e37f9f476e6","Type":"ContainerStarted","Data":"c3f18c22be1cc7e9da509c5ec08fa31de1d32540c5f2aab80294df2e1bbd3fc3"} Oct 08 22:02:59 crc kubenswrapper[4739]: I1008 22:02:59.787435 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-ps6c2" Oct 08 22:02:59 crc kubenswrapper[4739]: I1008 22:02:59.788553 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-rmj6d" event={"ID":"d4c4cac2-1e41-4504-8620-7ccda1212854","Type":"ContainerStarted","Data":"3d2488fc1450bb54c79ede0a96b5d244c3f4792cbb46ee89a1c5d6c2ebb7d4ca"} Oct 08 22:02:59 crc kubenswrapper[4739]: I1008 22:02:59.792619 4739 generic.go:334] "Generic (PLEG): container finished" podID="b8b2c098-ed87-4e2c-bf6f-bc95a8228f70" containerID="ea0fcd6f0954a5bbc485f6a19d877a846a1e3660f97706a794e250547634f73e" exitCode=0 Oct 08 22:02:59 crc kubenswrapper[4739]: I1008 22:02:59.792670 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vhpw" 
event={"ID":"b8b2c098-ed87-4e2c-bf6f-bc95a8228f70","Type":"ContainerDied","Data":"ea0fcd6f0954a5bbc485f6a19d877a846a1e3660f97706a794e250547634f73e"} Oct 08 22:02:59 crc kubenswrapper[4739]: I1008 22:02:59.794949 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-kn5nb" event={"ID":"6f0c8acb-ceae-4aea-861e-396755963f03","Type":"ContainerStarted","Data":"0daf2f208714c0418c08cb348fdffdf81aa276bcbde0c215d634eba549b924a5"} Oct 08 22:02:59 crc kubenswrapper[4739]: I1008 22:02:59.815336 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-ps6c2" podStartSLOduration=2.8153090069999998 podStartE2EDuration="2.815309007s" podCreationTimestamp="2025-10-08 22:02:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:02:59.808711615 +0000 UTC m=+879.634097395" watchObservedRunningTime="2025-10-08 22:02:59.815309007 +0000 UTC m=+879.640694787" Oct 08 22:03:00 crc kubenswrapper[4739]: I1008 22:03:00.809010 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vhpw" event={"ID":"b8b2c098-ed87-4e2c-bf6f-bc95a8228f70","Type":"ContainerStarted","Data":"964628a28b70b0a3553f0aecc9ffc264fdcc0a62bad4f6345700f843be9c5ec0"} Oct 08 22:03:00 crc kubenswrapper[4739]: I1008 22:03:00.814804 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-kn5nb" event={"ID":"6f0c8acb-ceae-4aea-861e-396755963f03","Type":"ContainerStarted","Data":"aee1138c9d929245d8c0028055a67b798310f4ed65a034cdff75b0e7c4485eea"} Oct 08 22:03:00 crc kubenswrapper[4739]: I1008 22:03:00.814846 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-kn5nb" event={"ID":"6f0c8acb-ceae-4aea-861e-396755963f03","Type":"ContainerStarted","Data":"d2f5b5efbecff551c07563f9c9fb0085ea319b72d452f22b7db3321729808136"} Oct 08 22:03:00 crc 
kubenswrapper[4739]: I1008 22:03:00.830974 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8vhpw" podStartSLOduration=3.178355699 podStartE2EDuration="5.830960641s" podCreationTimestamp="2025-10-08 22:02:55 +0000 UTC" firstStartedPulling="2025-10-08 22:02:57.749197555 +0000 UTC m=+877.574583325" lastFinishedPulling="2025-10-08 22:03:00.401802517 +0000 UTC m=+880.227188267" observedRunningTime="2025-10-08 22:03:00.827903187 +0000 UTC m=+880.653288937" watchObservedRunningTime="2025-10-08 22:03:00.830960641 +0000 UTC m=+880.656346391" Oct 08 22:03:00 crc kubenswrapper[4739]: I1008 22:03:00.847826 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-kn5nb" podStartSLOduration=3.847809144 podStartE2EDuration="3.847809144s" podCreationTimestamp="2025-10-08 22:02:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:03:00.843783795 +0000 UTC m=+880.669169565" watchObservedRunningTime="2025-10-08 22:03:00.847809144 +0000 UTC m=+880.673194894" Oct 08 22:03:00 crc kubenswrapper[4739]: I1008 22:03:00.969623 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hpgd5" Oct 08 22:03:00 crc kubenswrapper[4739]: I1008 22:03:00.970422 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hpgd5" Oct 08 22:03:01 crc kubenswrapper[4739]: I1008 22:03:01.029791 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hpgd5" Oct 08 22:03:01 crc kubenswrapper[4739]: I1008 22:03:01.831127 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-kn5nb" Oct 08 22:03:01 crc kubenswrapper[4739]: I1008 22:03:01.876846 4739 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hpgd5" Oct 08 22:03:03 crc kubenswrapper[4739]: I1008 22:03:03.193359 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hpgd5"] Oct 08 22:03:04 crc kubenswrapper[4739]: I1008 22:03:04.840120 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hpgd5" podUID="fbd6ae49-2147-4189-b8de-dc3a8e24e297" containerName="registry-server" containerID="cri-o://f4f0f8c45001fb3e9b45f2a3fd8904b0ad3d6f91a95223cfd1862aa74c535f00" gracePeriod=2 Oct 08 22:03:05 crc kubenswrapper[4739]: I1008 22:03:05.846619 4739 generic.go:334] "Generic (PLEG): container finished" podID="fbd6ae49-2147-4189-b8de-dc3a8e24e297" containerID="f4f0f8c45001fb3e9b45f2a3fd8904b0ad3d6f91a95223cfd1862aa74c535f00" exitCode=0 Oct 08 22:03:05 crc kubenswrapper[4739]: I1008 22:03:05.846659 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hpgd5" event={"ID":"fbd6ae49-2147-4189-b8de-dc3a8e24e297","Type":"ContainerDied","Data":"f4f0f8c45001fb3e9b45f2a3fd8904b0ad3d6f91a95223cfd1862aa74c535f00"} Oct 08 22:03:06 crc kubenswrapper[4739]: I1008 22:03:06.372702 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8vhpw" Oct 08 22:03:06 crc kubenswrapper[4739]: I1008 22:03:06.372752 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8vhpw" Oct 08 22:03:06 crc kubenswrapper[4739]: I1008 22:03:06.414139 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8vhpw" Oct 08 22:03:06 crc kubenswrapper[4739]: I1008 22:03:06.919013 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8vhpw" Oct 08 22:03:07 crc 
kubenswrapper[4739]: I1008 22:03:07.791769 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8vhpw"] Oct 08 22:03:07 crc kubenswrapper[4739]: I1008 22:03:07.840874 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hpgd5" Oct 08 22:03:07 crc kubenswrapper[4739]: I1008 22:03:07.861243 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hpgd5" event={"ID":"fbd6ae49-2147-4189-b8de-dc3a8e24e297","Type":"ContainerDied","Data":"2a400c8c87ecf5cf4a3f6ddbbacd3fe33dcf1f45602ae743bd5e27b9062533b2"} Oct 08 22:03:07 crc kubenswrapper[4739]: I1008 22:03:07.861288 4739 scope.go:117] "RemoveContainer" containerID="f4f0f8c45001fb3e9b45f2a3fd8904b0ad3d6f91a95223cfd1862aa74c535f00" Oct 08 22:03:07 crc kubenswrapper[4739]: I1008 22:03:07.861331 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hpgd5" Oct 08 22:03:07 crc kubenswrapper[4739]: I1008 22:03:07.911563 4739 scope.go:117] "RemoveContainer" containerID="9bf22bb7974431388421a708b722a463b152602e6debecfaaf6a9608bd5b9fce" Oct 08 22:03:07 crc kubenswrapper[4739]: I1008 22:03:07.941037 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbd6ae49-2147-4189-b8de-dc3a8e24e297-catalog-content\") pod \"fbd6ae49-2147-4189-b8de-dc3a8e24e297\" (UID: \"fbd6ae49-2147-4189-b8de-dc3a8e24e297\") " Oct 08 22:03:07 crc kubenswrapper[4739]: I1008 22:03:07.941153 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kjct\" (UniqueName: \"kubernetes.io/projected/fbd6ae49-2147-4189-b8de-dc3a8e24e297-kube-api-access-4kjct\") pod \"fbd6ae49-2147-4189-b8de-dc3a8e24e297\" (UID: \"fbd6ae49-2147-4189-b8de-dc3a8e24e297\") " Oct 08 22:03:07 crc kubenswrapper[4739]: I1008 
22:03:07.941234 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbd6ae49-2147-4189-b8de-dc3a8e24e297-utilities\") pod \"fbd6ae49-2147-4189-b8de-dc3a8e24e297\" (UID: \"fbd6ae49-2147-4189-b8de-dc3a8e24e297\") " Oct 08 22:03:07 crc kubenswrapper[4739]: I1008 22:03:07.944400 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbd6ae49-2147-4189-b8de-dc3a8e24e297-utilities" (OuterVolumeSpecName: "utilities") pod "fbd6ae49-2147-4189-b8de-dc3a8e24e297" (UID: "fbd6ae49-2147-4189-b8de-dc3a8e24e297"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:03:07 crc kubenswrapper[4739]: I1008 22:03:07.949831 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbd6ae49-2147-4189-b8de-dc3a8e24e297-kube-api-access-4kjct" (OuterVolumeSpecName: "kube-api-access-4kjct") pod "fbd6ae49-2147-4189-b8de-dc3a8e24e297" (UID: "fbd6ae49-2147-4189-b8de-dc3a8e24e297"). InnerVolumeSpecName "kube-api-access-4kjct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:03:07 crc kubenswrapper[4739]: I1008 22:03:07.954018 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbd6ae49-2147-4189-b8de-dc3a8e24e297-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fbd6ae49-2147-4189-b8de-dc3a8e24e297" (UID: "fbd6ae49-2147-4189-b8de-dc3a8e24e297"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:03:07 crc kubenswrapper[4739]: I1008 22:03:07.963800 4739 scope.go:117] "RemoveContainer" containerID="eafb58f9167dc3c59d0ec93380a1991f32b9bdd5b8e9e87a1d4329db465198cb" Oct 08 22:03:08 crc kubenswrapper[4739]: I1008 22:03:08.042330 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbd6ae49-2147-4189-b8de-dc3a8e24e297-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:03:08 crc kubenswrapper[4739]: I1008 22:03:08.042359 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbd6ae49-2147-4189-b8de-dc3a8e24e297-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:03:08 crc kubenswrapper[4739]: I1008 22:03:08.042369 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kjct\" (UniqueName: \"kubernetes.io/projected/fbd6ae49-2147-4189-b8de-dc3a8e24e297-kube-api-access-4kjct\") on node \"crc\" DevicePath \"\"" Oct 08 22:03:08 crc kubenswrapper[4739]: I1008 22:03:08.195341 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hpgd5"] Oct 08 22:03:08 crc kubenswrapper[4739]: I1008 22:03:08.198326 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hpgd5"] Oct 08 22:03:08 crc kubenswrapper[4739]: I1008 22:03:08.296977 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-ps6c2" Oct 08 22:03:08 crc kubenswrapper[4739]: I1008 22:03:08.869624 4739 generic.go:334] "Generic (PLEG): container finished" podID="545c9b85-f531-4665-ba7c-8997de325d62" containerID="a65b5b930d55b15255113a000af7356deeb64446e582a71c086245ddb3cfe16d" exitCode=0 Oct 08 22:03:08 crc kubenswrapper[4739]: I1008 22:03:08.869690 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-phdnx" 
event={"ID":"545c9b85-f531-4665-ba7c-8997de325d62","Type":"ContainerDied","Data":"a65b5b930d55b15255113a000af7356deeb64446e582a71c086245ddb3cfe16d"} Oct 08 22:03:08 crc kubenswrapper[4739]: I1008 22:03:08.874037 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-rmj6d" event={"ID":"d4c4cac2-1e41-4504-8620-7ccda1212854","Type":"ContainerStarted","Data":"b5f238e7ca707639543efead876e78da53fc6719b4fda0fcdecdb5c89bcde034"} Oct 08 22:03:08 crc kubenswrapper[4739]: I1008 22:03:08.874332 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-rmj6d" Oct 08 22:03:08 crc kubenswrapper[4739]: I1008 22:03:08.874314 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8vhpw" podUID="b8b2c098-ed87-4e2c-bf6f-bc95a8228f70" containerName="registry-server" containerID="cri-o://964628a28b70b0a3553f0aecc9ffc264fdcc0a62bad4f6345700f843be9c5ec0" gracePeriod=2 Oct 08 22:03:08 crc kubenswrapper[4739]: I1008 22:03:08.927227 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-rmj6d" podStartSLOduration=3.169975168 podStartE2EDuration="11.927208038s" podCreationTimestamp="2025-10-08 22:02:57 +0000 UTC" firstStartedPulling="2025-10-08 22:02:59.206555532 +0000 UTC m=+879.031941282" lastFinishedPulling="2025-10-08 22:03:07.963788402 +0000 UTC m=+887.789174152" observedRunningTime="2025-10-08 22:03:08.925356162 +0000 UTC m=+888.750741922" watchObservedRunningTime="2025-10-08 22:03:08.927208038 +0000 UTC m=+888.752593798" Oct 08 22:03:09 crc kubenswrapper[4739]: I1008 22:03:09.831537 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbd6ae49-2147-4189-b8de-dc3a8e24e297" path="/var/lib/kubelet/pods/fbd6ae49-2147-4189-b8de-dc3a8e24e297/volumes" Oct 08 22:03:09 crc kubenswrapper[4739]: I1008 22:03:09.835403 4739 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8vhpw" Oct 08 22:03:09 crc kubenswrapper[4739]: I1008 22:03:09.883786 4739 generic.go:334] "Generic (PLEG): container finished" podID="b8b2c098-ed87-4e2c-bf6f-bc95a8228f70" containerID="964628a28b70b0a3553f0aecc9ffc264fdcc0a62bad4f6345700f843be9c5ec0" exitCode=0 Oct 08 22:03:09 crc kubenswrapper[4739]: I1008 22:03:09.883853 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vhpw" event={"ID":"b8b2c098-ed87-4e2c-bf6f-bc95a8228f70","Type":"ContainerDied","Data":"964628a28b70b0a3553f0aecc9ffc264fdcc0a62bad4f6345700f843be9c5ec0"} Oct 08 22:03:09 crc kubenswrapper[4739]: I1008 22:03:09.883884 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vhpw" event={"ID":"b8b2c098-ed87-4e2c-bf6f-bc95a8228f70","Type":"ContainerDied","Data":"b10fdb93da42ec2f7587935b8e76fc59b0f2177218c996f7b5ed4a2982cb7f0c"} Oct 08 22:03:09 crc kubenswrapper[4739]: I1008 22:03:09.883907 4739 scope.go:117] "RemoveContainer" containerID="964628a28b70b0a3553f0aecc9ffc264fdcc0a62bad4f6345700f843be9c5ec0" Oct 08 22:03:09 crc kubenswrapper[4739]: I1008 22:03:09.884054 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8vhpw" Oct 08 22:03:09 crc kubenswrapper[4739]: I1008 22:03:09.887138 4739 generic.go:334] "Generic (PLEG): container finished" podID="545c9b85-f531-4665-ba7c-8997de325d62" containerID="b270024a98fc609ce2cb53cc6ec527ca6e6789a33e5ef66c7064f4a4ec73a719" exitCode=0 Oct 08 22:03:09 crc kubenswrapper[4739]: I1008 22:03:09.887255 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-phdnx" event={"ID":"545c9b85-f531-4665-ba7c-8997de325d62","Type":"ContainerDied","Data":"b270024a98fc609ce2cb53cc6ec527ca6e6789a33e5ef66c7064f4a4ec73a719"} Oct 08 22:03:09 crc kubenswrapper[4739]: I1008 22:03:09.913278 4739 scope.go:117] "RemoveContainer" containerID="ea0fcd6f0954a5bbc485f6a19d877a846a1e3660f97706a794e250547634f73e" Oct 08 22:03:09 crc kubenswrapper[4739]: I1008 22:03:09.940761 4739 scope.go:117] "RemoveContainer" containerID="9e27e864eca043c1e9064692bba8faaec960940150d6df2b8bfdc100daec9323" Oct 08 22:03:09 crc kubenswrapper[4739]: I1008 22:03:09.967807 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8b2c098-ed87-4e2c-bf6f-bc95a8228f70-utilities\") pod \"b8b2c098-ed87-4e2c-bf6f-bc95a8228f70\" (UID: \"b8b2c098-ed87-4e2c-bf6f-bc95a8228f70\") " Oct 08 22:03:09 crc kubenswrapper[4739]: I1008 22:03:09.967851 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpmxl\" (UniqueName: \"kubernetes.io/projected/b8b2c098-ed87-4e2c-bf6f-bc95a8228f70-kube-api-access-mpmxl\") pod \"b8b2c098-ed87-4e2c-bf6f-bc95a8228f70\" (UID: \"b8b2c098-ed87-4e2c-bf6f-bc95a8228f70\") " Oct 08 22:03:09 crc kubenswrapper[4739]: I1008 22:03:09.968395 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8b2c098-ed87-4e2c-bf6f-bc95a8228f70-catalog-content\") pod 
\"b8b2c098-ed87-4e2c-bf6f-bc95a8228f70\" (UID: \"b8b2c098-ed87-4e2c-bf6f-bc95a8228f70\") " Oct 08 22:03:09 crc kubenswrapper[4739]: I1008 22:03:09.968673 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8b2c098-ed87-4e2c-bf6f-bc95a8228f70-utilities" (OuterVolumeSpecName: "utilities") pod "b8b2c098-ed87-4e2c-bf6f-bc95a8228f70" (UID: "b8b2c098-ed87-4e2c-bf6f-bc95a8228f70"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:03:09 crc kubenswrapper[4739]: I1008 22:03:09.972917 4739 scope.go:117] "RemoveContainer" containerID="964628a28b70b0a3553f0aecc9ffc264fdcc0a62bad4f6345700f843be9c5ec0" Oct 08 22:03:09 crc kubenswrapper[4739]: E1008 22:03:09.973513 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"964628a28b70b0a3553f0aecc9ffc264fdcc0a62bad4f6345700f843be9c5ec0\": container with ID starting with 964628a28b70b0a3553f0aecc9ffc264fdcc0a62bad4f6345700f843be9c5ec0 not found: ID does not exist" containerID="964628a28b70b0a3553f0aecc9ffc264fdcc0a62bad4f6345700f843be9c5ec0" Oct 08 22:03:09 crc kubenswrapper[4739]: I1008 22:03:09.973649 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"964628a28b70b0a3553f0aecc9ffc264fdcc0a62bad4f6345700f843be9c5ec0"} err="failed to get container status \"964628a28b70b0a3553f0aecc9ffc264fdcc0a62bad4f6345700f843be9c5ec0\": rpc error: code = NotFound desc = could not find container \"964628a28b70b0a3553f0aecc9ffc264fdcc0a62bad4f6345700f843be9c5ec0\": container with ID starting with 964628a28b70b0a3553f0aecc9ffc264fdcc0a62bad4f6345700f843be9c5ec0 not found: ID does not exist" Oct 08 22:03:09 crc kubenswrapper[4739]: I1008 22:03:09.973769 4739 scope.go:117] "RemoveContainer" containerID="ea0fcd6f0954a5bbc485f6a19d877a846a1e3660f97706a794e250547634f73e" Oct 08 22:03:09 crc kubenswrapper[4739]: E1008 22:03:09.974242 
4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea0fcd6f0954a5bbc485f6a19d877a846a1e3660f97706a794e250547634f73e\": container with ID starting with ea0fcd6f0954a5bbc485f6a19d877a846a1e3660f97706a794e250547634f73e not found: ID does not exist" containerID="ea0fcd6f0954a5bbc485f6a19d877a846a1e3660f97706a794e250547634f73e" Oct 08 22:03:09 crc kubenswrapper[4739]: I1008 22:03:09.974336 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea0fcd6f0954a5bbc485f6a19d877a846a1e3660f97706a794e250547634f73e"} err="failed to get container status \"ea0fcd6f0954a5bbc485f6a19d877a846a1e3660f97706a794e250547634f73e\": rpc error: code = NotFound desc = could not find container \"ea0fcd6f0954a5bbc485f6a19d877a846a1e3660f97706a794e250547634f73e\": container with ID starting with ea0fcd6f0954a5bbc485f6a19d877a846a1e3660f97706a794e250547634f73e not found: ID does not exist" Oct 08 22:03:09 crc kubenswrapper[4739]: I1008 22:03:09.974415 4739 scope.go:117] "RemoveContainer" containerID="9e27e864eca043c1e9064692bba8faaec960940150d6df2b8bfdc100daec9323" Oct 08 22:03:09 crc kubenswrapper[4739]: E1008 22:03:09.974864 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e27e864eca043c1e9064692bba8faaec960940150d6df2b8bfdc100daec9323\": container with ID starting with 9e27e864eca043c1e9064692bba8faaec960940150d6df2b8bfdc100daec9323 not found: ID does not exist" containerID="9e27e864eca043c1e9064692bba8faaec960940150d6df2b8bfdc100daec9323" Oct 08 22:03:09 crc kubenswrapper[4739]: I1008 22:03:09.975101 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e27e864eca043c1e9064692bba8faaec960940150d6df2b8bfdc100daec9323"} err="failed to get container status \"9e27e864eca043c1e9064692bba8faaec960940150d6df2b8bfdc100daec9323\": rpc error: code = 
NotFound desc = could not find container \"9e27e864eca043c1e9064692bba8faaec960940150d6df2b8bfdc100daec9323\": container with ID starting with 9e27e864eca043c1e9064692bba8faaec960940150d6df2b8bfdc100daec9323 not found: ID does not exist" Oct 08 22:03:09 crc kubenswrapper[4739]: I1008 22:03:09.976363 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8b2c098-ed87-4e2c-bf6f-bc95a8228f70-kube-api-access-mpmxl" (OuterVolumeSpecName: "kube-api-access-mpmxl") pod "b8b2c098-ed87-4e2c-bf6f-bc95a8228f70" (UID: "b8b2c098-ed87-4e2c-bf6f-bc95a8228f70"). InnerVolumeSpecName "kube-api-access-mpmxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:03:10 crc kubenswrapper[4739]: I1008 22:03:10.070063 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpmxl\" (UniqueName: \"kubernetes.io/projected/b8b2c098-ed87-4e2c-bf6f-bc95a8228f70-kube-api-access-mpmxl\") on node \"crc\" DevicePath \"\"" Oct 08 22:03:10 crc kubenswrapper[4739]: I1008 22:03:10.070116 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8b2c098-ed87-4e2c-bf6f-bc95a8228f70-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:03:10 crc kubenswrapper[4739]: I1008 22:03:10.897690 4739 generic.go:334] "Generic (PLEG): container finished" podID="545c9b85-f531-4665-ba7c-8997de325d62" containerID="ba9e155f9a55ec2f072ddc956cb619045927d53f798ec6306d8bcb6556c1ca7c" exitCode=0 Oct 08 22:03:10 crc kubenswrapper[4739]: I1008 22:03:10.897741 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-phdnx" event={"ID":"545c9b85-f531-4665-ba7c-8997de325d62","Type":"ContainerDied","Data":"ba9e155f9a55ec2f072ddc956cb619045927d53f798ec6306d8bcb6556c1ca7c"} Oct 08 22:03:11 crc kubenswrapper[4739]: I1008 22:03:11.908225 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-phdnx" 
event={"ID":"545c9b85-f531-4665-ba7c-8997de325d62","Type":"ContainerStarted","Data":"78c4ff3cdd63e5d9e382c414b2dd7668852d3e649441f741ff916dcb152a3656"} Oct 08 22:03:11 crc kubenswrapper[4739]: I1008 22:03:11.912716 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8b2c098-ed87-4e2c-bf6f-bc95a8228f70-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8b2c098-ed87-4e2c-bf6f-bc95a8228f70" (UID: "b8b2c098-ed87-4e2c-bf6f-bc95a8228f70"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:03:11 crc kubenswrapper[4739]: I1008 22:03:11.994835 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8b2c098-ed87-4e2c-bf6f-bc95a8228f70-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:03:12 crc kubenswrapper[4739]: I1008 22:03:12.048337 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8vhpw"] Oct 08 22:03:12 crc kubenswrapper[4739]: I1008 22:03:12.052454 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8vhpw"] Oct 08 22:03:12 crc kubenswrapper[4739]: I1008 22:03:12.919050 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-phdnx" event={"ID":"545c9b85-f531-4665-ba7c-8997de325d62","Type":"ContainerStarted","Data":"ebc4eff8391f02201c14b828fe6dbd8c63f553a46f150d23f1f0b52d45ad4a9d"} Oct 08 22:03:12 crc kubenswrapper[4739]: I1008 22:03:12.919397 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-phdnx" event={"ID":"545c9b85-f531-4665-ba7c-8997de325d62","Type":"ContainerStarted","Data":"960dffdaf79dcff9a2f2e68c20fecbbd8a68fd617f2c46da7990a20b8e174913"} Oct 08 22:03:12 crc kubenswrapper[4739]: I1008 22:03:12.919409 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-phdnx" 
event={"ID":"545c9b85-f531-4665-ba7c-8997de325d62","Type":"ContainerStarted","Data":"e09bf63e4883b63adf23f29b0c081198d9a45e41761c5035d4038824a38ebc5d"} Oct 08 22:03:12 crc kubenswrapper[4739]: I1008 22:03:12.919426 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-phdnx" event={"ID":"545c9b85-f531-4665-ba7c-8997de325d62","Type":"ContainerStarted","Data":"3998beadc1c7ed6703f557723e6f0f0dda9142bf009b639eb252e6d96b2deb09"} Oct 08 22:03:13 crc kubenswrapper[4739]: I1008 22:03:13.829517 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8b2c098-ed87-4e2c-bf6f-bc95a8228f70" path="/var/lib/kubelet/pods/b8b2c098-ed87-4e2c-bf6f-bc95a8228f70/volumes" Oct 08 22:03:14 crc kubenswrapper[4739]: I1008 22:03:14.935235 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-phdnx" event={"ID":"545c9b85-f531-4665-ba7c-8997de325d62","Type":"ContainerStarted","Data":"1f7266d21414fdf67034401d7a92bf92892e10575a8a0e433b4ae9f6f2af1d00"} Oct 08 22:03:14 crc kubenswrapper[4739]: I1008 22:03:14.936194 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-phdnx" Oct 08 22:03:14 crc kubenswrapper[4739]: I1008 22:03:14.972251 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-phdnx" podStartSLOduration=8.379953976 podStartE2EDuration="17.972226706s" podCreationTimestamp="2025-10-08 22:02:57 +0000 UTC" firstStartedPulling="2025-10-08 22:02:58.297064268 +0000 UTC m=+878.122450018" lastFinishedPulling="2025-10-08 22:03:07.889336988 +0000 UTC m=+887.714722748" observedRunningTime="2025-10-08 22:03:14.959884204 +0000 UTC m=+894.785270004" watchObservedRunningTime="2025-10-08 22:03:14.972226706 +0000 UTC m=+894.797612496" Oct 08 22:03:18 crc kubenswrapper[4739]: I1008 22:03:18.178792 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-phdnx" Oct 08 22:03:18 crc 
kubenswrapper[4739]: I1008 22:03:18.213907 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-phdnx" Oct 08 22:03:18 crc kubenswrapper[4739]: I1008 22:03:18.798394 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-rmj6d" Oct 08 22:03:19 crc kubenswrapper[4739]: I1008 22:03:19.765065 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-kn5nb" Oct 08 22:03:22 crc kubenswrapper[4739]: I1008 22:03:22.640681 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-czrgg"] Oct 08 22:03:22 crc kubenswrapper[4739]: E1008 22:03:22.641056 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbd6ae49-2147-4189-b8de-dc3a8e24e297" containerName="extract-content" Oct 08 22:03:22 crc kubenswrapper[4739]: I1008 22:03:22.641077 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbd6ae49-2147-4189-b8de-dc3a8e24e297" containerName="extract-content" Oct 08 22:03:22 crc kubenswrapper[4739]: E1008 22:03:22.641101 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8b2c098-ed87-4e2c-bf6f-bc95a8228f70" containerName="extract-content" Oct 08 22:03:22 crc kubenswrapper[4739]: I1008 22:03:22.641114 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8b2c098-ed87-4e2c-bf6f-bc95a8228f70" containerName="extract-content" Oct 08 22:03:22 crc kubenswrapper[4739]: E1008 22:03:22.641134 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbd6ae49-2147-4189-b8de-dc3a8e24e297" containerName="extract-utilities" Oct 08 22:03:22 crc kubenswrapper[4739]: I1008 22:03:22.646253 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbd6ae49-2147-4189-b8de-dc3a8e24e297" containerName="extract-utilities" Oct 08 22:03:22 crc kubenswrapper[4739]: E1008 22:03:22.646300 4739 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="b8b2c098-ed87-4e2c-bf6f-bc95a8228f70" containerName="extract-utilities" Oct 08 22:03:22 crc kubenswrapper[4739]: I1008 22:03:22.646311 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8b2c098-ed87-4e2c-bf6f-bc95a8228f70" containerName="extract-utilities" Oct 08 22:03:22 crc kubenswrapper[4739]: E1008 22:03:22.646319 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8b2c098-ed87-4e2c-bf6f-bc95a8228f70" containerName="registry-server" Oct 08 22:03:22 crc kubenswrapper[4739]: I1008 22:03:22.646328 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8b2c098-ed87-4e2c-bf6f-bc95a8228f70" containerName="registry-server" Oct 08 22:03:22 crc kubenswrapper[4739]: E1008 22:03:22.646358 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbd6ae49-2147-4189-b8de-dc3a8e24e297" containerName="registry-server" Oct 08 22:03:22 crc kubenswrapper[4739]: I1008 22:03:22.646368 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbd6ae49-2147-4189-b8de-dc3a8e24e297" containerName="registry-server" Oct 08 22:03:22 crc kubenswrapper[4739]: I1008 22:03:22.646582 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbd6ae49-2147-4189-b8de-dc3a8e24e297" containerName="registry-server" Oct 08 22:03:22 crc kubenswrapper[4739]: I1008 22:03:22.646608 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8b2c098-ed87-4e2c-bf6f-bc95a8228f70" containerName="registry-server" Oct 08 22:03:22 crc kubenswrapper[4739]: I1008 22:03:22.647126 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-czrgg" Oct 08 22:03:22 crc kubenswrapper[4739]: I1008 22:03:22.648923 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 08 22:03:22 crc kubenswrapper[4739]: I1008 22:03:22.649764 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 08 22:03:22 crc kubenswrapper[4739]: I1008 22:03:22.650530 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-4btzt" Oct 08 22:03:22 crc kubenswrapper[4739]: I1008 22:03:22.663116 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-czrgg"] Oct 08 22:03:22 crc kubenswrapper[4739]: I1008 22:03:22.715590 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jhzm\" (UniqueName: \"kubernetes.io/projected/3badf60b-1861-41bd-ac37-fd9badc6e785-kube-api-access-2jhzm\") pod \"openstack-operator-index-czrgg\" (UID: \"3badf60b-1861-41bd-ac37-fd9badc6e785\") " pod="openstack-operators/openstack-operator-index-czrgg" Oct 08 22:03:22 crc kubenswrapper[4739]: I1008 22:03:22.816536 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jhzm\" (UniqueName: \"kubernetes.io/projected/3badf60b-1861-41bd-ac37-fd9badc6e785-kube-api-access-2jhzm\") pod \"openstack-operator-index-czrgg\" (UID: \"3badf60b-1861-41bd-ac37-fd9badc6e785\") " pod="openstack-operators/openstack-operator-index-czrgg" Oct 08 22:03:22 crc kubenswrapper[4739]: I1008 22:03:22.832799 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jhzm\" (UniqueName: \"kubernetes.io/projected/3badf60b-1861-41bd-ac37-fd9badc6e785-kube-api-access-2jhzm\") pod \"openstack-operator-index-czrgg\" (UID: 
\"3badf60b-1861-41bd-ac37-fd9badc6e785\") " pod="openstack-operators/openstack-operator-index-czrgg" Oct 08 22:03:22 crc kubenswrapper[4739]: I1008 22:03:22.975921 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-czrgg" Oct 08 22:03:23 crc kubenswrapper[4739]: I1008 22:03:23.354277 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-czrgg"] Oct 08 22:03:23 crc kubenswrapper[4739]: W1008 22:03:23.363446 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3badf60b_1861_41bd_ac37_fd9badc6e785.slice/crio-0c08e96920f05fda09fcf4675bfc4b619d528d2480160df2322e4c6a13fa7fe3 WatchSource:0}: Error finding container 0c08e96920f05fda09fcf4675bfc4b619d528d2480160df2322e4c6a13fa7fe3: Status 404 returned error can't find the container with id 0c08e96920f05fda09fcf4675bfc4b619d528d2480160df2322e4c6a13fa7fe3 Oct 08 22:03:24 crc kubenswrapper[4739]: I1008 22:03:24.017169 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-czrgg" event={"ID":"3badf60b-1861-41bd-ac37-fd9badc6e785","Type":"ContainerStarted","Data":"0c08e96920f05fda09fcf4675bfc4b619d528d2480160df2322e4c6a13fa7fe3"} Oct 08 22:03:26 crc kubenswrapper[4739]: I1008 22:03:26.001844 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-czrgg"] Oct 08 22:03:26 crc kubenswrapper[4739]: I1008 22:03:26.808041 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-wfp9g"] Oct 08 22:03:26 crc kubenswrapper[4739]: I1008 22:03:26.808741 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-wfp9g" Oct 08 22:03:26 crc kubenswrapper[4739]: I1008 22:03:26.819255 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wfp9g"] Oct 08 22:03:26 crc kubenswrapper[4739]: I1008 22:03:26.871505 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drgm4\" (UniqueName: \"kubernetes.io/projected/99e801cf-5f88-48ef-8193-516d2cc2bf14-kube-api-access-drgm4\") pod \"openstack-operator-index-wfp9g\" (UID: \"99e801cf-5f88-48ef-8193-516d2cc2bf14\") " pod="openstack-operators/openstack-operator-index-wfp9g" Oct 08 22:03:26 crc kubenswrapper[4739]: I1008 22:03:26.973477 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drgm4\" (UniqueName: \"kubernetes.io/projected/99e801cf-5f88-48ef-8193-516d2cc2bf14-kube-api-access-drgm4\") pod \"openstack-operator-index-wfp9g\" (UID: \"99e801cf-5f88-48ef-8193-516d2cc2bf14\") " pod="openstack-operators/openstack-operator-index-wfp9g" Oct 08 22:03:27 crc kubenswrapper[4739]: I1008 22:03:27.013661 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drgm4\" (UniqueName: \"kubernetes.io/projected/99e801cf-5f88-48ef-8193-516d2cc2bf14-kube-api-access-drgm4\") pod \"openstack-operator-index-wfp9g\" (UID: \"99e801cf-5f88-48ef-8193-516d2cc2bf14\") " pod="openstack-operators/openstack-operator-index-wfp9g" Oct 08 22:03:27 crc kubenswrapper[4739]: I1008 22:03:27.132215 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-wfp9g" Oct 08 22:03:27 crc kubenswrapper[4739]: I1008 22:03:27.643736 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wfp9g"] Oct 08 22:03:27 crc kubenswrapper[4739]: W1008 22:03:27.660299 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99e801cf_5f88_48ef_8193_516d2cc2bf14.slice/crio-200584611b840a15eab3c935f69741e527898797b1bad64a82ce4eba94a2be8a WatchSource:0}: Error finding container 200584611b840a15eab3c935f69741e527898797b1bad64a82ce4eba94a2be8a: Status 404 returned error can't find the container with id 200584611b840a15eab3c935f69741e527898797b1bad64a82ce4eba94a2be8a Oct 08 22:03:28 crc kubenswrapper[4739]: I1008 22:03:28.055727 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wfp9g" event={"ID":"99e801cf-5f88-48ef-8193-516d2cc2bf14","Type":"ContainerStarted","Data":"cfc4507b76e14f8af12e819b3f096f1816c03835c4398a437f71ccf30d06f42f"} Oct 08 22:03:28 crc kubenswrapper[4739]: I1008 22:03:28.056281 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wfp9g" event={"ID":"99e801cf-5f88-48ef-8193-516d2cc2bf14","Type":"ContainerStarted","Data":"200584611b840a15eab3c935f69741e527898797b1bad64a82ce4eba94a2be8a"} Oct 08 22:03:28 crc kubenswrapper[4739]: I1008 22:03:28.057491 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-czrgg" event={"ID":"3badf60b-1861-41bd-ac37-fd9badc6e785","Type":"ContainerStarted","Data":"bd43bcbb7422ded0fe5f19baf2d7ca4fe7f79435991f3513176a2287fcecda32"} Oct 08 22:03:28 crc kubenswrapper[4739]: I1008 22:03:28.057625 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-czrgg" 
podUID="3badf60b-1861-41bd-ac37-fd9badc6e785" containerName="registry-server" containerID="cri-o://bd43bcbb7422ded0fe5f19baf2d7ca4fe7f79435991f3513176a2287fcecda32" gracePeriod=2 Oct 08 22:03:28 crc kubenswrapper[4739]: I1008 22:03:28.083076 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-wfp9g" podStartSLOduration=2.028219431 podStartE2EDuration="2.083055305s" podCreationTimestamp="2025-10-08 22:03:26 +0000 UTC" firstStartedPulling="2025-10-08 22:03:27.664252714 +0000 UTC m=+907.489638464" lastFinishedPulling="2025-10-08 22:03:27.719088588 +0000 UTC m=+907.544474338" observedRunningTime="2025-10-08 22:03:28.079750345 +0000 UTC m=+907.905136105" watchObservedRunningTime="2025-10-08 22:03:28.083055305 +0000 UTC m=+907.908441075" Oct 08 22:03:28 crc kubenswrapper[4739]: I1008 22:03:28.104261 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-czrgg" podStartSLOduration=2.218563892 podStartE2EDuration="6.104223814s" podCreationTimestamp="2025-10-08 22:03:22 +0000 UTC" firstStartedPulling="2025-10-08 22:03:23.364830544 +0000 UTC m=+903.190216294" lastFinishedPulling="2025-10-08 22:03:27.250490456 +0000 UTC m=+907.075876216" observedRunningTime="2025-10-08 22:03:28.100646516 +0000 UTC m=+907.926032286" watchObservedRunningTime="2025-10-08 22:03:28.104223814 +0000 UTC m=+907.929609644" Oct 08 22:03:28 crc kubenswrapper[4739]: I1008 22:03:28.185019 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-phdnx" Oct 08 22:03:28 crc kubenswrapper[4739]: I1008 22:03:28.512354 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-czrgg" Oct 08 22:03:28 crc kubenswrapper[4739]: I1008 22:03:28.698018 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jhzm\" (UniqueName: \"kubernetes.io/projected/3badf60b-1861-41bd-ac37-fd9badc6e785-kube-api-access-2jhzm\") pod \"3badf60b-1861-41bd-ac37-fd9badc6e785\" (UID: \"3badf60b-1861-41bd-ac37-fd9badc6e785\") " Oct 08 22:03:28 crc kubenswrapper[4739]: I1008 22:03:28.709077 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3badf60b-1861-41bd-ac37-fd9badc6e785-kube-api-access-2jhzm" (OuterVolumeSpecName: "kube-api-access-2jhzm") pod "3badf60b-1861-41bd-ac37-fd9badc6e785" (UID: "3badf60b-1861-41bd-ac37-fd9badc6e785"). InnerVolumeSpecName "kube-api-access-2jhzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:03:28 crc kubenswrapper[4739]: I1008 22:03:28.799717 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jhzm\" (UniqueName: \"kubernetes.io/projected/3badf60b-1861-41bd-ac37-fd9badc6e785-kube-api-access-2jhzm\") on node \"crc\" DevicePath \"\"" Oct 08 22:03:29 crc kubenswrapper[4739]: I1008 22:03:29.087946 4739 generic.go:334] "Generic (PLEG): container finished" podID="3badf60b-1861-41bd-ac37-fd9badc6e785" containerID="bd43bcbb7422ded0fe5f19baf2d7ca4fe7f79435991f3513176a2287fcecda32" exitCode=0 Oct 08 22:03:29 crc kubenswrapper[4739]: I1008 22:03:29.088027 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-czrgg" event={"ID":"3badf60b-1861-41bd-ac37-fd9badc6e785","Type":"ContainerDied","Data":"bd43bcbb7422ded0fe5f19baf2d7ca4fe7f79435991f3513176a2287fcecda32"} Oct 08 22:03:29 crc kubenswrapper[4739]: I1008 22:03:29.088105 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-czrgg" 
event={"ID":"3badf60b-1861-41bd-ac37-fd9badc6e785","Type":"ContainerDied","Data":"0c08e96920f05fda09fcf4675bfc4b619d528d2480160df2322e4c6a13fa7fe3"} Oct 08 22:03:29 crc kubenswrapper[4739]: I1008 22:03:29.088127 4739 scope.go:117] "RemoveContainer" containerID="bd43bcbb7422ded0fe5f19baf2d7ca4fe7f79435991f3513176a2287fcecda32" Oct 08 22:03:29 crc kubenswrapper[4739]: I1008 22:03:29.088922 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-czrgg" Oct 08 22:03:29 crc kubenswrapper[4739]: I1008 22:03:29.149598 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-czrgg"] Oct 08 22:03:29 crc kubenswrapper[4739]: I1008 22:03:29.155060 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-czrgg"] Oct 08 22:03:29 crc kubenswrapper[4739]: I1008 22:03:29.443822 4739 scope.go:117] "RemoveContainer" containerID="bd43bcbb7422ded0fe5f19baf2d7ca4fe7f79435991f3513176a2287fcecda32" Oct 08 22:03:29 crc kubenswrapper[4739]: E1008 22:03:29.444396 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd43bcbb7422ded0fe5f19baf2d7ca4fe7f79435991f3513176a2287fcecda32\": container with ID starting with bd43bcbb7422ded0fe5f19baf2d7ca4fe7f79435991f3513176a2287fcecda32 not found: ID does not exist" containerID="bd43bcbb7422ded0fe5f19baf2d7ca4fe7f79435991f3513176a2287fcecda32" Oct 08 22:03:29 crc kubenswrapper[4739]: I1008 22:03:29.444480 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd43bcbb7422ded0fe5f19baf2d7ca4fe7f79435991f3513176a2287fcecda32"} err="failed to get container status \"bd43bcbb7422ded0fe5f19baf2d7ca4fe7f79435991f3513176a2287fcecda32\": rpc error: code = NotFound desc = could not find container \"bd43bcbb7422ded0fe5f19baf2d7ca4fe7f79435991f3513176a2287fcecda32\": 
container with ID starting with bd43bcbb7422ded0fe5f19baf2d7ca4fe7f79435991f3513176a2287fcecda32 not found: ID does not exist" Oct 08 22:03:29 crc kubenswrapper[4739]: I1008 22:03:29.829129 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3badf60b-1861-41bd-ac37-fd9badc6e785" path="/var/lib/kubelet/pods/3badf60b-1861-41bd-ac37-fd9badc6e785/volumes" Oct 08 22:03:37 crc kubenswrapper[4739]: I1008 22:03:37.132978 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-wfp9g" Oct 08 22:03:37 crc kubenswrapper[4739]: I1008 22:03:37.133773 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-wfp9g" Oct 08 22:03:37 crc kubenswrapper[4739]: I1008 22:03:37.162795 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-wfp9g" Oct 08 22:03:37 crc kubenswrapper[4739]: I1008 22:03:37.191423 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-wfp9g" Oct 08 22:03:51 crc kubenswrapper[4739]: I1008 22:03:51.870359 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt"] Oct 08 22:03:51 crc kubenswrapper[4739]: E1008 22:03:51.872105 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3badf60b-1861-41bd-ac37-fd9badc6e785" containerName="registry-server" Oct 08 22:03:51 crc kubenswrapper[4739]: I1008 22:03:51.872135 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="3badf60b-1861-41bd-ac37-fd9badc6e785" containerName="registry-server" Oct 08 22:03:51 crc kubenswrapper[4739]: I1008 22:03:51.872450 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="3badf60b-1861-41bd-ac37-fd9badc6e785" containerName="registry-server" Oct 08 22:03:51 crc kubenswrapper[4739]: 
I1008 22:03:51.874295 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt" Oct 08 22:03:51 crc kubenswrapper[4739]: I1008 22:03:51.877025 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-nrb5b" Oct 08 22:03:51 crc kubenswrapper[4739]: I1008 22:03:51.887742 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt"] Oct 08 22:03:52 crc kubenswrapper[4739]: I1008 22:03:52.031134 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5bf9c162-c45e-48c6-9415-4f6e218895c0-bundle\") pod \"f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt\" (UID: \"5bf9c162-c45e-48c6-9415-4f6e218895c0\") " pod="openstack-operators/f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt" Oct 08 22:03:52 crc kubenswrapper[4739]: I1008 22:03:52.031282 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5bf9c162-c45e-48c6-9415-4f6e218895c0-util\") pod \"f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt\" (UID: \"5bf9c162-c45e-48c6-9415-4f6e218895c0\") " pod="openstack-operators/f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt" Oct 08 22:03:52 crc kubenswrapper[4739]: I1008 22:03:52.031340 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnd7h\" (UniqueName: \"kubernetes.io/projected/5bf9c162-c45e-48c6-9415-4f6e218895c0-kube-api-access-qnd7h\") pod \"f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt\" (UID: \"5bf9c162-c45e-48c6-9415-4f6e218895c0\") " 
pod="openstack-operators/f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt" Oct 08 22:03:52 crc kubenswrapper[4739]: I1008 22:03:52.132378 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5bf9c162-c45e-48c6-9415-4f6e218895c0-bundle\") pod \"f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt\" (UID: \"5bf9c162-c45e-48c6-9415-4f6e218895c0\") " pod="openstack-operators/f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt" Oct 08 22:03:52 crc kubenswrapper[4739]: I1008 22:03:52.132451 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5bf9c162-c45e-48c6-9415-4f6e218895c0-util\") pod \"f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt\" (UID: \"5bf9c162-c45e-48c6-9415-4f6e218895c0\") " pod="openstack-operators/f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt" Oct 08 22:03:52 crc kubenswrapper[4739]: I1008 22:03:52.132495 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnd7h\" (UniqueName: \"kubernetes.io/projected/5bf9c162-c45e-48c6-9415-4f6e218895c0-kube-api-access-qnd7h\") pod \"f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt\" (UID: \"5bf9c162-c45e-48c6-9415-4f6e218895c0\") " pod="openstack-operators/f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt" Oct 08 22:03:52 crc kubenswrapper[4739]: I1008 22:03:52.133480 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5bf9c162-c45e-48c6-9415-4f6e218895c0-bundle\") pod \"f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt\" (UID: \"5bf9c162-c45e-48c6-9415-4f6e218895c0\") " pod="openstack-operators/f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt" Oct 08 22:03:52 crc kubenswrapper[4739]: I1008 22:03:52.133545 4739 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5bf9c162-c45e-48c6-9415-4f6e218895c0-util\") pod \"f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt\" (UID: \"5bf9c162-c45e-48c6-9415-4f6e218895c0\") " pod="openstack-operators/f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt" Oct 08 22:03:52 crc kubenswrapper[4739]: I1008 22:03:52.172457 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnd7h\" (UniqueName: \"kubernetes.io/projected/5bf9c162-c45e-48c6-9415-4f6e218895c0-kube-api-access-qnd7h\") pod \"f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt\" (UID: \"5bf9c162-c45e-48c6-9415-4f6e218895c0\") " pod="openstack-operators/f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt" Oct 08 22:03:52 crc kubenswrapper[4739]: I1008 22:03:52.206871 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt" Oct 08 22:03:52 crc kubenswrapper[4739]: I1008 22:03:52.703563 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt"] Oct 08 22:03:52 crc kubenswrapper[4739]: W1008 22:03:52.711181 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bf9c162_c45e_48c6_9415_4f6e218895c0.slice/crio-1d7106a3b47749d7fee34b6c33167cec7c6c39ac572a619dd660c638e467f356 WatchSource:0}: Error finding container 1d7106a3b47749d7fee34b6c33167cec7c6c39ac572a619dd660c638e467f356: Status 404 returned error can't find the container with id 1d7106a3b47749d7fee34b6c33167cec7c6c39ac572a619dd660c638e467f356 Oct 08 22:03:53 crc kubenswrapper[4739]: I1008 22:03:53.277357 4739 generic.go:334] "Generic (PLEG): container finished" podID="5bf9c162-c45e-48c6-9415-4f6e218895c0" 
containerID="6ae219d18b207ddc25509d6d49501e67cc3802d462e9b9b4af4ebbaee8cd6b35" exitCode=0 Oct 08 22:03:53 crc kubenswrapper[4739]: I1008 22:03:53.277423 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt" event={"ID":"5bf9c162-c45e-48c6-9415-4f6e218895c0","Type":"ContainerDied","Data":"6ae219d18b207ddc25509d6d49501e67cc3802d462e9b9b4af4ebbaee8cd6b35"} Oct 08 22:03:53 crc kubenswrapper[4739]: I1008 22:03:53.277590 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt" event={"ID":"5bf9c162-c45e-48c6-9415-4f6e218895c0","Type":"ContainerStarted","Data":"1d7106a3b47749d7fee34b6c33167cec7c6c39ac572a619dd660c638e467f356"} Oct 08 22:03:56 crc kubenswrapper[4739]: I1008 22:03:56.300208 4739 generic.go:334] "Generic (PLEG): container finished" podID="5bf9c162-c45e-48c6-9415-4f6e218895c0" containerID="1974c3caa9ff192b725247847cb08c0bc32355aed5cd367fcba8f3a6d36ddd12" exitCode=0 Oct 08 22:03:56 crc kubenswrapper[4739]: I1008 22:03:56.300293 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt" event={"ID":"5bf9c162-c45e-48c6-9415-4f6e218895c0","Type":"ContainerDied","Data":"1974c3caa9ff192b725247847cb08c0bc32355aed5cd367fcba8f3a6d36ddd12"} Oct 08 22:03:57 crc kubenswrapper[4739]: I1008 22:03:57.310548 4739 generic.go:334] "Generic (PLEG): container finished" podID="5bf9c162-c45e-48c6-9415-4f6e218895c0" containerID="baa6ac8d0b142e8b09c759e31edccaacbd1bc8ecb9cf08d6455b28aaa70e6057" exitCode=0 Oct 08 22:03:57 crc kubenswrapper[4739]: I1008 22:03:57.310656 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt" 
event={"ID":"5bf9c162-c45e-48c6-9415-4f6e218895c0","Type":"ContainerDied","Data":"baa6ac8d0b142e8b09c759e31edccaacbd1bc8ecb9cf08d6455b28aaa70e6057"} Oct 08 22:03:58 crc kubenswrapper[4739]: I1008 22:03:58.628525 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt" Oct 08 22:03:58 crc kubenswrapper[4739]: I1008 22:03:58.726803 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5bf9c162-c45e-48c6-9415-4f6e218895c0-bundle\") pod \"5bf9c162-c45e-48c6-9415-4f6e218895c0\" (UID: \"5bf9c162-c45e-48c6-9415-4f6e218895c0\") " Oct 08 22:03:58 crc kubenswrapper[4739]: I1008 22:03:58.726943 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5bf9c162-c45e-48c6-9415-4f6e218895c0-util\") pod \"5bf9c162-c45e-48c6-9415-4f6e218895c0\" (UID: \"5bf9c162-c45e-48c6-9415-4f6e218895c0\") " Oct 08 22:03:58 crc kubenswrapper[4739]: I1008 22:03:58.727036 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnd7h\" (UniqueName: \"kubernetes.io/projected/5bf9c162-c45e-48c6-9415-4f6e218895c0-kube-api-access-qnd7h\") pod \"5bf9c162-c45e-48c6-9415-4f6e218895c0\" (UID: \"5bf9c162-c45e-48c6-9415-4f6e218895c0\") " Oct 08 22:03:58 crc kubenswrapper[4739]: I1008 22:03:58.728050 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bf9c162-c45e-48c6-9415-4f6e218895c0-bundle" (OuterVolumeSpecName: "bundle") pod "5bf9c162-c45e-48c6-9415-4f6e218895c0" (UID: "5bf9c162-c45e-48c6-9415-4f6e218895c0"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:03:58 crc kubenswrapper[4739]: I1008 22:03:58.734202 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bf9c162-c45e-48c6-9415-4f6e218895c0-kube-api-access-qnd7h" (OuterVolumeSpecName: "kube-api-access-qnd7h") pod "5bf9c162-c45e-48c6-9415-4f6e218895c0" (UID: "5bf9c162-c45e-48c6-9415-4f6e218895c0"). InnerVolumeSpecName "kube-api-access-qnd7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:03:58 crc kubenswrapper[4739]: I1008 22:03:58.738340 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bf9c162-c45e-48c6-9415-4f6e218895c0-util" (OuterVolumeSpecName: "util") pod "5bf9c162-c45e-48c6-9415-4f6e218895c0" (UID: "5bf9c162-c45e-48c6-9415-4f6e218895c0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:03:58 crc kubenswrapper[4739]: I1008 22:03:58.828772 4739 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5bf9c162-c45e-48c6-9415-4f6e218895c0-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:03:58 crc kubenswrapper[4739]: I1008 22:03:58.828989 4739 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5bf9c162-c45e-48c6-9415-4f6e218895c0-util\") on node \"crc\" DevicePath \"\"" Oct 08 22:03:58 crc kubenswrapper[4739]: I1008 22:03:58.829003 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnd7h\" (UniqueName: \"kubernetes.io/projected/5bf9c162-c45e-48c6-9415-4f6e218895c0-kube-api-access-qnd7h\") on node \"crc\" DevicePath \"\"" Oct 08 22:03:59 crc kubenswrapper[4739]: I1008 22:03:59.327093 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt" 
event={"ID":"5bf9c162-c45e-48c6-9415-4f6e218895c0","Type":"ContainerDied","Data":"1d7106a3b47749d7fee34b6c33167cec7c6c39ac572a619dd660c638e467f356"} Oct 08 22:03:59 crc kubenswrapper[4739]: I1008 22:03:59.327138 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d7106a3b47749d7fee34b6c33167cec7c6c39ac572a619dd660c638e467f356" Oct 08 22:03:59 crc kubenswrapper[4739]: I1008 22:03:59.327209 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt" Oct 08 22:04:04 crc kubenswrapper[4739]: I1008 22:04:04.555709 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7fbb97f6f4-lh6lp"] Oct 08 22:04:04 crc kubenswrapper[4739]: E1008 22:04:04.556233 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bf9c162-c45e-48c6-9415-4f6e218895c0" containerName="util" Oct 08 22:04:04 crc kubenswrapper[4739]: I1008 22:04:04.556247 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bf9c162-c45e-48c6-9415-4f6e218895c0" containerName="util" Oct 08 22:04:04 crc kubenswrapper[4739]: E1008 22:04:04.556265 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bf9c162-c45e-48c6-9415-4f6e218895c0" containerName="pull" Oct 08 22:04:04 crc kubenswrapper[4739]: I1008 22:04:04.556272 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bf9c162-c45e-48c6-9415-4f6e218895c0" containerName="pull" Oct 08 22:04:04 crc kubenswrapper[4739]: E1008 22:04:04.556289 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bf9c162-c45e-48c6-9415-4f6e218895c0" containerName="extract" Oct 08 22:04:04 crc kubenswrapper[4739]: I1008 22:04:04.556298 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bf9c162-c45e-48c6-9415-4f6e218895c0" containerName="extract" Oct 08 22:04:04 crc kubenswrapper[4739]: I1008 22:04:04.556417 4739 
memory_manager.go:354] "RemoveStaleState removing state" podUID="5bf9c162-c45e-48c6-9415-4f6e218895c0" containerName="extract" Oct 08 22:04:04 crc kubenswrapper[4739]: I1008 22:04:04.557174 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7fbb97f6f4-lh6lp" Oct 08 22:04:04 crc kubenswrapper[4739]: I1008 22:04:04.558762 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-z585n" Oct 08 22:04:04 crc kubenswrapper[4739]: I1008 22:04:04.573262 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7fbb97f6f4-lh6lp"] Oct 08 22:04:04 crc kubenswrapper[4739]: I1008 22:04:04.706570 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7srlm\" (UniqueName: \"kubernetes.io/projected/f65a9137-93d0-424e-a839-3429f141ffa7-kube-api-access-7srlm\") pod \"openstack-operator-controller-operator-7fbb97f6f4-lh6lp\" (UID: \"f65a9137-93d0-424e-a839-3429f141ffa7\") " pod="openstack-operators/openstack-operator-controller-operator-7fbb97f6f4-lh6lp" Oct 08 22:04:04 crc kubenswrapper[4739]: I1008 22:04:04.808173 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7srlm\" (UniqueName: \"kubernetes.io/projected/f65a9137-93d0-424e-a839-3429f141ffa7-kube-api-access-7srlm\") pod \"openstack-operator-controller-operator-7fbb97f6f4-lh6lp\" (UID: \"f65a9137-93d0-424e-a839-3429f141ffa7\") " pod="openstack-operators/openstack-operator-controller-operator-7fbb97f6f4-lh6lp" Oct 08 22:04:04 crc kubenswrapper[4739]: I1008 22:04:04.829551 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7srlm\" (UniqueName: \"kubernetes.io/projected/f65a9137-93d0-424e-a839-3429f141ffa7-kube-api-access-7srlm\") pod 
\"openstack-operator-controller-operator-7fbb97f6f4-lh6lp\" (UID: \"f65a9137-93d0-424e-a839-3429f141ffa7\") " pod="openstack-operators/openstack-operator-controller-operator-7fbb97f6f4-lh6lp" Oct 08 22:04:04 crc kubenswrapper[4739]: I1008 22:04:04.872607 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7fbb97f6f4-lh6lp" Oct 08 22:04:05 crc kubenswrapper[4739]: I1008 22:04:05.126947 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7fbb97f6f4-lh6lp"] Oct 08 22:04:05 crc kubenswrapper[4739]: I1008 22:04:05.366002 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7fbb97f6f4-lh6lp" event={"ID":"f65a9137-93d0-424e-a839-3429f141ffa7","Type":"ContainerStarted","Data":"a2a66733cbe5aceb4082299ec742469e94d683522cabf97a4c208b5f5ac9d607"} Oct 08 22:04:09 crc kubenswrapper[4739]: I1008 22:04:09.522853 4739 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 22:04:10 crc kubenswrapper[4739]: I1008 22:04:10.405047 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7fbb97f6f4-lh6lp" event={"ID":"f65a9137-93d0-424e-a839-3429f141ffa7","Type":"ContainerStarted","Data":"dda04bc5bbeb19188b2520cac968339f5c7abc3614d15f9873220c33fdb58ad9"} Oct 08 22:04:14 crc kubenswrapper[4739]: I1008 22:04:14.433763 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7fbb97f6f4-lh6lp" event={"ID":"f65a9137-93d0-424e-a839-3429f141ffa7","Type":"ContainerStarted","Data":"326f6a90fa17d90eead8bbc7f95fe4c8b107ba6491bdc25eb3e2843c0530e1f2"} Oct 08 22:04:14 crc kubenswrapper[4739]: I1008 22:04:14.434342 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-controller-operator-7fbb97f6f4-lh6lp" Oct 08 22:04:14 crc kubenswrapper[4739]: I1008 22:04:14.436161 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-7fbb97f6f4-lh6lp" Oct 08 22:04:14 crc kubenswrapper[4739]: I1008 22:04:14.522616 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-7fbb97f6f4-lh6lp" podStartSLOduration=1.948502121 podStartE2EDuration="10.522589955s" podCreationTimestamp="2025-10-08 22:04:04 +0000 UTC" firstStartedPulling="2025-10-08 22:04:05.128563445 +0000 UTC m=+944.953949205" lastFinishedPulling="2025-10-08 22:04:13.702651289 +0000 UTC m=+953.528037039" observedRunningTime="2025-10-08 22:04:14.478065652 +0000 UTC m=+954.303451492" watchObservedRunningTime="2025-10-08 22:04:14.522589955 +0000 UTC m=+954.347975735" Oct 08 22:04:41 crc kubenswrapper[4739]: I1008 22:04:41.957573 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-2c9sc"] Oct 08 22:04:41 crc kubenswrapper[4739]: I1008 22:04:41.959510 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-2c9sc" Oct 08 22:04:41 crc kubenswrapper[4739]: I1008 22:04:41.965000 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-zk8hf"] Oct 08 22:04:41 crc kubenswrapper[4739]: I1008 22:04:41.966355 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-zk8hf" Oct 08 22:04:41 crc kubenswrapper[4739]: I1008 22:04:41.970924 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-s6gv2" Oct 08 22:04:41 crc kubenswrapper[4739]: I1008 22:04:41.972811 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-4sfsw" Oct 08 22:04:41 crc kubenswrapper[4739]: I1008 22:04:41.973267 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-2c9sc"] Oct 08 22:04:41 crc kubenswrapper[4739]: I1008 22:04:41.979415 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-qhqv6"] Oct 08 22:04:41 crc kubenswrapper[4739]: I1008 22:04:41.980654 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-qhqv6" Oct 08 22:04:41 crc kubenswrapper[4739]: I1008 22:04:41.983813 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-24lxv" Oct 08 22:04:41 crc kubenswrapper[4739]: I1008 22:04:41.988479 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-zk8hf"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.008757 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-wqmv7"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.009991 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-wqmv7" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.012610 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-nfbxz" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.024843 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-qhqv6"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.031971 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-xnfkp"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.032946 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-xnfkp" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.039130 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-xb5rr" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.045199 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-wqmv7"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.063085 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-xnfkp"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.066446 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4hpb\" (UniqueName: \"kubernetes.io/projected/4ecad090-144b-491d-9307-dd0d2db07490-kube-api-access-c4hpb\") pod \"barbican-operator-controller-manager-64f84fcdbb-2c9sc\" (UID: \"4ecad090-144b-491d-9307-dd0d2db07490\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-2c9sc" Oct 08 
22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.077757 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-8djd9"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.079303 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-8djd9" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.084783 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-4tmpw" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.089732 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-8djd9"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.105293 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-f7hv4"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.107058 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-f7hv4" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.108834 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.110589 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-xmg59" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.116357 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-6kp4x"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.117282 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-6kp4x" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.127886 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-gkvng" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.131211 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-9ls4w"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.132350 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-9ls4w" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.133780 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-5w42g" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.139315 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-6kp4x"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.150271 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-l66d4"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.151691 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-59578bc799-l66d4" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.158601 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-f7hv4"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.159785 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-2662q" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.167392 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scnpv\" (UniqueName: \"kubernetes.io/projected/5aa8c8d1-9588-4e0f-87e2-b44b072bef76-kube-api-access-scnpv\") pod \"cinder-operator-controller-manager-59cdc64769-zk8hf\" (UID: \"5aa8c8d1-9588-4e0f-87e2-b44b072bef76\") " pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-zk8hf" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.167449 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6gn5\" (UniqueName: \"kubernetes.io/projected/fae90c53-9891-4664-8767-98bfab1e021a-kube-api-access-r6gn5\") pod \"heat-operator-controller-manager-6d9967f8dd-xnfkp\" (UID: \"fae90c53-9891-4664-8767-98bfab1e021a\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-xnfkp" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.167479 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4hpb\" (UniqueName: \"kubernetes.io/projected/4ecad090-144b-491d-9307-dd0d2db07490-kube-api-access-c4hpb\") pod \"barbican-operator-controller-manager-64f84fcdbb-2c9sc\" (UID: \"4ecad090-144b-491d-9307-dd0d2db07490\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-2c9sc" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.168706 4739 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzlqd\" (UniqueName: \"kubernetes.io/projected/8302b913-c934-4911-8c78-72d139019f33-kube-api-access-kzlqd\") pod \"glance-operator-controller-manager-7bb46cd7d-wqmv7\" (UID: \"8302b913-c934-4911-8c78-72d139019f33\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-wqmv7" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.168911 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg9tj\" (UniqueName: \"kubernetes.io/projected/17ff0c7d-5595-4d2f-b77d-0f6114746fae-kube-api-access-zg9tj\") pod \"designate-operator-controller-manager-687df44cdb-qhqv6\" (UID: \"17ff0c7d-5595-4d2f-b77d-0f6114746fae\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-qhqv6" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.175532 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-9ls4w"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.178691 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-2sv26"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.179892 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-2sv26" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.181596 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-p6kpr" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.189373 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-l66d4"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.203581 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-2sv26"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.204079 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4hpb\" (UniqueName: \"kubernetes.io/projected/4ecad090-144b-491d-9307-dd0d2db07490-kube-api-access-c4hpb\") pod \"barbican-operator-controller-manager-64f84fcdbb-2c9sc\" (UID: \"4ecad090-144b-491d-9307-dd0d2db07490\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-2c9sc" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.208064 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-fbhjp"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.226736 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-fbhjp" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.247065 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-kthq2" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.282696 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-2c9sc" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.285111 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqcf2\" (UniqueName: \"kubernetes.io/projected/dcd83df0-3381-4d08-9818-e7f91ba6f77b-kube-api-access-rqcf2\") pod \"horizon-operator-controller-manager-6d74794d9b-8djd9\" (UID: \"dcd83df0-3381-4d08-9818-e7f91ba6f77b\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-8djd9" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.285183 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c11fdec0-87d4-41db-b5d3-66155b578abe-cert\") pod \"infra-operator-controller-manager-585fc5b659-f7hv4\" (UID: \"c11fdec0-87d4-41db-b5d3-66155b578abe\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-f7hv4" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.285202 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9qwb\" (UniqueName: \"kubernetes.io/projected/c11fdec0-87d4-41db-b5d3-66155b578abe-kube-api-access-c9qwb\") pod \"infra-operator-controller-manager-585fc5b659-f7hv4\" (UID: \"c11fdec0-87d4-41db-b5d3-66155b578abe\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-f7hv4" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.285238 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg9tj\" (UniqueName: \"kubernetes.io/projected/17ff0c7d-5595-4d2f-b77d-0f6114746fae-kube-api-access-zg9tj\") pod \"designate-operator-controller-manager-687df44cdb-qhqv6\" (UID: \"17ff0c7d-5595-4d2f-b77d-0f6114746fae\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-qhqv6" Oct 08 22:04:42 crc 
kubenswrapper[4739]: I1008 22:04:42.285281 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scnpv\" (UniqueName: \"kubernetes.io/projected/5aa8c8d1-9588-4e0f-87e2-b44b072bef76-kube-api-access-scnpv\") pod \"cinder-operator-controller-manager-59cdc64769-zk8hf\" (UID: \"5aa8c8d1-9588-4e0f-87e2-b44b072bef76\") " pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-zk8hf" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.285299 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6gn5\" (UniqueName: \"kubernetes.io/projected/fae90c53-9891-4664-8767-98bfab1e021a-kube-api-access-r6gn5\") pod \"heat-operator-controller-manager-6d9967f8dd-xnfkp\" (UID: \"fae90c53-9891-4664-8767-98bfab1e021a\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-xnfkp" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.285317 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp9cg\" (UniqueName: \"kubernetes.io/projected/58ceaa51-6704-4f1d-8aa0-2053f1c7c89d-kube-api-access-bp9cg\") pod \"keystone-operator-controller-manager-ddb98f99b-9ls4w\" (UID: \"58ceaa51-6704-4f1d-8aa0-2053f1c7c89d\") " pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-9ls4w" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.285333 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlvgz\" (UniqueName: \"kubernetes.io/projected/f038b58d-e69a-481d-a0df-65211386c9da-kube-api-access-zlvgz\") pod \"ironic-operator-controller-manager-74cb5cbc49-6kp4x\" (UID: \"f038b58d-e69a-481d-a0df-65211386c9da\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-6kp4x" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.285363 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kzlqd\" (UniqueName: \"kubernetes.io/projected/8302b913-c934-4911-8c78-72d139019f33-kube-api-access-kzlqd\") pod \"glance-operator-controller-manager-7bb46cd7d-wqmv7\" (UID: \"8302b913-c934-4911-8c78-72d139019f33\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-wqmv7" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.285380 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwrpd\" (UniqueName: \"kubernetes.io/projected/cf67473b-9a22-492a-844d-552fc946605d-kube-api-access-mwrpd\") pod \"manila-operator-controller-manager-59578bc799-l66d4\" (UID: \"cf67473b-9a22-492a-844d-552fc946605d\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-l66d4" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.299560 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-fbhjp"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.306251 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-8d2bb"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.306841 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6gn5\" (UniqueName: \"kubernetes.io/projected/fae90c53-9891-4664-8767-98bfab1e021a-kube-api-access-r6gn5\") pod \"heat-operator-controller-manager-6d9967f8dd-xnfkp\" (UID: \"fae90c53-9891-4664-8767-98bfab1e021a\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-xnfkp" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.307558 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-8d2bb" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.313761 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scnpv\" (UniqueName: \"kubernetes.io/projected/5aa8c8d1-9588-4e0f-87e2-b44b072bef76-kube-api-access-scnpv\") pod \"cinder-operator-controller-manager-59cdc64769-zk8hf\" (UID: \"5aa8c8d1-9588-4e0f-87e2-b44b072bef76\") " pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-zk8hf" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.313973 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-82lc9" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.321095 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzlqd\" (UniqueName: \"kubernetes.io/projected/8302b913-c934-4911-8c78-72d139019f33-kube-api-access-kzlqd\") pod \"glance-operator-controller-manager-7bb46cd7d-wqmv7\" (UID: \"8302b913-c934-4911-8c78-72d139019f33\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-wqmv7" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.321679 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg9tj\" (UniqueName: \"kubernetes.io/projected/17ff0c7d-5595-4d2f-b77d-0f6114746fae-kube-api-access-zg9tj\") pod \"designate-operator-controller-manager-687df44cdb-qhqv6\" (UID: \"17ff0c7d-5595-4d2f-b77d-0f6114746fae\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-qhqv6" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.321751 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-wz5dg"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.322898 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-wz5dg" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.325965 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-wqmv7" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.327036 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-7pwjv" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.345034 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-wz5dg"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.356793 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6f96f8c84-qn2t5"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.358321 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-qn2t5" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.357055 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-xnfkp" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.362916 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-5lnnj" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.366712 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.367915 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.372287 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.372611 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-4gvwq" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.372943 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-8d2bb"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.375782 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6f96f8c84-qn2t5"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.385835 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-9vz2h"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.386226 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwrpd\" (UniqueName: \"kubernetes.io/projected/cf67473b-9a22-492a-844d-552fc946605d-kube-api-access-mwrpd\") pod \"manila-operator-controller-manager-59578bc799-l66d4\" (UID: \"cf67473b-9a22-492a-844d-552fc946605d\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-l66d4" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.386258 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqcf2\" (UniqueName: \"kubernetes.io/projected/dcd83df0-3381-4d08-9818-e7f91ba6f77b-kube-api-access-rqcf2\") pod \"horizon-operator-controller-manager-6d74794d9b-8djd9\" (UID: \"dcd83df0-3381-4d08-9818-e7f91ba6f77b\") " 
pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-8djd9" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.386280 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-952vg\" (UniqueName: \"kubernetes.io/projected/23d030c7-6a61-4ba5-9b00-018f7370ea5d-kube-api-access-952vg\") pod \"nova-operator-controller-manager-57bb74c7bf-8d2bb\" (UID: \"23d030c7-6a61-4ba5-9b00-018f7370ea5d\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-8d2bb" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.386310 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9qwb\" (UniqueName: \"kubernetes.io/projected/c11fdec0-87d4-41db-b5d3-66155b578abe-kube-api-access-c9qwb\") pod \"infra-operator-controller-manager-585fc5b659-f7hv4\" (UID: \"c11fdec0-87d4-41db-b5d3-66155b578abe\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-f7hv4" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.386328 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c11fdec0-87d4-41db-b5d3-66155b578abe-cert\") pod \"infra-operator-controller-manager-585fc5b659-f7hv4\" (UID: \"c11fdec0-87d4-41db-b5d3-66155b578abe\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-f7hv4" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.386349 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkcqp\" (UniqueName: \"kubernetes.io/projected/102e8f33-2000-4a71-a337-4fa304d59e93-kube-api-access-nkcqp\") pod \"mariadb-operator-controller-manager-5777b4f897-2sv26\" (UID: \"102e8f33-2000-4a71-a337-4fa304d59e93\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-2sv26" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.386374 4739 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64wmd\" (UniqueName: \"kubernetes.io/projected/7e483c3d-debb-4f41-a968-0d19d337e771-kube-api-access-64wmd\") pod \"neutron-operator-controller-manager-797d478b46-fbhjp\" (UID: \"7e483c3d-debb-4f41-a968-0d19d337e771\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-fbhjp" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.386402 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjhnl\" (UniqueName: \"kubernetes.io/projected/41df7676-d0f5-47a1-a90c-2bc3bc01e18d-kube-api-access-vjhnl\") pod \"ovn-operator-controller-manager-6f96f8c84-qn2t5\" (UID: \"41df7676-d0f5-47a1-a90c-2bc3bc01e18d\") " pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-qn2t5" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.386423 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4b6c\" (UniqueName: \"kubernetes.io/projected/c165e9bc-4624-4227-8a87-835cbfe8a970-kube-api-access-d4b6c\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67\" (UID: \"c165e9bc-4624-4227-8a87-835cbfe8a970\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.386446 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2sq7\" (UniqueName: \"kubernetes.io/projected/d0a97575-d460-410d-84aa-887e6d809bba-kube-api-access-b2sq7\") pod \"octavia-operator-controller-manager-6d7c7ddf95-wz5dg\" (UID: \"d0a97575-d460-410d-84aa-887e6d809bba\") " pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-wz5dg" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.386465 4739 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-bp9cg\" (UniqueName: \"kubernetes.io/projected/58ceaa51-6704-4f1d-8aa0-2053f1c7c89d-kube-api-access-bp9cg\") pod \"keystone-operator-controller-manager-ddb98f99b-9ls4w\" (UID: \"58ceaa51-6704-4f1d-8aa0-2053f1c7c89d\") " pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-9ls4w" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.386480 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlvgz\" (UniqueName: \"kubernetes.io/projected/f038b58d-e69a-481d-a0df-65211386c9da-kube-api-access-zlvgz\") pod \"ironic-operator-controller-manager-74cb5cbc49-6kp4x\" (UID: \"f038b58d-e69a-481d-a0df-65211386c9da\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-6kp4x" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.386497 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c165e9bc-4624-4227-8a87-835cbfe8a970-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67\" (UID: \"c165e9bc-4624-4227-8a87-835cbfe8a970\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67" Oct 08 22:04:42 crc kubenswrapper[4739]: E1008 22:04:42.386828 4739 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 08 22:04:42 crc kubenswrapper[4739]: E1008 22:04:42.386865 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c11fdec0-87d4-41db-b5d3-66155b578abe-cert podName:c11fdec0-87d4-41db-b5d3-66155b578abe nodeName:}" failed. No retries permitted until 2025-10-08 22:04:42.886850298 +0000 UTC m=+982.712236048 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c11fdec0-87d4-41db-b5d3-66155b578abe-cert") pod "infra-operator-controller-manager-585fc5b659-f7hv4" (UID: "c11fdec0-87d4-41db-b5d3-66155b578abe") : secret "infra-operator-webhook-server-cert" not found Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.387252 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-664664cb68-9vz2h" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.412368 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-7gjhx"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.413631 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-7gjhx" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.421302 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-6blbj" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.421741 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-bfw4q" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.450564 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqcf2\" (UniqueName: \"kubernetes.io/projected/dcd83df0-3381-4d08-9818-e7f91ba6f77b-kube-api-access-rqcf2\") pod \"horizon-operator-controller-manager-6d74794d9b-8djd9\" (UID: \"dcd83df0-3381-4d08-9818-e7f91ba6f77b\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-8djd9" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.452290 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlvgz\" (UniqueName: 
\"kubernetes.io/projected/f038b58d-e69a-481d-a0df-65211386c9da-kube-api-access-zlvgz\") pod \"ironic-operator-controller-manager-74cb5cbc49-6kp4x\" (UID: \"f038b58d-e69a-481d-a0df-65211386c9da\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-6kp4x" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.454807 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9qwb\" (UniqueName: \"kubernetes.io/projected/c11fdec0-87d4-41db-b5d3-66155b578abe-kube-api-access-c9qwb\") pod \"infra-operator-controller-manager-585fc5b659-f7hv4\" (UID: \"c11fdec0-87d4-41db-b5d3-66155b578abe\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-f7hv4" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.465806 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwrpd\" (UniqueName: \"kubernetes.io/projected/cf67473b-9a22-492a-844d-552fc946605d-kube-api-access-mwrpd\") pod \"manila-operator-controller-manager-59578bc799-l66d4\" (UID: \"cf67473b-9a22-492a-844d-552fc946605d\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-l66d4" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.466449 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp9cg\" (UniqueName: \"kubernetes.io/projected/58ceaa51-6704-4f1d-8aa0-2053f1c7c89d-kube-api-access-bp9cg\") pod \"keystone-operator-controller-manager-ddb98f99b-9ls4w\" (UID: \"58ceaa51-6704-4f1d-8aa0-2053f1c7c89d\") " pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-9ls4w" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.466743 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-9ls4w" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.467897 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-9vz2h"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.478684 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.478971 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-59578bc799-l66d4" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.486463 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-7gjhx"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.487858 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64wmd\" (UniqueName: \"kubernetes.io/projected/7e483c3d-debb-4f41-a968-0d19d337e771-kube-api-access-64wmd\") pod \"neutron-operator-controller-manager-797d478b46-fbhjp\" (UID: \"7e483c3d-debb-4f41-a968-0d19d337e771\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-fbhjp" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.487894 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjhnl\" (UniqueName: \"kubernetes.io/projected/41df7676-d0f5-47a1-a90c-2bc3bc01e18d-kube-api-access-vjhnl\") pod \"ovn-operator-controller-manager-6f96f8c84-qn2t5\" (UID: \"41df7676-d0f5-47a1-a90c-2bc3bc01e18d\") " pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-qn2t5" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.487917 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-d4b6c\" (UniqueName: \"kubernetes.io/projected/c165e9bc-4624-4227-8a87-835cbfe8a970-kube-api-access-d4b6c\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67\" (UID: \"c165e9bc-4624-4227-8a87-835cbfe8a970\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.487942 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2sq7\" (UniqueName: \"kubernetes.io/projected/d0a97575-d460-410d-84aa-887e6d809bba-kube-api-access-b2sq7\") pod \"octavia-operator-controller-manager-6d7c7ddf95-wz5dg\" (UID: \"d0a97575-d460-410d-84aa-887e6d809bba\") " pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-wz5dg" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.487960 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c165e9bc-4624-4227-8a87-835cbfe8a970-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67\" (UID: \"c165e9bc-4624-4227-8a87-835cbfe8a970\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.487988 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rffpw\" (UniqueName: \"kubernetes.io/projected/e51cba46-23fd-4f5f-819d-c2e0ee77a743-kube-api-access-rffpw\") pod \"placement-operator-controller-manager-664664cb68-9vz2h\" (UID: \"e51cba46-23fd-4f5f-819d-c2e0ee77a743\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-9vz2h" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.488018 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-952vg\" (UniqueName: 
\"kubernetes.io/projected/23d030c7-6a61-4ba5-9b00-018f7370ea5d-kube-api-access-952vg\") pod \"nova-operator-controller-manager-57bb74c7bf-8d2bb\" (UID: \"23d030c7-6a61-4ba5-9b00-018f7370ea5d\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-8d2bb" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.488060 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnfb8\" (UniqueName: \"kubernetes.io/projected/dd2b6037-9b5d-47cb-b057-d33546b8e74c-kube-api-access-pnfb8\") pod \"swift-operator-controller-manager-5f4d5dfdc6-7gjhx\" (UID: \"dd2b6037-9b5d-47cb-b057-d33546b8e74c\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-7gjhx" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.488079 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkcqp\" (UniqueName: \"kubernetes.io/projected/102e8f33-2000-4a71-a337-4fa304d59e93-kube-api-access-nkcqp\") pod \"mariadb-operator-controller-manager-5777b4f897-2sv26\" (UID: \"102e8f33-2000-4a71-a337-4fa304d59e93\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-2sv26" Oct 08 22:04:42 crc kubenswrapper[4739]: E1008 22:04:42.489434 4739 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 08 22:04:42 crc kubenswrapper[4739]: E1008 22:04:42.489475 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c165e9bc-4624-4227-8a87-835cbfe8a970-cert podName:c165e9bc-4624-4227-8a87-835cbfe8a970 nodeName:}" failed. No retries permitted until 2025-10-08 22:04:42.989461137 +0000 UTC m=+982.814846887 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c165e9bc-4624-4227-8a87-835cbfe8a970-cert") pod "openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67" (UID: "c165e9bc-4624-4227-8a87-835cbfe8a970") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.526633 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjhnl\" (UniqueName: \"kubernetes.io/projected/41df7676-d0f5-47a1-a90c-2bc3bc01e18d-kube-api-access-vjhnl\") pod \"ovn-operator-controller-manager-6f96f8c84-qn2t5\" (UID: \"41df7676-d0f5-47a1-a90c-2bc3bc01e18d\") " pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-qn2t5" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.532746 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4b6c\" (UniqueName: \"kubernetes.io/projected/c165e9bc-4624-4227-8a87-835cbfe8a970-kube-api-access-d4b6c\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67\" (UID: \"c165e9bc-4624-4227-8a87-835cbfe8a970\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.556040 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkcqp\" (UniqueName: \"kubernetes.io/projected/102e8f33-2000-4a71-a337-4fa304d59e93-kube-api-access-nkcqp\") pod \"mariadb-operator-controller-manager-5777b4f897-2sv26\" (UID: \"102e8f33-2000-4a71-a337-4fa304d59e93\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-2sv26" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.558271 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-75d7f5797c-czmc9"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.559347 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-75d7f5797c-czmc9" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.559778 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-952vg\" (UniqueName: \"kubernetes.io/projected/23d030c7-6a61-4ba5-9b00-018f7370ea5d-kube-api-access-952vg\") pod \"nova-operator-controller-manager-57bb74c7bf-8d2bb\" (UID: \"23d030c7-6a61-4ba5-9b00-018f7370ea5d\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-8d2bb" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.561055 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-6qxrh" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.561788 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64wmd\" (UniqueName: \"kubernetes.io/projected/7e483c3d-debb-4f41-a968-0d19d337e771-kube-api-access-64wmd\") pod \"neutron-operator-controller-manager-797d478b46-fbhjp\" (UID: \"7e483c3d-debb-4f41-a968-0d19d337e771\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-fbhjp" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.561827 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2sq7\" (UniqueName: \"kubernetes.io/projected/d0a97575-d460-410d-84aa-887e6d809bba-kube-api-access-b2sq7\") pod \"octavia-operator-controller-manager-6d7c7ddf95-wz5dg\" (UID: \"d0a97575-d460-410d-84aa-887e6d809bba\") " pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-wz5dg" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.588755 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-zk8hf" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.589853 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rffpw\" (UniqueName: \"kubernetes.io/projected/e51cba46-23fd-4f5f-819d-c2e0ee77a743-kube-api-access-rffpw\") pod \"placement-operator-controller-manager-664664cb68-9vz2h\" (UID: \"e51cba46-23fd-4f5f-819d-c2e0ee77a743\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-9vz2h" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.590009 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnfb8\" (UniqueName: \"kubernetes.io/projected/dd2b6037-9b5d-47cb-b057-d33546b8e74c-kube-api-access-pnfb8\") pod \"swift-operator-controller-manager-5f4d5dfdc6-7gjhx\" (UID: \"dd2b6037-9b5d-47cb-b057-d33546b8e74c\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-7gjhx" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.590121 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b98r\" (UniqueName: \"kubernetes.io/projected/aff0dabb-b21e-4507-8a13-1d391b8c4f52-kube-api-access-7b98r\") pod \"telemetry-operator-controller-manager-75d7f5797c-czmc9\" (UID: \"aff0dabb-b21e-4507-8a13-1d391b8c4f52\") " pod="openstack-operators/telemetry-operator-controller-manager-75d7f5797c-czmc9" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.596962 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-qhqv6" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.631202 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-75d7f5797c-czmc9"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.637664 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rffpw\" (UniqueName: \"kubernetes.io/projected/e51cba46-23fd-4f5f-819d-c2e0ee77a743-kube-api-access-rffpw\") pod \"placement-operator-controller-manager-664664cb68-9vz2h\" (UID: \"e51cba46-23fd-4f5f-819d-c2e0ee77a743\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-9vz2h" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.637902 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnfb8\" (UniqueName: \"kubernetes.io/projected/dd2b6037-9b5d-47cb-b057-d33546b8e74c-kube-api-access-pnfb8\") pod \"swift-operator-controller-manager-5f4d5dfdc6-7gjhx\" (UID: \"dd2b6037-9b5d-47cb-b057-d33546b8e74c\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-7gjhx" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.653642 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-fbhjp" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.654946 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-74665f6cdc-c6dtc"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.656371 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-74665f6cdc-c6dtc" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.659339 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-rsj49" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.691775 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b98r\" (UniqueName: \"kubernetes.io/projected/aff0dabb-b21e-4507-8a13-1d391b8c4f52-kube-api-access-7b98r\") pod \"telemetry-operator-controller-manager-75d7f5797c-czmc9\" (UID: \"aff0dabb-b21e-4507-8a13-1d391b8c4f52\") " pod="openstack-operators/telemetry-operator-controller-manager-75d7f5797c-czmc9" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.701550 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-74665f6cdc-c6dtc"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.701846 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-8djd9" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.715848 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5dd4499c96-l9vl5"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.716950 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5dd4499c96-l9vl5" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.742806 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-9nrxx" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.746240 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b98r\" (UniqueName: \"kubernetes.io/projected/aff0dabb-b21e-4507-8a13-1d391b8c4f52-kube-api-access-7b98r\") pod \"telemetry-operator-controller-manager-75d7f5797c-czmc9\" (UID: \"aff0dabb-b21e-4507-8a13-1d391b8c4f52\") " pod="openstack-operators/telemetry-operator-controller-manager-75d7f5797c-czmc9" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.757369 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-8d2bb" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.758352 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-6kp4x" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.802458 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l2fp\" (UniqueName: \"kubernetes.io/projected/18bf5f4d-f183-41c2-b1c9-a965baab8f5d-kube-api-access-9l2fp\") pod \"test-operator-controller-manager-74665f6cdc-c6dtc\" (UID: \"18bf5f4d-f183-41c2-b1c9-a965baab8f5d\") " pod="openstack-operators/test-operator-controller-manager-74665f6cdc-c6dtc" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.803057 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-wz5dg" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.829994 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-qn2t5" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.859616 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-2sv26" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.867107 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-664664cb68-9vz2h" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.867610 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5dd4499c96-l9vl5"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.886060 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-7gjhx" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.890207 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5cb8b8594d-rkq5g"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.891684 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5cb8b8594d-rkq5g" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.893614 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.894562 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-rb7tk" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.905767 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbdsm\" (UniqueName: \"kubernetes.io/projected/04c34d21-ba2d-4418-83e2-ba162c64cc1e-kube-api-access-kbdsm\") pod \"watcher-operator-controller-manager-5dd4499c96-l9vl5\" (UID: \"04c34d21-ba2d-4418-83e2-ba162c64cc1e\") " pod="openstack-operators/watcher-operator-controller-manager-5dd4499c96-l9vl5" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.905849 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c11fdec0-87d4-41db-b5d3-66155b578abe-cert\") pod \"infra-operator-controller-manager-585fc5b659-f7hv4\" (UID: \"c11fdec0-87d4-41db-b5d3-66155b578abe\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-f7hv4" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.905927 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l2fp\" (UniqueName: \"kubernetes.io/projected/18bf5f4d-f183-41c2-b1c9-a965baab8f5d-kube-api-access-9l2fp\") pod \"test-operator-controller-manager-74665f6cdc-c6dtc\" (UID: \"18bf5f4d-f183-41c2-b1c9-a965baab8f5d\") " pod="openstack-operators/test-operator-controller-manager-74665f6cdc-c6dtc" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.915398 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/openstack-operator-controller-manager-5cb8b8594d-rkq5g"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.918485 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-75d7f5797c-czmc9" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.928624 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4msd5"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.931958 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c11fdec0-87d4-41db-b5d3-66155b578abe-cert\") pod \"infra-operator-controller-manager-585fc5b659-f7hv4\" (UID: \"c11fdec0-87d4-41db-b5d3-66155b578abe\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-f7hv4" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.933537 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4msd5" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.935803 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4msd5"] Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.938951 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l2fp\" (UniqueName: \"kubernetes.io/projected/18bf5f4d-f183-41c2-b1c9-a965baab8f5d-kube-api-access-9l2fp\") pod \"test-operator-controller-manager-74665f6cdc-c6dtc\" (UID: \"18bf5f4d-f183-41c2-b1c9-a965baab8f5d\") " pod="openstack-operators/test-operator-controller-manager-74665f6cdc-c6dtc" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.940295 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-npqg2" Oct 08 22:04:42 crc kubenswrapper[4739]: I1008 22:04:42.979446 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-2c9sc"] Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.004908 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-74665f6cdc-c6dtc" Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.007108 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/effe458d-330b-4b50-9a32-bb44bc0008ca-cert\") pod \"openstack-operator-controller-manager-5cb8b8594d-rkq5g\" (UID: \"effe458d-330b-4b50-9a32-bb44bc0008ca\") " pod="openstack-operators/openstack-operator-controller-manager-5cb8b8594d-rkq5g" Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.007252 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc6j4\" (UniqueName: \"kubernetes.io/projected/effe458d-330b-4b50-9a32-bb44bc0008ca-kube-api-access-pc6j4\") pod \"openstack-operator-controller-manager-5cb8b8594d-rkq5g\" (UID: \"effe458d-330b-4b50-9a32-bb44bc0008ca\") " pod="openstack-operators/openstack-operator-controller-manager-5cb8b8594d-rkq5g" Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.007386 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c165e9bc-4624-4227-8a87-835cbfe8a970-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67\" (UID: \"c165e9bc-4624-4227-8a87-835cbfe8a970\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67" Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.007482 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbdsm\" (UniqueName: \"kubernetes.io/projected/04c34d21-ba2d-4418-83e2-ba162c64cc1e-kube-api-access-kbdsm\") pod \"watcher-operator-controller-manager-5dd4499c96-l9vl5\" (UID: \"04c34d21-ba2d-4418-83e2-ba162c64cc1e\") " pod="openstack-operators/watcher-operator-controller-manager-5dd4499c96-l9vl5" Oct 08 22:04:43 crc kubenswrapper[4739]: E1008 
22:04:43.010606 4739 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 08 22:04:43 crc kubenswrapper[4739]: E1008 22:04:43.010664 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c165e9bc-4624-4227-8a87-835cbfe8a970-cert podName:c165e9bc-4624-4227-8a87-835cbfe8a970 nodeName:}" failed. No retries permitted until 2025-10-08 22:04:44.010649065 +0000 UTC m=+983.836034815 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c165e9bc-4624-4227-8a87-835cbfe8a970-cert") pod "openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67" (UID: "c165e9bc-4624-4227-8a87-835cbfe8a970") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.042239 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-f7hv4" Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.044527 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbdsm\" (UniqueName: \"kubernetes.io/projected/04c34d21-ba2d-4418-83e2-ba162c64cc1e-kube-api-access-kbdsm\") pod \"watcher-operator-controller-manager-5dd4499c96-l9vl5\" (UID: \"04c34d21-ba2d-4418-83e2-ba162c64cc1e\") " pod="openstack-operators/watcher-operator-controller-manager-5dd4499c96-l9vl5" Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.110591 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr4sw\" (UniqueName: \"kubernetes.io/projected/0c0df1f2-5ae8-40e5-8aa8-893d2e0081cf-kube-api-access-gr4sw\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-4msd5\" (UID: \"0c0df1f2-5ae8-40e5-8aa8-893d2e0081cf\") " 
pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4msd5" Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.110674 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/effe458d-330b-4b50-9a32-bb44bc0008ca-cert\") pod \"openstack-operator-controller-manager-5cb8b8594d-rkq5g\" (UID: \"effe458d-330b-4b50-9a32-bb44bc0008ca\") " pod="openstack-operators/openstack-operator-controller-manager-5cb8b8594d-rkq5g" Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.110720 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc6j4\" (UniqueName: \"kubernetes.io/projected/effe458d-330b-4b50-9a32-bb44bc0008ca-kube-api-access-pc6j4\") pod \"openstack-operator-controller-manager-5cb8b8594d-rkq5g\" (UID: \"effe458d-330b-4b50-9a32-bb44bc0008ca\") " pod="openstack-operators/openstack-operator-controller-manager-5cb8b8594d-rkq5g" Oct 08 22:04:43 crc kubenswrapper[4739]: E1008 22:04:43.111079 4739 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 08 22:04:43 crc kubenswrapper[4739]: E1008 22:04:43.111119 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/effe458d-330b-4b50-9a32-bb44bc0008ca-cert podName:effe458d-330b-4b50-9a32-bb44bc0008ca nodeName:}" failed. No retries permitted until 2025-10-08 22:04:43.611107041 +0000 UTC m=+983.436492791 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/effe458d-330b-4b50-9a32-bb44bc0008ca-cert") pod "openstack-operator-controller-manager-5cb8b8594d-rkq5g" (UID: "effe458d-330b-4b50-9a32-bb44bc0008ca") : secret "webhook-server-cert" not found Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.118073 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5dd4499c96-l9vl5" Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.131335 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc6j4\" (UniqueName: \"kubernetes.io/projected/effe458d-330b-4b50-9a32-bb44bc0008ca-kube-api-access-pc6j4\") pod \"openstack-operator-controller-manager-5cb8b8594d-rkq5g\" (UID: \"effe458d-330b-4b50-9a32-bb44bc0008ca\") " pod="openstack-operators/openstack-operator-controller-manager-5cb8b8594d-rkq5g" Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.212037 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr4sw\" (UniqueName: \"kubernetes.io/projected/0c0df1f2-5ae8-40e5-8aa8-893d2e0081cf-kube-api-access-gr4sw\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-4msd5\" (UID: \"0c0df1f2-5ae8-40e5-8aa8-893d2e0081cf\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4msd5" Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.230383 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr4sw\" (UniqueName: \"kubernetes.io/projected/0c0df1f2-5ae8-40e5-8aa8-893d2e0081cf-kube-api-access-gr4sw\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-4msd5\" (UID: \"0c0df1f2-5ae8-40e5-8aa8-893d2e0081cf\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4msd5" Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.322292 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-wqmv7"] Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.342534 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4msd5" Oct 08 22:04:43 crc kubenswrapper[4739]: W1008 22:04:43.343123 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8302b913_c934_4911_8c78_72d139019f33.slice/crio-a62cfb4671121284c93431726bc26040f7b16eca5458d9c67c481138d60fd68c WatchSource:0}: Error finding container a62cfb4671121284c93431726bc26040f7b16eca5458d9c67c481138d60fd68c: Status 404 returned error can't find the container with id a62cfb4671121284c93431726bc26040f7b16eca5458d9c67c481138d60fd68c Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.623606 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/effe458d-330b-4b50-9a32-bb44bc0008ca-cert\") pod \"openstack-operator-controller-manager-5cb8b8594d-rkq5g\" (UID: \"effe458d-330b-4b50-9a32-bb44bc0008ca\") " pod="openstack-operators/openstack-operator-controller-manager-5cb8b8594d-rkq5g" Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.629720 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/effe458d-330b-4b50-9a32-bb44bc0008ca-cert\") pod \"openstack-operator-controller-manager-5cb8b8594d-rkq5g\" (UID: \"effe458d-330b-4b50-9a32-bb44bc0008ca\") " pod="openstack-operators/openstack-operator-controller-manager-5cb8b8594d-rkq5g" Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.639164 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-zk8hf"] Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.648015 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-qhqv6"] Oct 08 22:04:43 crc kubenswrapper[4739]: W1008 22:04:43.662932 4739 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17ff0c7d_5595_4d2f_b77d_0f6114746fae.slice/crio-1ad26c68334e0fbb59dab5f7c247e005e69beb79cfb9917d9fd3ccaf78766888 WatchSource:0}: Error finding container 1ad26c68334e0fbb59dab5f7c247e005e69beb79cfb9917d9fd3ccaf78766888: Status 404 returned error can't find the container with id 1ad26c68334e0fbb59dab5f7c247e005e69beb79cfb9917d9fd3ccaf78766888 Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.663464 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-9ls4w"] Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.677859 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-wqmv7" event={"ID":"8302b913-c934-4911-8c78-72d139019f33","Type":"ContainerStarted","Data":"a62cfb4671121284c93431726bc26040f7b16eca5458d9c67c481138d60fd68c"} Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.686479 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-qhqv6" event={"ID":"17ff0c7d-5595-4d2f-b77d-0f6114746fae","Type":"ContainerStarted","Data":"1ad26c68334e0fbb59dab5f7c247e005e69beb79cfb9917d9fd3ccaf78766888"} Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.706942 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-2c9sc" event={"ID":"4ecad090-144b-491d-9307-dd0d2db07490","Type":"ContainerStarted","Data":"363510a60ee98e44df9d45af33414e1a4173c62d1049d6f1d1c82206f4950a0a"} Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.711069 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-8d2bb"] Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.721907 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-zk8hf" event={"ID":"5aa8c8d1-9588-4e0f-87e2-b44b072bef76","Type":"ContainerStarted","Data":"e11bb0620fd242e8016a837f563659e2788ab4c0d579821884ceb7414cd5f2e4"} Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.722513 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-8djd9"] Oct 08 22:04:43 crc kubenswrapper[4739]: W1008 22:04:43.722784 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfae90c53_9891_4664_8767_98bfab1e021a.slice/crio-e45aeb301ea33773117a949a3483c4c4b0ebae8766da98cbc542c60888a4834d WatchSource:0}: Error finding container e45aeb301ea33773117a949a3483c4c4b0ebae8766da98cbc542c60888a4834d: Status 404 returned error can't find the container with id e45aeb301ea33773117a949a3483c4c4b0ebae8766da98cbc542c60888a4834d Oct 08 22:04:43 crc kubenswrapper[4739]: W1008 22:04:43.726942 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18bf5f4d_f183_41c2_b1c9_a965baab8f5d.slice/crio-eb909759c7f5a931d03c09602fbce9e4292305709568425cdc338a57e40f3d16 WatchSource:0}: Error finding container eb909759c7f5a931d03c09602fbce9e4292305709568425cdc338a57e40f3d16: Status 404 returned error can't find the container with id eb909759c7f5a931d03c09602fbce9e4292305709568425cdc338a57e40f3d16 Oct 08 22:04:43 crc kubenswrapper[4739]: W1008 22:04:43.730648 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf038b58d_e69a_481d_a0df_65211386c9da.slice/crio-1394283be05311708a5c30752a7f9bced4ffa149be6549c7dfb806ea74ebd6ad WatchSource:0}: Error finding container 1394283be05311708a5c30752a7f9bced4ffa149be6549c7dfb806ea74ebd6ad: Status 404 returned error can't find the container with id 
1394283be05311708a5c30752a7f9bced4ffa149be6549c7dfb806ea74ebd6ad Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.732204 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-xnfkp"] Oct 08 22:04:43 crc kubenswrapper[4739]: W1008 22:04:43.736004 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf67473b_9a22_492a_844d_552fc946605d.slice/crio-4f8aed8f6e13738a28d72dcfc573b844b8f98483c053223b2ca3541d450646b8 WatchSource:0}: Error finding container 4f8aed8f6e13738a28d72dcfc573b844b8f98483c053223b2ca3541d450646b8: Status 404 returned error can't find the container with id 4f8aed8f6e13738a28d72dcfc573b844b8f98483c053223b2ca3541d450646b8 Oct 08 22:04:43 crc kubenswrapper[4739]: E1008 22:04:43.738731 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:582f7b1e411961b69f2e3c6b346aa25759b89f7720ed3fade1d363bf5d2dffc8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mwrpd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 
8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-59578bc799-l66d4_openstack-operators(cf67473b-9a22-492a-844d-552fc946605d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.738759 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-9vz2h"] Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.744718 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-74665f6cdc-c6dtc"] Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.750783 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-fbhjp"] Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.754575 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/manila-operator-controller-manager-59578bc799-l66d4"] Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.765505 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-6kp4x"] Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.865841 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-7gjhx"] Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.874573 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-wz5dg"] Oct 08 22:04:43 crc kubenswrapper[4739]: W1008 22:04:43.877888 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd2b6037_9b5d_47cb_b057_d33546b8e74c.slice/crio-e8093e0b4151ade429ee4ca9b4ee64d2b7b3f74f5c06d36dc05b8fb535ed8817 WatchSource:0}: Error finding container e8093e0b4151ade429ee4ca9b4ee64d2b7b3f74f5c06d36dc05b8fb535ed8817: Status 404 returned error can't find the container with id e8093e0b4151ade429ee4ca9b4ee64d2b7b3f74f5c06d36dc05b8fb535ed8817 Oct 08 22:04:43 crc kubenswrapper[4739]: W1008 22:04:43.884904 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc11fdec0_87d4_41db_b5d3_66155b578abe.slice/crio-36071b23013883f7af2932fcf2d8674c77e924c402d5e822c8e2e2d8e76c0cff WatchSource:0}: Error finding container 36071b23013883f7af2932fcf2d8674c77e924c402d5e822c8e2e2d8e76c0cff: Status 404 returned error can't find the container with id 36071b23013883f7af2932fcf2d8674c77e924c402d5e822c8e2e2d8e76c0cff Oct 08 22:04:43 crc kubenswrapper[4739]: W1008 22:04:43.886551 4739 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0a97575_d460_410d_84aa_887e6d809bba.slice/crio-78b21ae2e09c9d60452e0eded17a4191edfa2fa6f5b93abd0ee79fd58737f6f9 WatchSource:0}: Error finding container 78b21ae2e09c9d60452e0eded17a4191edfa2fa6f5b93abd0ee79fd58737f6f9: Status 404 returned error can't find the container with id 78b21ae2e09c9d60452e0eded17a4191edfa2fa6f5b93abd0ee79fd58737f6f9 Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.887924 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-f7hv4"] Oct 08 22:04:43 crc kubenswrapper[4739]: E1008 22:04:43.891826 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:5cfb2ae1092445950b39dd59caa9a8c9367f42fb8353a8c3848d3bc729f24492,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c9qwb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-585fc5b659-f7hv4_openstack-operators(c11fdec0-87d4-41db-b5d3-66155b578abe): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 08 22:04:43 crc kubenswrapper[4739]: E1008 22:04:43.892795 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:09deecf840d38ff6af3c924729cf0a9444bc985848bfbe7c918019b88a6bc4d7,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: 
{{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b2sq7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-6d7c7ddf95-wz5dg_openstack-operators(d0a97575-d460-410d-84aa-887e6d809bba): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.893972 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-75d7f5797c-czmc9"] Oct 08 22:04:43 
crc kubenswrapper[4739]: I1008 22:04:43.902530 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-2sv26"] Oct 08 22:04:43 crc kubenswrapper[4739]: W1008 22:04:43.903308 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41df7676_d0f5_47a1_a90c_2bc3bc01e18d.slice/crio-5875d680aed67cd6905cf5939059e6066507ae4e2e7abc8d1a6ee373a0cf53ac WatchSource:0}: Error finding container 5875d680aed67cd6905cf5939059e6066507ae4e2e7abc8d1a6ee373a0cf53ac: Status 404 returned error can't find the container with id 5875d680aed67cd6905cf5939059e6066507ae4e2e7abc8d1a6ee373a0cf53ac Oct 08 22:04:43 crc kubenswrapper[4739]: E1008 22:04:43.907050 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:551b59e107c9812f7ad7aa06577376b0dcb58ff9498a41d5d5273e60e20ba7e4,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vjhnl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-6f96f8c84-qn2t5_openstack-operators(41df7676-d0f5-47a1-a90c-2bc3bc01e18d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 08 22:04:43 crc kubenswrapper[4739]: W1008 22:04:43.908562 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04c34d21_ba2d_4418_83e2_ba162c64cc1e.slice/crio-f0590d66505f6b4f5bd2eb2ae532cc44c9a40a5d2bb606c6d0b2a46018d38f98 WatchSource:0}: Error finding container f0590d66505f6b4f5bd2eb2ae532cc44c9a40a5d2bb606c6d0b2a46018d38f98: Status 404 returned error can't find the container with id f0590d66505f6b4f5bd2eb2ae532cc44c9a40a5d2bb606c6d0b2a46018d38f98 Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.909255 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/ovn-operator-controller-manager-6f96f8c84-qn2t5"] Oct 08 22:04:43 crc kubenswrapper[4739]: W1008 22:04:43.912754 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod102e8f33_2000_4a71_a337_4fa304d59e93.slice/crio-345be5a6d37da3ccbc59080aab3c0e67871eda7c5cde0f41e6654b401f028381 WatchSource:0}: Error finding container 345be5a6d37da3ccbc59080aab3c0e67871eda7c5cde0f41e6654b401f028381: Status 404 returned error can't find the container with id 345be5a6d37da3ccbc59080aab3c0e67871eda7c5cde0f41e6654b401f028381 Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.914329 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5dd4499c96-l9vl5"] Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.925726 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4msd5"] Oct 08 22:04:43 crc kubenswrapper[4739]: E1008 22:04:43.926248 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:e4ae07e859166fc5e2cb4f8e0e2c3358b9d2e2d6721a3864d2e0c651d36698ca,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kbdsm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-5dd4499c96-l9vl5_openstack-operators(04c34d21-ba2d-4418-83e2-ba162c64cc1e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 08 22:04:43 crc kubenswrapper[4739]: E1008 22:04:43.926436 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:47278ed28e02df00892f941763aa0d69547327318e8a983e07f4577acd288167,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nkcqp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
mariadb-operator-controller-manager-5777b4f897-2sv26_openstack-operators(102e8f33-2000-4a71-a337-4fa304d59e93): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 08 22:04:43 crc kubenswrapper[4739]: E1008 22:04:43.926543 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.59:5001/openstack-k8s-operators/telemetry-operator:05d49bfa319c2fba99786e74abdbfde2867edf75,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7b98r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-75d7f5797c-czmc9_openstack-operators(aff0dabb-b21e-4507-8a13-1d391b8c4f52): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 08 22:04:43 crc kubenswrapper[4739]: I1008 22:04:43.927295 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5cb8b8594d-rkq5g" Oct 08 22:04:43 crc kubenswrapper[4739]: E1008 22:04:43.942018 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gr4sw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-4msd5_openstack-operators(0c0df1f2-5ae8-40e5-8aa8-893d2e0081cf): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 08 22:04:43 crc kubenswrapper[4739]: E1008 22:04:43.943119 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4msd5" podUID="0c0df1f2-5ae8-40e5-8aa8-893d2e0081cf" Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.028526 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c165e9bc-4624-4227-8a87-835cbfe8a970-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67\" (UID: \"c165e9bc-4624-4227-8a87-835cbfe8a970\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67" Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.033630 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/c165e9bc-4624-4227-8a87-835cbfe8a970-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67\" (UID: \"c165e9bc-4624-4227-8a87-835cbfe8a970\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67" Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.047273 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67" Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.420781 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5cb8b8594d-rkq5g"] Oct 08 22:04:44 crc kubenswrapper[4739]: W1008 22:04:44.425003 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeffe458d_330b_4b50_9a32_bb44bc0008ca.slice/crio-ffd4f558d552b684831f32e5de9d824e5b2a90f23642da32633c53070c12e1d6 WatchSource:0}: Error finding container ffd4f558d552b684831f32e5de9d824e5b2a90f23642da32633c53070c12e1d6: Status 404 returned error can't find the container with id ffd4f558d552b684831f32e5de9d824e5b2a90f23642da32633c53070c12e1d6 Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.526282 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67"] Oct 08 22:04:44 crc kubenswrapper[4739]: E1008 22:04:44.656739 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-5dd4499c96-l9vl5" podUID="04c34d21-ba2d-4418-83e2-ba162c64cc1e" Oct 08 22:04:44 crc kubenswrapper[4739]: E1008 22:04:44.656848 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS 
exceeded\"" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-f7hv4" podUID="c11fdec0-87d4-41db-b5d3-66155b578abe" Oct 08 22:04:44 crc kubenswrapper[4739]: E1008 22:04:44.656944 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-wz5dg" podUID="d0a97575-d460-410d-84aa-887e6d809bba" Oct 08 22:04:44 crc kubenswrapper[4739]: E1008 22:04:44.697948 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-59578bc799-l66d4" podUID="cf67473b-9a22-492a-844d-552fc946605d" Oct 08 22:04:44 crc kubenswrapper[4739]: E1008 22:04:44.698323 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-2sv26" podUID="102e8f33-2000-4a71-a337-4fa304d59e93" Oct 08 22:04:44 crc kubenswrapper[4739]: E1008 22:04:44.698489 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-75d7f5797c-czmc9" podUID="aff0dabb-b21e-4507-8a13-1d391b8c4f52" Oct 08 22:04:44 crc kubenswrapper[4739]: E1008 22:04:44.698510 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-qn2t5" podUID="41df7676-d0f5-47a1-a90c-2bc3bc01e18d" Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.731455 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/manila-operator-controller-manager-59578bc799-l66d4" event={"ID":"cf67473b-9a22-492a-844d-552fc946605d","Type":"ContainerStarted","Data":"861f6c6bf8262945423a6552a95a5fe770bc4cef9b02cfe794e089ced7c7d561"} Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.731505 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-59578bc799-l66d4" event={"ID":"cf67473b-9a22-492a-844d-552fc946605d","Type":"ContainerStarted","Data":"4f8aed8f6e13738a28d72dcfc573b844b8f98483c053223b2ca3541d450646b8"} Oct 08 22:04:44 crc kubenswrapper[4739]: E1008 22:04:44.732912 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:582f7b1e411961b69f2e3c6b346aa25759b89f7720ed3fade1d363bf5d2dffc8\\\"\"" pod="openstack-operators/manila-operator-controller-manager-59578bc799-l66d4" podUID="cf67473b-9a22-492a-844d-552fc946605d" Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.733639 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-74665f6cdc-c6dtc" event={"ID":"18bf5f4d-f183-41c2-b1c9-a965baab8f5d","Type":"ContainerStarted","Data":"eb909759c7f5a931d03c09602fbce9e4292305709568425cdc338a57e40f3d16"} Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.743818 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-75d7f5797c-czmc9" event={"ID":"aff0dabb-b21e-4507-8a13-1d391b8c4f52","Type":"ContainerStarted","Data":"3d513755acbc5514b5a5b9e97b77862409b1f11d4a9bf9992b9876edefb5c428"} Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.743868 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-75d7f5797c-czmc9" 
event={"ID":"aff0dabb-b21e-4507-8a13-1d391b8c4f52","Type":"ContainerStarted","Data":"c7155e2f00606825c3b950b6d6b423e5a7a0cad0a7e58c05e66bf5b0c6cd818f"} Oct 08 22:04:44 crc kubenswrapper[4739]: E1008 22:04:44.744887 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.59:5001/openstack-k8s-operators/telemetry-operator:05d49bfa319c2fba99786e74abdbfde2867edf75\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-75d7f5797c-czmc9" podUID="aff0dabb-b21e-4507-8a13-1d391b8c4f52" Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.745095 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-664664cb68-9vz2h" event={"ID":"e51cba46-23fd-4f5f-819d-c2e0ee77a743","Type":"ContainerStarted","Data":"ed5d90f53f0a3f1acc434441f1771b4155d47f8a21b0891ead26c1f77b759a87"} Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.746120 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-7gjhx" event={"ID":"dd2b6037-9b5d-47cb-b057-d33546b8e74c","Type":"ContainerStarted","Data":"e8093e0b4151ade429ee4ca9b4ee64d2b7b3f74f5c06d36dc05b8fb535ed8817"} Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.746825 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67" event={"ID":"c165e9bc-4624-4227-8a87-835cbfe8a970","Type":"ContainerStarted","Data":"1f2c62fe96216450c47ad709c75a91c81e52fe43c41149244a807c05df9c30ee"} Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.748031 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-2sv26" 
event={"ID":"102e8f33-2000-4a71-a337-4fa304d59e93","Type":"ContainerStarted","Data":"56ab6ef3089bc7061b700220a28d3c804672b6e373a1c086b162318fad7c45d6"} Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.748078 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-2sv26" event={"ID":"102e8f33-2000-4a71-a337-4fa304d59e93","Type":"ContainerStarted","Data":"345be5a6d37da3ccbc59080aab3c0e67871eda7c5cde0f41e6654b401f028381"} Oct 08 22:04:44 crc kubenswrapper[4739]: E1008 22:04:44.748938 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:47278ed28e02df00892f941763aa0d69547327318e8a983e07f4577acd288167\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-2sv26" podUID="102e8f33-2000-4a71-a337-4fa304d59e93" Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.749539 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-6kp4x" event={"ID":"f038b58d-e69a-481d-a0df-65211386c9da","Type":"ContainerStarted","Data":"1394283be05311708a5c30752a7f9bced4ffa149be6549c7dfb806ea74ebd6ad"} Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.752926 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-fbhjp" event={"ID":"7e483c3d-debb-4f41-a968-0d19d337e771","Type":"ContainerStarted","Data":"9eada1975e3ce5e4f6fdbad341250634a45abb63bc5cf10b6209932bfb2c8b5d"} Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.754858 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-9ls4w" 
event={"ID":"58ceaa51-6704-4f1d-8aa0-2053f1c7c89d","Type":"ContainerStarted","Data":"7104ec6b940e94f2b17ec2777c4bcfb54dce8ad7e22d91594cc59f9fb4fafbea"} Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.758071 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-wz5dg" event={"ID":"d0a97575-d460-410d-84aa-887e6d809bba","Type":"ContainerStarted","Data":"12e095c4516eb21bc448318976c0001dca253df97f4f22b0e7f9e1ef3fc4fb2a"} Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.758116 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-wz5dg" event={"ID":"d0a97575-d460-410d-84aa-887e6d809bba","Type":"ContainerStarted","Data":"78b21ae2e09c9d60452e0eded17a4191edfa2fa6f5b93abd0ee79fd58737f6f9"} Oct 08 22:04:44 crc kubenswrapper[4739]: E1008 22:04:44.761305 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:09deecf840d38ff6af3c924729cf0a9444bc985848bfbe7c918019b88a6bc4d7\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-wz5dg" podUID="d0a97575-d460-410d-84aa-887e6d809bba" Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.765620 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5dd4499c96-l9vl5" event={"ID":"04c34d21-ba2d-4418-83e2-ba162c64cc1e","Type":"ContainerStarted","Data":"3a2ea74a886786a4df47c438f63185bac43a3817c14e39e4d8c08a9835ec9cc7"} Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.765652 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5dd4499c96-l9vl5" 
event={"ID":"04c34d21-ba2d-4418-83e2-ba162c64cc1e","Type":"ContainerStarted","Data":"f0590d66505f6b4f5bd2eb2ae532cc44c9a40a5d2bb606c6d0b2a46018d38f98"} Oct 08 22:04:44 crc kubenswrapper[4739]: E1008 22:04:44.768584 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:e4ae07e859166fc5e2cb4f8e0e2c3358b9d2e2d6721a3864d2e0c651d36698ca\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5dd4499c96-l9vl5" podUID="04c34d21-ba2d-4418-83e2-ba162c64cc1e" Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.775480 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-f7hv4" event={"ID":"c11fdec0-87d4-41db-b5d3-66155b578abe","Type":"ContainerStarted","Data":"0ec73012fdf901343af41d5e66c7457cfe04b635c5b82557ccf59c12b28d470b"} Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.775547 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-f7hv4" event={"ID":"c11fdec0-87d4-41db-b5d3-66155b578abe","Type":"ContainerStarted","Data":"36071b23013883f7af2932fcf2d8674c77e924c402d5e822c8e2e2d8e76c0cff"} Oct 08 22:04:44 crc kubenswrapper[4739]: E1008 22:04:44.778453 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:5cfb2ae1092445950b39dd59caa9a8c9367f42fb8353a8c3848d3bc729f24492\\\"\"" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-f7hv4" podUID="c11fdec0-87d4-41db-b5d3-66155b578abe" Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.779636 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5cb8b8594d-rkq5g" 
event={"ID":"effe458d-330b-4b50-9a32-bb44bc0008ca","Type":"ContainerStarted","Data":"0ddcd2804fc5de4c187668e465612aed126d25ae722da0ee3a6fea85f45ae3da"} Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.779674 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5cb8b8594d-rkq5g" event={"ID":"effe458d-330b-4b50-9a32-bb44bc0008ca","Type":"ContainerStarted","Data":"ffd4f558d552b684831f32e5de9d824e5b2a90f23642da32633c53070c12e1d6"} Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.798634 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4msd5" event={"ID":"0c0df1f2-5ae8-40e5-8aa8-893d2e0081cf","Type":"ContainerStarted","Data":"9a84d2e0129fbafe486faece23591b0dd09540b2cc80781335ef2569fc5d63c5"} Oct 08 22:04:44 crc kubenswrapper[4739]: E1008 22:04:44.818564 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4msd5" podUID="0c0df1f2-5ae8-40e5-8aa8-893d2e0081cf" Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.842788 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-xnfkp" event={"ID":"fae90c53-9891-4664-8767-98bfab1e021a","Type":"ContainerStarted","Data":"e45aeb301ea33773117a949a3483c4c4b0ebae8766da98cbc542c60888a4834d"} Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.874307 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-qn2t5" 
event={"ID":"41df7676-d0f5-47a1-a90c-2bc3bc01e18d","Type":"ContainerStarted","Data":"78fb8cd639638e55276b82794f35048c2a44d2f90581c71f8ea0f1b0a8f23121"} Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.874354 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-qn2t5" event={"ID":"41df7676-d0f5-47a1-a90c-2bc3bc01e18d","Type":"ContainerStarted","Data":"5875d680aed67cd6905cf5939059e6066507ae4e2e7abc8d1a6ee373a0cf53ac"} Oct 08 22:04:44 crc kubenswrapper[4739]: E1008 22:04:44.880662 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:551b59e107c9812f7ad7aa06577376b0dcb58ff9498a41d5d5273e60e20ba7e4\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-qn2t5" podUID="41df7676-d0f5-47a1-a90c-2bc3bc01e18d" Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.901503 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-8d2bb" event={"ID":"23d030c7-6a61-4ba5-9b00-018f7370ea5d","Type":"ContainerStarted","Data":"a1b27a24d35108adf974a1fb5dfdd0b8588a2b0a18e89eaa3ea7dfec03547ce7"} Oct 08 22:04:44 crc kubenswrapper[4739]: I1008 22:04:44.927373 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-8djd9" event={"ID":"dcd83df0-3381-4d08-9818-e7f91ba6f77b","Type":"ContainerStarted","Data":"ce9c2c2a87b26fa5667d43655b1b7aeda29ab1adb8a84e3b6c06522e41ec11d8"} Oct 08 22:04:45 crc kubenswrapper[4739]: I1008 22:04:45.941386 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5cb8b8594d-rkq5g" 
event={"ID":"effe458d-330b-4b50-9a32-bb44bc0008ca","Type":"ContainerStarted","Data":"cfdd024ac1d6a280b7af73f9879284dcf7d6a56312517afd238a73a262027ce9"} Oct 08 22:04:45 crc kubenswrapper[4739]: I1008 22:04:45.941775 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5cb8b8594d-rkq5g" Oct 08 22:04:45 crc kubenswrapper[4739]: E1008 22:04:45.944422 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:47278ed28e02df00892f941763aa0d69547327318e8a983e07f4577acd288167\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-2sv26" podUID="102e8f33-2000-4a71-a337-4fa304d59e93" Oct 08 22:04:45 crc kubenswrapper[4739]: E1008 22:04:45.944433 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4msd5" podUID="0c0df1f2-5ae8-40e5-8aa8-893d2e0081cf" Oct 08 22:04:45 crc kubenswrapper[4739]: E1008 22:04:45.944695 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.59:5001/openstack-k8s-operators/telemetry-operator:05d49bfa319c2fba99786e74abdbfde2867edf75\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-75d7f5797c-czmc9" podUID="aff0dabb-b21e-4507-8a13-1d391b8c4f52" Oct 08 22:04:45 crc kubenswrapper[4739]: E1008 22:04:45.944699 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:e4ae07e859166fc5e2cb4f8e0e2c3358b9d2e2d6721a3864d2e0c651d36698ca\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5dd4499c96-l9vl5" podUID="04c34d21-ba2d-4418-83e2-ba162c64cc1e" Oct 08 22:04:45 crc kubenswrapper[4739]: E1008 22:04:45.944743 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:582f7b1e411961b69f2e3c6b346aa25759b89f7720ed3fade1d363bf5d2dffc8\\\"\"" pod="openstack-operators/manila-operator-controller-manager-59578bc799-l66d4" podUID="cf67473b-9a22-492a-844d-552fc946605d" Oct 08 22:04:45 crc kubenswrapper[4739]: E1008 22:04:45.944749 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:09deecf840d38ff6af3c924729cf0a9444bc985848bfbe7c918019b88a6bc4d7\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-wz5dg" podUID="d0a97575-d460-410d-84aa-887e6d809bba" Oct 08 22:04:45 crc kubenswrapper[4739]: E1008 22:04:45.945046 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:5cfb2ae1092445950b39dd59caa9a8c9367f42fb8353a8c3848d3bc729f24492\\\"\"" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-f7hv4" podUID="c11fdec0-87d4-41db-b5d3-66155b578abe" Oct 08 22:04:45 crc kubenswrapper[4739]: E1008 22:04:45.945255 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:551b59e107c9812f7ad7aa06577376b0dcb58ff9498a41d5d5273e60e20ba7e4\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-qn2t5" podUID="41df7676-d0f5-47a1-a90c-2bc3bc01e18d" Oct 08 22:04:46 crc kubenswrapper[4739]: I1008 22:04:46.050607 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5cb8b8594d-rkq5g" podStartSLOduration=4.050591094 podStartE2EDuration="4.050591094s" podCreationTimestamp="2025-10-08 22:04:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:04:46.031170236 +0000 UTC m=+985.856555986" watchObservedRunningTime="2025-10-08 22:04:46.050591094 +0000 UTC m=+985.875976834" Oct 08 22:04:51 crc kubenswrapper[4739]: I1008 22:04:51.773312 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:04:51 crc kubenswrapper[4739]: I1008 22:04:51.773715 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:04:53 crc kubenswrapper[4739]: I1008 22:04:53.937104 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5cb8b8594d-rkq5g" Oct 08 22:04:57 crc kubenswrapper[4739]: E1008 22:04:57.163202 4739 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context 
canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:efa8fb78cffb573d299ffcc7bab1099affd2dbbab222152092b313074306e0a9" Oct 08 22:04:57 crc kubenswrapper[4739]: E1008 22:04:57.163778 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:efa8fb78cffb573d299ffcc7bab1099affd2dbbab222152092b313074306e0a9,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9l2fp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-74665f6cdc-c6dtc_openstack-operators(18bf5f4d-f183-41c2-b1c9-a965baab8f5d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 22:05:08 crc kubenswrapper[4739]: E1008 22:05:08.314331 4739 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:73736f216f886549901fbcfc823b072f73691c9a79ec79e59d100e992b9c1e34" Oct 08 22:05:08 crc kubenswrapper[4739]: E1008 22:05:08.315621 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:73736f216f886549901fbcfc823b072f73691c9a79ec79e59d100e992b9c1e34,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zg9tj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
designate-operator-controller-manager-687df44cdb-qhqv6_openstack-operators(17ff0c7d-5595-4d2f-b77d-0f6114746fae): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 22:05:09 crc kubenswrapper[4739]: E1008 22:05:09.972060 4739 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:3cc6bba71197ddf88dd4ba1301542bacbc1fe12e6faab2b69e6960944b3d74a0" Oct 08 22:05:09 crc kubenswrapper[4739]: E1008 22:05:09.972864 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:3cc6bba71197ddf88dd4ba1301542bacbc1fe12e6faab2b69e6960944b3d74a0,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kzlqd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-7bb46cd7d-wqmv7_openstack-operators(8302b913-c934-4911-8c78-72d139019f33): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 22:05:10 crc kubenswrapper[4739]: E1008 22:05:10.437091 4739 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e" Oct 08 22:05:10 crc kubenswrapper[4739]: E1008 22:05:10.437274 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pnfb8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
swift-operator-controller-manager-5f4d5dfdc6-7gjhx_openstack-operators(dd2b6037-9b5d-47cb-b057-d33546b8e74c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 22:05:11 crc kubenswrapper[4739]: E1008 22:05:11.185561 4739 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:c487a793648e64af2d64df5f6efbda2d4fd586acd7aee6838d3ec2b3edd9efb9" Oct 08 22:05:11 crc kubenswrapper[4739]: E1008 22:05:11.185708 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:c487a793648e64af2d64df5f6efbda2d4fd586acd7aee6838d3ec2b3edd9efb9,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-scnpv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-59cdc64769-zk8hf_openstack-operators(5aa8c8d1-9588-4e0f-87e2-b44b072bef76): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 22:05:11 crc kubenswrapper[4739]: E1008 22:05:11.931761 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-74665f6cdc-c6dtc" podUID="18bf5f4d-f183-41c2-b1c9-a965baab8f5d" Oct 08 22:05:11 crc kubenswrapper[4739]: E1008 22:05:11.957805 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-wqmv7" podUID="8302b913-c934-4911-8c78-72d139019f33" Oct 08 22:05:11 crc kubenswrapper[4739]: E1008 22:05:11.958030 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-zk8hf" podUID="5aa8c8d1-9588-4e0f-87e2-b44b072bef76" Oct 08 22:05:11 crc kubenswrapper[4739]: E1008 22:05:11.958217 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-qhqv6" podUID="17ff0c7d-5595-4d2f-b77d-0f6114746fae" Oct 08 22:05:11 crc kubenswrapper[4739]: E1008 22:05:11.958264 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-7gjhx" podUID="dd2b6037-9b5d-47cb-b057-d33546b8e74c" Oct 08 22:05:12 crc kubenswrapper[4739]: I1008 22:05:12.216262 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-qhqv6" event={"ID":"17ff0c7d-5595-4d2f-b77d-0f6114746fae","Type":"ContainerStarted","Data":"d59e3b83658d40f1b164c54ba991725f9732550d50e3956c66409a8b49779655"} Oct 08 22:05:12 crc kubenswrapper[4739]: E1008 22:05:12.217788 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:73736f216f886549901fbcfc823b072f73691c9a79ec79e59d100e992b9c1e34\\\"\"" 
pod="openstack-operators/designate-operator-controller-manager-687df44cdb-qhqv6" podUID="17ff0c7d-5595-4d2f-b77d-0f6114746fae" Oct 08 22:05:12 crc kubenswrapper[4739]: I1008 22:05:12.218689 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-7gjhx" event={"ID":"dd2b6037-9b5d-47cb-b057-d33546b8e74c","Type":"ContainerStarted","Data":"a554337232a04c5100f51ee41cf84508c344f08b66ff3fc1c62d1ac224cb641b"} Oct 08 22:05:12 crc kubenswrapper[4739]: E1008 22:05:12.220338 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-7gjhx" podUID="dd2b6037-9b5d-47cb-b057-d33546b8e74c" Oct 08 22:05:12 crc kubenswrapper[4739]: I1008 22:05:12.220980 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-wqmv7" event={"ID":"8302b913-c934-4911-8c78-72d139019f33","Type":"ContainerStarted","Data":"4afbc71c98cb0c5d81cf943be60177db96be801de3da23dec061865b45956798"} Oct 08 22:05:12 crc kubenswrapper[4739]: E1008 22:05:12.221796 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:3cc6bba71197ddf88dd4ba1301542bacbc1fe12e6faab2b69e6960944b3d74a0\\\"\"" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-wqmv7" podUID="8302b913-c934-4911-8c78-72d139019f33" Oct 08 22:05:12 crc kubenswrapper[4739]: I1008 22:05:12.222442 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-74665f6cdc-c6dtc" 
event={"ID":"18bf5f4d-f183-41c2-b1c9-a965baab8f5d","Type":"ContainerStarted","Data":"be3b6034a204a629714282e5d571ed533d73a6ddc8b2e2ee66d3364087213103"} Oct 08 22:05:12 crc kubenswrapper[4739]: I1008 22:05:12.223615 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-zk8hf" event={"ID":"5aa8c8d1-9588-4e0f-87e2-b44b072bef76","Type":"ContainerStarted","Data":"5bda6b4b12e1470bdcd37a03e39a9238403e16758cf7a355f06d3696724b7218"} Oct 08 22:05:12 crc kubenswrapper[4739]: E1008 22:05:12.224503 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:c487a793648e64af2d64df5f6efbda2d4fd586acd7aee6838d3ec2b3edd9efb9\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-zk8hf" podUID="5aa8c8d1-9588-4e0f-87e2-b44b072bef76" Oct 08 22:05:12 crc kubenswrapper[4739]: I1008 22:05:12.224802 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-xnfkp" event={"ID":"fae90c53-9891-4664-8767-98bfab1e021a","Type":"ContainerStarted","Data":"ebe06ea7c2115f394f18eec1a29eb77d5d9b4081c9b5f8f878372636f8129a45"} Oct 08 22:05:13 crc kubenswrapper[4739]: E1008 22:05:13.232959 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-7gjhx" podUID="dd2b6037-9b5d-47cb-b057-d33546b8e74c" Oct 08 22:05:13 crc kubenswrapper[4739]: E1008 22:05:13.232986 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling 
image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:c487a793648e64af2d64df5f6efbda2d4fd586acd7aee6838d3ec2b3edd9efb9\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-zk8hf" podUID="5aa8c8d1-9588-4e0f-87e2-b44b072bef76" Oct 08 22:05:13 crc kubenswrapper[4739]: E1008 22:05:13.234103 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:3cc6bba71197ddf88dd4ba1301542bacbc1fe12e6faab2b69e6960944b3d74a0\\\"\"" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-wqmv7" podUID="8302b913-c934-4911-8c78-72d139019f33" Oct 08 22:05:13 crc kubenswrapper[4739]: E1008 22:05:13.237872 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:73736f216f886549901fbcfc823b072f73691c9a79ec79e59d100e992b9c1e34\\\"\"" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-qhqv6" podUID="17ff0c7d-5595-4d2f-b77d-0f6114746fae" Oct 08 22:05:17 crc kubenswrapper[4739]: I1008 22:05:17.262660 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-2c9sc" event={"ID":"4ecad090-144b-491d-9307-dd0d2db07490","Type":"ContainerStarted","Data":"20c05fc2531f64a006cd6ebb2099cb2462c1b61a9eb8282be602e8b077fccec4"} Oct 08 22:05:21 crc kubenswrapper[4739]: I1008 22:05:21.317782 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-2c9sc" event={"ID":"4ecad090-144b-491d-9307-dd0d2db07490","Type":"ContainerStarted","Data":"7807a8aaab79fda7064e421118eb2d797a6d53dc7a0aa597908d35a7df931475"} Oct 08 22:05:21 crc kubenswrapper[4739]: I1008 22:05:21.318440 4739 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-2c9sc" Oct 08 22:05:21 crc kubenswrapper[4739]: I1008 22:05:21.327454 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-8d2bb" event={"ID":"23d030c7-6a61-4ba5-9b00-018f7370ea5d","Type":"ContainerStarted","Data":"2abb9dcb2b604b593093681d06e10e10bebb5e38406f821ad552387f3b37d66a"} Oct 08 22:05:21 crc kubenswrapper[4739]: I1008 22:05:21.341765 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-8djd9" event={"ID":"dcd83df0-3381-4d08-9818-e7f91ba6f77b","Type":"ContainerStarted","Data":"b5682870b9e5956e17d2637bb1140ef0173fdc2e1ebb30ac0752f089482185f5"} Oct 08 22:05:21 crc kubenswrapper[4739]: I1008 22:05:21.341808 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-8djd9" event={"ID":"dcd83df0-3381-4d08-9818-e7f91ba6f77b","Type":"ContainerStarted","Data":"814d5e7acc0ab5114134d55f22715a781370a81e9026f6e220bf7b7b4b8603ab"} Oct 08 22:05:21 crc kubenswrapper[4739]: I1008 22:05:21.342558 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-8djd9" Oct 08 22:05:21 crc kubenswrapper[4739]: I1008 22:05:21.357791 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-59578bc799-l66d4" event={"ID":"cf67473b-9a22-492a-844d-552fc946605d","Type":"ContainerStarted","Data":"7009c27de545527ef2fc646b9d15d4686ace5df69ffd68da35b5f2e8cfb825e9"} Oct 08 22:05:21 crc kubenswrapper[4739]: I1008 22:05:21.358552 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-59578bc799-l66d4" Oct 08 22:05:21 crc kubenswrapper[4739]: I1008 
22:05:21.361055 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-2c9sc" podStartSLOduration=11.773657888 podStartE2EDuration="40.361033235s" podCreationTimestamp="2025-10-08 22:04:41 +0000 UTC" firstStartedPulling="2025-10-08 22:04:43.054573128 +0000 UTC m=+982.879958878" lastFinishedPulling="2025-10-08 22:05:11.641948475 +0000 UTC m=+1011.467334225" observedRunningTime="2025-10-08 22:05:21.343097032 +0000 UTC m=+1021.168482782" watchObservedRunningTime="2025-10-08 22:05:21.361033235 +0000 UTC m=+1021.186418985" Oct 08 22:05:21 crc kubenswrapper[4739]: I1008 22:05:21.368497 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-8djd9" podStartSLOduration=11.39931121 podStartE2EDuration="39.368473338s" podCreationTimestamp="2025-10-08 22:04:42 +0000 UTC" firstStartedPulling="2025-10-08 22:04:43.706650132 +0000 UTC m=+983.532035882" lastFinishedPulling="2025-10-08 22:05:11.67581226 +0000 UTC m=+1011.501198010" observedRunningTime="2025-10-08 22:05:21.365603797 +0000 UTC m=+1021.190989547" watchObservedRunningTime="2025-10-08 22:05:21.368473338 +0000 UTC m=+1021.193859098" Oct 08 22:05:21 crc kubenswrapper[4739]: I1008 22:05:21.369782 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-664664cb68-9vz2h" event={"ID":"e51cba46-23fd-4f5f-819d-c2e0ee77a743","Type":"ContainerStarted","Data":"dff4e9eb63a15ccfc8ecfc4d6adb1c700445462f2440fe2b2a33d9a34bdc7cbf"} Oct 08 22:05:21 crc kubenswrapper[4739]: I1008 22:05:21.388549 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-59578bc799-l66d4" podStartSLOduration=2.457312157 podStartE2EDuration="39.388532132s" podCreationTimestamp="2025-10-08 22:04:42 +0000 UTC" firstStartedPulling="2025-10-08 
22:04:43.73861933 +0000 UTC m=+983.564005080" lastFinishedPulling="2025-10-08 22:05:20.669839305 +0000 UTC m=+1020.495225055" observedRunningTime="2025-10-08 22:05:21.387330082 +0000 UTC m=+1021.212715832" watchObservedRunningTime="2025-10-08 22:05:21.388532132 +0000 UTC m=+1021.213917882" Oct 08 22:05:21 crc kubenswrapper[4739]: I1008 22:05:21.390454 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67" event={"ID":"c165e9bc-4624-4227-8a87-835cbfe8a970","Type":"ContainerStarted","Data":"32b872bc997455a901d687c072f2e9d52d4364364fc83c206b69b22b8a1ae3b7"} Oct 08 22:05:21 crc kubenswrapper[4739]: I1008 22:05:21.401982 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-9ls4w" event={"ID":"58ceaa51-6704-4f1d-8aa0-2053f1c7c89d","Type":"ContainerStarted","Data":"739957191e037e8ff4505bb5bc08bbd0c5bd8037b010314918ed0206a1951a12"} Oct 08 22:05:21 crc kubenswrapper[4739]: I1008 22:05:21.770523 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:05:21 crc kubenswrapper[4739]: I1008 22:05:21.770582 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.286211 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-2c9sc" Oct 08 22:05:22 crc 
kubenswrapper[4739]: I1008 22:05:22.464264 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5dd4499c96-l9vl5" event={"ID":"04c34d21-ba2d-4418-83e2-ba162c64cc1e","Type":"ContainerStarted","Data":"f43b6dcc0860d2acec8b743f33289f51f4c54f1f73eade7033ff2f2f4f6719f0"} Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.464452 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5dd4499c96-l9vl5" Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.467053 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-75d7f5797c-czmc9" event={"ID":"aff0dabb-b21e-4507-8a13-1d391b8c4f52","Type":"ContainerStarted","Data":"81376f39ea762358358a8397088ab3466c641dfcf64d5c1f2e338a07a0cf01f8"} Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.467298 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-75d7f5797c-czmc9" Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.468575 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-fbhjp" event={"ID":"7e483c3d-debb-4f41-a968-0d19d337e771","Type":"ContainerStarted","Data":"f740ceb2c154b374fa5db68d9258df12b6838aa2098d5f75bebac77817ab4946"} Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.468605 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-fbhjp" event={"ID":"7e483c3d-debb-4f41-a968-0d19d337e771","Type":"ContainerStarted","Data":"2952491818c934c701184c7c23706bcda9089f46bb89a6bd689f4e652ea06fa7"} Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.468715 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/neutron-operator-controller-manager-797d478b46-fbhjp" Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.470093 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-qn2t5" event={"ID":"41df7676-d0f5-47a1-a90c-2bc3bc01e18d","Type":"ContainerStarted","Data":"46279044ac11bc74faa65426cd04d7d5259fc6f486048e7e0e265d40b5e86f75"} Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.470283 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-qn2t5" Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.471490 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-wz5dg" event={"ID":"d0a97575-d460-410d-84aa-887e6d809bba","Type":"ContainerStarted","Data":"b1e103266b041889aaec0a1df3cd6debd2f62f586176fc7c492df9dfd4b88e39"} Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.471703 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-wz5dg" Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.473154 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-2sv26" event={"ID":"102e8f33-2000-4a71-a337-4fa304d59e93","Type":"ContainerStarted","Data":"c3aba392ecebcf775b3eda4e0b3a5d19a296c3ff1800d93247025037a2b83972"} Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.473324 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-2sv26" Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.475247 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-9ls4w" 
event={"ID":"58ceaa51-6704-4f1d-8aa0-2053f1c7c89d","Type":"ContainerStarted","Data":"90bfd681a3c6acdf030b64758e1c4036f507f120c11b17a5dda8558795d660b7"} Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.475381 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-9ls4w" Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.476553 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-8d2bb" event={"ID":"23d030c7-6a61-4ba5-9b00-018f7370ea5d","Type":"ContainerStarted","Data":"65025d6c2891b27f0cd049aaf2a5d50a0d63ee1a5fa40833fa94cfe6753e7901"} Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.476651 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-8d2bb" Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.478339 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4msd5" event={"ID":"0c0df1f2-5ae8-40e5-8aa8-893d2e0081cf","Type":"ContainerStarted","Data":"ce1d970a04e5595a9491af676a1c539a2b5ea1c0432e0fc34c22cf24909e20aa"} Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.479863 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-6kp4x" event={"ID":"f038b58d-e69a-481d-a0df-65211386c9da","Type":"ContainerStarted","Data":"ca61a965d8aee542136045ea823e22279502b98518f7c7bec57af1980356978f"} Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.479887 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-6kp4x" event={"ID":"f038b58d-e69a-481d-a0df-65211386c9da","Type":"ContainerStarted","Data":"e727552fade3d102fde9be3b5046d3fa1df35737e39048bf287d780988dd69f2"} Oct 08 22:05:22 crc 
kubenswrapper[4739]: I1008 22:05:22.479991 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-6kp4x" Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.481623 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-74665f6cdc-c6dtc" event={"ID":"18bf5f4d-f183-41c2-b1c9-a965baab8f5d","Type":"ContainerStarted","Data":"e5e9e62a6e83865284cd424bbebbbcc2fb4bd20e00a061f972580669879fe77b"} Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.481760 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-74665f6cdc-c6dtc" Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.484052 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-f7hv4" event={"ID":"c11fdec0-87d4-41db-b5d3-66155b578abe","Type":"ContainerStarted","Data":"5f7140822ecc515d18786e95e53706f4763e9e8ff35fba3aa3a9c28db6be81f3"} Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.484231 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-f7hv4" Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.504580 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-xnfkp" event={"ID":"fae90c53-9891-4664-8767-98bfab1e021a","Type":"ContainerStarted","Data":"4551853372e1129f472e31c0d83ed3c88a03e47f24a9d18fb5dbd9789532b00b"} Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.504887 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-xnfkp" Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.507554 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-xnfkp" Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.507925 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-664664cb68-9vz2h" event={"ID":"e51cba46-23fd-4f5f-819d-c2e0ee77a743","Type":"ContainerStarted","Data":"5491cc070c66be700d126d5221eb355f5b88bc4967c56ac8cf0fe6df092bb9b2"} Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.508388 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-664664cb68-9vz2h" Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.513776 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67" event={"ID":"c165e9bc-4624-4227-8a87-835cbfe8a970","Type":"ContainerStarted","Data":"30f8bd05dbaaec3168ab97a103b570ea4f7511b22fd93767a427339ae9929216"} Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.513837 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67" Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.516068 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-6kp4x" podStartSLOduration=12.589151012 podStartE2EDuration="40.516050777s" podCreationTimestamp="2025-10-08 22:04:42 +0000 UTC" firstStartedPulling="2025-10-08 22:04:43.732258154 +0000 UTC m=+983.557643894" lastFinishedPulling="2025-10-08 22:05:11.659157909 +0000 UTC m=+1011.484543659" observedRunningTime="2025-10-08 22:05:22.51047538 +0000 UTC m=+1022.335861120" watchObservedRunningTime="2025-10-08 22:05:22.516050777 +0000 UTC m=+1022.341436527" Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.519312 4739 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack-operators/watcher-operator-controller-manager-5dd4499c96-l9vl5" podStartSLOduration=3.773083383 podStartE2EDuration="40.519303007s" podCreationTimestamp="2025-10-08 22:04:42 +0000 UTC" firstStartedPulling="2025-10-08 22:04:43.926045651 +0000 UTC m=+983.751431401" lastFinishedPulling="2025-10-08 22:05:20.672265275 +0000 UTC m=+1020.497651025" observedRunningTime="2025-10-08 22:05:22.483938646 +0000 UTC m=+1022.309324406" watchObservedRunningTime="2025-10-08 22:05:22.519303007 +0000 UTC m=+1022.344688757" Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.528209 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-8d2bb" podStartSLOduration=12.557189754 podStartE2EDuration="40.528191396s" podCreationTimestamp="2025-10-08 22:04:42 +0000 UTC" firstStartedPulling="2025-10-08 22:04:43.706345155 +0000 UTC m=+983.531730905" lastFinishedPulling="2025-10-08 22:05:11.677346797 +0000 UTC m=+1011.502732547" observedRunningTime="2025-10-08 22:05:22.523276585 +0000 UTC m=+1022.348662335" watchObservedRunningTime="2025-10-08 22:05:22.528191396 +0000 UTC m=+1022.353577146" Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.544877 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-74665f6cdc-c6dtc" podStartSLOduration=3.510172941 podStartE2EDuration="40.544862837s" podCreationTimestamp="2025-10-08 22:04:42 +0000 UTC" firstStartedPulling="2025-10-08 22:04:43.731389482 +0000 UTC m=+983.556775232" lastFinishedPulling="2025-10-08 22:05:20.766079378 +0000 UTC m=+1020.591465128" observedRunningTime="2025-10-08 22:05:22.543113874 +0000 UTC m=+1022.368499624" watchObservedRunningTime="2025-10-08 22:05:22.544862837 +0000 UTC m=+1022.370248587" Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.561439 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-wz5dg" podStartSLOduration=3.7818869 podStartE2EDuration="40.561423546s" podCreationTimestamp="2025-10-08 22:04:42 +0000 UTC" firstStartedPulling="2025-10-08 22:04:43.892701779 +0000 UTC m=+983.718087529" lastFinishedPulling="2025-10-08 22:05:20.672238375 +0000 UTC m=+1020.497624175" observedRunningTime="2025-10-08 22:05:22.557737374 +0000 UTC m=+1022.383123134" watchObservedRunningTime="2025-10-08 22:05:22.561423546 +0000 UTC m=+1022.386809296" Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.570642 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-9ls4w" podStartSLOduration=12.596311548 podStartE2EDuration="40.570624453s" podCreationTimestamp="2025-10-08 22:04:42 +0000 UTC" firstStartedPulling="2025-10-08 22:04:43.702381487 +0000 UTC m=+983.527767237" lastFinishedPulling="2025-10-08 22:05:11.676694392 +0000 UTC m=+1011.502080142" observedRunningTime="2025-10-08 22:05:22.569570737 +0000 UTC m=+1022.394956487" watchObservedRunningTime="2025-10-08 22:05:22.570624453 +0000 UTC m=+1022.396010203" Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.588228 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-fbhjp" podStartSLOduration=12.634872798 podStartE2EDuration="40.588208556s" podCreationTimestamp="2025-10-08 22:04:42 +0000 UTC" firstStartedPulling="2025-10-08 22:04:43.706486998 +0000 UTC m=+983.531872748" lastFinishedPulling="2025-10-08 22:05:11.659822756 +0000 UTC m=+1011.485208506" observedRunningTime="2025-10-08 22:05:22.585363246 +0000 UTC m=+1022.410748996" watchObservedRunningTime="2025-10-08 22:05:22.588208556 +0000 UTC m=+1022.413594306" Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.604486 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4msd5" podStartSLOduration=3.854301715 podStartE2EDuration="40.604472437s" podCreationTimestamp="2025-10-08 22:04:42 +0000 UTC" firstStartedPulling="2025-10-08 22:04:43.941899071 +0000 UTC m=+983.767284821" lastFinishedPulling="2025-10-08 22:05:20.692069793 +0000 UTC m=+1020.517455543" observedRunningTime="2025-10-08 22:05:22.600711204 +0000 UTC m=+1022.426096954" watchObservedRunningTime="2025-10-08 22:05:22.604472437 +0000 UTC m=+1022.429858177" Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.632772 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-2sv26" podStartSLOduration=3.887977575 podStartE2EDuration="40.632757484s" podCreationTimestamp="2025-10-08 22:04:42 +0000 UTC" firstStartedPulling="2025-10-08 22:04:43.926358938 +0000 UTC m=+983.751744688" lastFinishedPulling="2025-10-08 22:05:20.671138837 +0000 UTC m=+1020.496524597" observedRunningTime="2025-10-08 22:05:22.629899454 +0000 UTC m=+1022.455285214" watchObservedRunningTime="2025-10-08 22:05:22.632757484 +0000 UTC m=+1022.458143234" Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.685179 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-75d7f5797c-czmc9" podStartSLOduration=3.801605326 podStartE2EDuration="40.685162256s" podCreationTimestamp="2025-10-08 22:04:42 +0000 UTC" firstStartedPulling="2025-10-08 22:04:43.926458291 +0000 UTC m=+983.751844051" lastFinishedPulling="2025-10-08 22:05:20.810015231 +0000 UTC m=+1020.635400981" observedRunningTime="2025-10-08 22:05:22.652935451 +0000 UTC m=+1022.478321201" watchObservedRunningTime="2025-10-08 22:05:22.685162256 +0000 UTC m=+1022.510548006" Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.685806 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-qn2t5" podStartSLOduration=3.921695556 podStartE2EDuration="40.685800632s" podCreationTimestamp="2025-10-08 22:04:42 +0000 UTC" firstStartedPulling="2025-10-08 22:04:43.906893158 +0000 UTC m=+983.732278908" lastFinishedPulling="2025-10-08 22:05:20.670998214 +0000 UTC m=+1020.496383984" observedRunningTime="2025-10-08 22:05:22.682593522 +0000 UTC m=+1022.507979272" watchObservedRunningTime="2025-10-08 22:05:22.685800632 +0000 UTC m=+1022.511186382" Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.725994 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67" podStartSLOduration=13.613106063 podStartE2EDuration="40.725975811s" podCreationTimestamp="2025-10-08 22:04:42 +0000 UTC" firstStartedPulling="2025-10-08 22:04:44.54625858 +0000 UTC m=+984.371644330" lastFinishedPulling="2025-10-08 22:05:11.659128328 +0000 UTC m=+1011.484514078" observedRunningTime="2025-10-08 22:05:22.720658501 +0000 UTC m=+1022.546044251" watchObservedRunningTime="2025-10-08 22:05:22.725975811 +0000 UTC m=+1022.551361561" Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.741953 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-f7hv4" podStartSLOduration=3.95994489 podStartE2EDuration="40.741938356s" podCreationTimestamp="2025-10-08 22:04:42 +0000 UTC" firstStartedPulling="2025-10-08 22:04:43.891685804 +0000 UTC m=+983.717071554" lastFinishedPulling="2025-10-08 22:05:20.67367926 +0000 UTC m=+1020.499065020" observedRunningTime="2025-10-08 22:05:22.736536052 +0000 UTC m=+1022.561921802" watchObservedRunningTime="2025-10-08 22:05:22.741938356 +0000 UTC m=+1022.567324106" Oct 08 22:05:22 crc kubenswrapper[4739]: I1008 22:05:22.755073 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-xnfkp" podStartSLOduration=13.816554557 podStartE2EDuration="41.755058949s" podCreationTimestamp="2025-10-08 22:04:41 +0000 UTC" firstStartedPulling="2025-10-08 22:04:43.724950203 +0000 UTC m=+983.550335953" lastFinishedPulling="2025-10-08 22:05:11.663454595 +0000 UTC m=+1011.488840345" observedRunningTime="2025-10-08 22:05:22.750292532 +0000 UTC m=+1022.575678282" watchObservedRunningTime="2025-10-08 22:05:22.755058949 +0000 UTC m=+1022.580444699" Oct 08 22:05:29 crc kubenswrapper[4739]: I1008 22:05:29.577958 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-wqmv7" event={"ID":"8302b913-c934-4911-8c78-72d139019f33","Type":"ContainerStarted","Data":"27f89bd2e658603cab93df169c3d55af39e88852f92e1a88d732fc3916fee524"} Oct 08 22:05:29 crc kubenswrapper[4739]: I1008 22:05:29.580455 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-zk8hf" event={"ID":"5aa8c8d1-9588-4e0f-87e2-b44b072bef76","Type":"ContainerStarted","Data":"08e24fbaab9be2eef3b143df85420796a1de3aa550e0d8320851de5bfe1845ea"} Oct 08 22:05:29 crc kubenswrapper[4739]: I1008 22:05:29.580727 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-zk8hf" Oct 08 22:05:29 crc kubenswrapper[4739]: I1008 22:05:29.582468 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-qhqv6" event={"ID":"17ff0c7d-5595-4d2f-b77d-0f6114746fae","Type":"ContainerStarted","Data":"2de8eea692625bbc63a58dcb977cc0e32a613c6094d56accd975795400f1ee90"} Oct 08 22:05:29 crc kubenswrapper[4739]: I1008 22:05:29.582703 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-qhqv6" Oct 08 
22:05:29 crc kubenswrapper[4739]: I1008 22:05:29.584600 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-7gjhx" event={"ID":"dd2b6037-9b5d-47cb-b057-d33546b8e74c","Type":"ContainerStarted","Data":"e810fde5795900371fc4456483e25f134cb4d0cb0ecde446222acbdd66b2b1da"} Oct 08 22:05:29 crc kubenswrapper[4739]: I1008 22:05:29.584808 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-7gjhx" Oct 08 22:05:29 crc kubenswrapper[4739]: I1008 22:05:29.607863 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-wqmv7" podStartSLOduration=3.47930553 podStartE2EDuration="48.607841681s" podCreationTimestamp="2025-10-08 22:04:41 +0000 UTC" firstStartedPulling="2025-10-08 22:04:43.351141688 +0000 UTC m=+983.176527438" lastFinishedPulling="2025-10-08 22:05:28.479677789 +0000 UTC m=+1028.305063589" observedRunningTime="2025-10-08 22:05:29.600521129 +0000 UTC m=+1029.425906889" watchObservedRunningTime="2025-10-08 22:05:29.607841681 +0000 UTC m=+1029.433227441" Oct 08 22:05:29 crc kubenswrapper[4739]: I1008 22:05:29.607983 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-664664cb68-9vz2h" podStartSLOduration=19.656857001 podStartE2EDuration="47.607978094s" podCreationTimestamp="2025-10-08 22:04:42 +0000 UTC" firstStartedPulling="2025-10-08 22:04:43.724717417 +0000 UTC m=+983.550103167" lastFinishedPulling="2025-10-08 22:05:11.67583851 +0000 UTC m=+1011.501224260" observedRunningTime="2025-10-08 22:05:22.773330199 +0000 UTC m=+1022.598715949" watchObservedRunningTime="2025-10-08 22:05:29.607978094 +0000 UTC m=+1029.433363864" Oct 08 22:05:29 crc kubenswrapper[4739]: I1008 22:05:29.621681 4739 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-7gjhx" podStartSLOduration=2.833067879 podStartE2EDuration="47.6216642s" podCreationTimestamp="2025-10-08 22:04:42 +0000 UTC" firstStartedPulling="2025-10-08 22:04:43.881298558 +0000 UTC m=+983.706684308" lastFinishedPulling="2025-10-08 22:05:28.669894879 +0000 UTC m=+1028.495280629" observedRunningTime="2025-10-08 22:05:29.619108508 +0000 UTC m=+1029.444494268" watchObservedRunningTime="2025-10-08 22:05:29.6216642 +0000 UTC m=+1029.447049960" Oct 08 22:05:29 crc kubenswrapper[4739]: I1008 22:05:29.641088 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-qhqv6" podStartSLOduration=3.832048095 podStartE2EDuration="48.641070839s" podCreationTimestamp="2025-10-08 22:04:41 +0000 UTC" firstStartedPulling="2025-10-08 22:04:43.671313581 +0000 UTC m=+983.496699331" lastFinishedPulling="2025-10-08 22:05:28.480336315 +0000 UTC m=+1028.305722075" observedRunningTime="2025-10-08 22:05:29.635091432 +0000 UTC m=+1029.460477202" watchObservedRunningTime="2025-10-08 22:05:29.641070839 +0000 UTC m=+1029.466456589" Oct 08 22:05:29 crc kubenswrapper[4739]: I1008 22:05:29.662575 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-zk8hf" podStartSLOduration=3.829603325 podStartE2EDuration="48.662555629s" podCreationTimestamp="2025-10-08 22:04:41 +0000 UTC" firstStartedPulling="2025-10-08 22:04:43.647382461 +0000 UTC m=+983.472768211" lastFinishedPulling="2025-10-08 22:05:28.480334725 +0000 UTC m=+1028.305720515" observedRunningTime="2025-10-08 22:05:29.656324295 +0000 UTC m=+1029.481710055" watchObservedRunningTime="2025-10-08 22:05:29.662555629 +0000 UTC m=+1029.487941389" Oct 08 22:05:32 crc kubenswrapper[4739]: I1008 22:05:32.327458 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-wqmv7" Oct 08 22:05:32 crc kubenswrapper[4739]: I1008 22:05:32.472067 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-9ls4w" Oct 08 22:05:32 crc kubenswrapper[4739]: I1008 22:05:32.482859 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-59578bc799-l66d4" Oct 08 22:05:32 crc kubenswrapper[4739]: I1008 22:05:32.657716 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-fbhjp" Oct 08 22:05:32 crc kubenswrapper[4739]: I1008 22:05:32.705363 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-8djd9" Oct 08 22:05:32 crc kubenswrapper[4739]: I1008 22:05:32.761678 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-8d2bb" Oct 08 22:05:32 crc kubenswrapper[4739]: I1008 22:05:32.761745 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-6kp4x" Oct 08 22:05:32 crc kubenswrapper[4739]: I1008 22:05:32.814126 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-wz5dg" Oct 08 22:05:32 crc kubenswrapper[4739]: I1008 22:05:32.835921 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-qn2t5" Oct 08 22:05:32 crc kubenswrapper[4739]: I1008 22:05:32.865576 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-2sv26" Oct 08 22:05:32 crc kubenswrapper[4739]: I1008 22:05:32.869901 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-664664cb68-9vz2h" Oct 08 22:05:32 crc kubenswrapper[4739]: I1008 22:05:32.923761 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-75d7f5797c-czmc9" Oct 08 22:05:33 crc kubenswrapper[4739]: I1008 22:05:33.008282 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-74665f6cdc-c6dtc" Oct 08 22:05:33 crc kubenswrapper[4739]: I1008 22:05:33.048598 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-f7hv4" Oct 08 22:05:33 crc kubenswrapper[4739]: I1008 22:05:33.122397 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5dd4499c96-l9vl5" Oct 08 22:05:34 crc kubenswrapper[4739]: I1008 22:05:34.054554 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67" Oct 08 22:05:42 crc kubenswrapper[4739]: I1008 22:05:42.329996 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-wqmv7" Oct 08 22:05:42 crc kubenswrapper[4739]: I1008 22:05:42.593004 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-zk8hf" Oct 08 22:05:42 crc kubenswrapper[4739]: I1008 22:05:42.601687 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/designate-operator-controller-manager-687df44cdb-qhqv6" Oct 08 22:05:42 crc kubenswrapper[4739]: I1008 22:05:42.888692 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-7gjhx" Oct 08 22:05:51 crc kubenswrapper[4739]: I1008 22:05:51.766915 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:05:51 crc kubenswrapper[4739]: I1008 22:05:51.768228 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:05:51 crc kubenswrapper[4739]: I1008 22:05:51.768300 4739 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" Oct 08 22:05:51 crc kubenswrapper[4739]: I1008 22:05:51.769585 4739 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d72a751240c9e64050ad684bd757cf43d33579885b0db0ae42dad5cf5bb4da84"} pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 22:05:51 crc kubenswrapper[4739]: I1008 22:05:51.769696 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" 
containerID="cri-o://d72a751240c9e64050ad684bd757cf43d33579885b0db0ae42dad5cf5bb4da84" gracePeriod=600 Oct 08 22:05:52 crc kubenswrapper[4739]: I1008 22:05:52.818736 4739 generic.go:334] "Generic (PLEG): container finished" podID="9707b708-016c-4e06-86db-0332e2ca37db" containerID="d72a751240c9e64050ad684bd757cf43d33579885b0db0ae42dad5cf5bb4da84" exitCode=0 Oct 08 22:05:52 crc kubenswrapper[4739]: I1008 22:05:52.818859 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerDied","Data":"d72a751240c9e64050ad684bd757cf43d33579885b0db0ae42dad5cf5bb4da84"} Oct 08 22:05:52 crc kubenswrapper[4739]: I1008 22:05:52.819212 4739 scope.go:117] "RemoveContainer" containerID="c9993989722d5a6736d9a76651861a3541ac4d181be8e64c84d138a4526b99c8" Oct 08 22:05:53 crc kubenswrapper[4739]: I1008 22:05:53.842709 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerStarted","Data":"f263d906c5336884f5cafca08187af555d27f85843b3fe64b88ee6f01fe93ba9"} Oct 08 22:06:04 crc kubenswrapper[4739]: I1008 22:06:04.734166 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-nw242"] Oct 08 22:06:04 crc kubenswrapper[4739]: I1008 22:06:04.736308 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-nw242" Oct 08 22:06:04 crc kubenswrapper[4739]: I1008 22:06:04.741025 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 08 22:06:04 crc kubenswrapper[4739]: I1008 22:06:04.741123 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-nw242"] Oct 08 22:06:04 crc kubenswrapper[4739]: I1008 22:06:04.741336 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 08 22:06:04 crc kubenswrapper[4739]: I1008 22:06:04.744684 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 08 22:06:04 crc kubenswrapper[4739]: I1008 22:06:04.744978 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-lb2mr" Oct 08 22:06:04 crc kubenswrapper[4739]: I1008 22:06:04.799016 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lrvj7"] Oct 08 22:06:04 crc kubenswrapper[4739]: I1008 22:06:04.800841 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-lrvj7" Oct 08 22:06:04 crc kubenswrapper[4739]: I1008 22:06:04.803565 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 08 22:06:04 crc kubenswrapper[4739]: I1008 22:06:04.810776 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lrvj7"] Oct 08 22:06:04 crc kubenswrapper[4739]: I1008 22:06:04.894379 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd9d9636-02eb-4252-a3d2-0d4e4ecea80d-config\") pod \"dnsmasq-dns-675f4bcbfc-nw242\" (UID: \"dd9d9636-02eb-4252-a3d2-0d4e4ecea80d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-nw242" Oct 08 22:06:04 crc kubenswrapper[4739]: I1008 22:06:04.894478 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66lt6\" (UniqueName: \"kubernetes.io/projected/dd9d9636-02eb-4252-a3d2-0d4e4ecea80d-kube-api-access-66lt6\") pod \"dnsmasq-dns-675f4bcbfc-nw242\" (UID: \"dd9d9636-02eb-4252-a3d2-0d4e4ecea80d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-nw242" Oct 08 22:06:04 crc kubenswrapper[4739]: I1008 22:06:04.894532 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f18cfb5c-4d80-494f-aeed-a923bf9b49f3-config\") pod \"dnsmasq-dns-78dd6ddcc-lrvj7\" (UID: \"f18cfb5c-4d80-494f-aeed-a923bf9b49f3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lrvj7" Oct 08 22:06:04 crc kubenswrapper[4739]: I1008 22:06:04.894586 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqtxs\" (UniqueName: \"kubernetes.io/projected/f18cfb5c-4d80-494f-aeed-a923bf9b49f3-kube-api-access-zqtxs\") pod \"dnsmasq-dns-78dd6ddcc-lrvj7\" (UID: \"f18cfb5c-4d80-494f-aeed-a923bf9b49f3\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-lrvj7" Oct 08 22:06:04 crc kubenswrapper[4739]: I1008 22:06:04.894661 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f18cfb5c-4d80-494f-aeed-a923bf9b49f3-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-lrvj7\" (UID: \"f18cfb5c-4d80-494f-aeed-a923bf9b49f3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lrvj7" Oct 08 22:06:04 crc kubenswrapper[4739]: I1008 22:06:04.995852 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66lt6\" (UniqueName: \"kubernetes.io/projected/dd9d9636-02eb-4252-a3d2-0d4e4ecea80d-kube-api-access-66lt6\") pod \"dnsmasq-dns-675f4bcbfc-nw242\" (UID: \"dd9d9636-02eb-4252-a3d2-0d4e4ecea80d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-nw242" Oct 08 22:06:04 crc kubenswrapper[4739]: I1008 22:06:04.995915 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f18cfb5c-4d80-494f-aeed-a923bf9b49f3-config\") pod \"dnsmasq-dns-78dd6ddcc-lrvj7\" (UID: \"f18cfb5c-4d80-494f-aeed-a923bf9b49f3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lrvj7" Oct 08 22:06:04 crc kubenswrapper[4739]: I1008 22:06:04.995962 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqtxs\" (UniqueName: \"kubernetes.io/projected/f18cfb5c-4d80-494f-aeed-a923bf9b49f3-kube-api-access-zqtxs\") pod \"dnsmasq-dns-78dd6ddcc-lrvj7\" (UID: \"f18cfb5c-4d80-494f-aeed-a923bf9b49f3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lrvj7" Oct 08 22:06:04 crc kubenswrapper[4739]: I1008 22:06:04.996004 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f18cfb5c-4d80-494f-aeed-a923bf9b49f3-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-lrvj7\" (UID: \"f18cfb5c-4d80-494f-aeed-a923bf9b49f3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lrvj7" Oct 08 22:06:04 
crc kubenswrapper[4739]: I1008 22:06:04.996207 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd9d9636-02eb-4252-a3d2-0d4e4ecea80d-config\") pod \"dnsmasq-dns-675f4bcbfc-nw242\" (UID: \"dd9d9636-02eb-4252-a3d2-0d4e4ecea80d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-nw242" Oct 08 22:06:04 crc kubenswrapper[4739]: I1008 22:06:04.997102 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f18cfb5c-4d80-494f-aeed-a923bf9b49f3-config\") pod \"dnsmasq-dns-78dd6ddcc-lrvj7\" (UID: \"f18cfb5c-4d80-494f-aeed-a923bf9b49f3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lrvj7" Oct 08 22:06:04 crc kubenswrapper[4739]: I1008 22:06:04.997258 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f18cfb5c-4d80-494f-aeed-a923bf9b49f3-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-lrvj7\" (UID: \"f18cfb5c-4d80-494f-aeed-a923bf9b49f3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lrvj7" Oct 08 22:06:04 crc kubenswrapper[4739]: I1008 22:06:04.997321 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd9d9636-02eb-4252-a3d2-0d4e4ecea80d-config\") pod \"dnsmasq-dns-675f4bcbfc-nw242\" (UID: \"dd9d9636-02eb-4252-a3d2-0d4e4ecea80d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-nw242" Oct 08 22:06:05 crc kubenswrapper[4739]: I1008 22:06:05.027478 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqtxs\" (UniqueName: \"kubernetes.io/projected/f18cfb5c-4d80-494f-aeed-a923bf9b49f3-kube-api-access-zqtxs\") pod \"dnsmasq-dns-78dd6ddcc-lrvj7\" (UID: \"f18cfb5c-4d80-494f-aeed-a923bf9b49f3\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lrvj7" Oct 08 22:06:05 crc kubenswrapper[4739]: I1008 22:06:05.029773 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-66lt6\" (UniqueName: \"kubernetes.io/projected/dd9d9636-02eb-4252-a3d2-0d4e4ecea80d-kube-api-access-66lt6\") pod \"dnsmasq-dns-675f4bcbfc-nw242\" (UID: \"dd9d9636-02eb-4252-a3d2-0d4e4ecea80d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-nw242" Oct 08 22:06:05 crc kubenswrapper[4739]: I1008 22:06:05.053330 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-nw242" Oct 08 22:06:05 crc kubenswrapper[4739]: I1008 22:06:05.162699 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-lrvj7" Oct 08 22:06:05 crc kubenswrapper[4739]: I1008 22:06:05.483845 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-nw242"] Oct 08 22:06:05 crc kubenswrapper[4739]: W1008 22:06:05.487506 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd9d9636_02eb_4252_a3d2_0d4e4ecea80d.slice/crio-e5533674163f53981182322c07e80641ea7230b37301a67c9deadc5604ae4100 WatchSource:0}: Error finding container e5533674163f53981182322c07e80641ea7230b37301a67c9deadc5604ae4100: Status 404 returned error can't find the container with id e5533674163f53981182322c07e80641ea7230b37301a67c9deadc5604ae4100 Oct 08 22:06:05 crc kubenswrapper[4739]: I1008 22:06:05.597717 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lrvj7"] Oct 08 22:06:05 crc kubenswrapper[4739]: W1008 22:06:05.602086 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf18cfb5c_4d80_494f_aeed_a923bf9b49f3.slice/crio-130d4d8e3728dc1a0706972815ae15739be46cc79b27bc5b38f66c9b8ed60c36 WatchSource:0}: Error finding container 130d4d8e3728dc1a0706972815ae15739be46cc79b27bc5b38f66c9b8ed60c36: Status 404 returned error can't find the container with id 
130d4d8e3728dc1a0706972815ae15739be46cc79b27bc5b38f66c9b8ed60c36 Oct 08 22:06:05 crc kubenswrapper[4739]: I1008 22:06:05.940695 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-nw242" event={"ID":"dd9d9636-02eb-4252-a3d2-0d4e4ecea80d","Type":"ContainerStarted","Data":"e5533674163f53981182322c07e80641ea7230b37301a67c9deadc5604ae4100"} Oct 08 22:06:05 crc kubenswrapper[4739]: I1008 22:06:05.942331 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-lrvj7" event={"ID":"f18cfb5c-4d80-494f-aeed-a923bf9b49f3","Type":"ContainerStarted","Data":"130d4d8e3728dc1a0706972815ae15739be46cc79b27bc5b38f66c9b8ed60c36"} Oct 08 22:06:07 crc kubenswrapper[4739]: I1008 22:06:07.875914 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-nw242"] Oct 08 22:06:07 crc kubenswrapper[4739]: I1008 22:06:07.922333 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-p5rwg"] Oct 08 22:06:07 crc kubenswrapper[4739]: I1008 22:06:07.923522 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-p5rwg" Oct 08 22:06:07 crc kubenswrapper[4739]: I1008 22:06:07.943854 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-p5rwg"] Oct 08 22:06:08 crc kubenswrapper[4739]: I1008 22:06:08.042455 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de90f9c1-7173-4878-b0c5-6b8734353119-config\") pod \"dnsmasq-dns-666b6646f7-p5rwg\" (UID: \"de90f9c1-7173-4878-b0c5-6b8734353119\") " pod="openstack/dnsmasq-dns-666b6646f7-p5rwg" Oct 08 22:06:08 crc kubenswrapper[4739]: I1008 22:06:08.042495 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de90f9c1-7173-4878-b0c5-6b8734353119-dns-svc\") pod \"dnsmasq-dns-666b6646f7-p5rwg\" (UID: \"de90f9c1-7173-4878-b0c5-6b8734353119\") " pod="openstack/dnsmasq-dns-666b6646f7-p5rwg" Oct 08 22:06:08 crc kubenswrapper[4739]: I1008 22:06:08.042558 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs8th\" (UniqueName: \"kubernetes.io/projected/de90f9c1-7173-4878-b0c5-6b8734353119-kube-api-access-hs8th\") pod \"dnsmasq-dns-666b6646f7-p5rwg\" (UID: \"de90f9c1-7173-4878-b0c5-6b8734353119\") " pod="openstack/dnsmasq-dns-666b6646f7-p5rwg" Oct 08 22:06:08 crc kubenswrapper[4739]: I1008 22:06:08.143432 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de90f9c1-7173-4878-b0c5-6b8734353119-config\") pod \"dnsmasq-dns-666b6646f7-p5rwg\" (UID: \"de90f9c1-7173-4878-b0c5-6b8734353119\") " pod="openstack/dnsmasq-dns-666b6646f7-p5rwg" Oct 08 22:06:08 crc kubenswrapper[4739]: I1008 22:06:08.143472 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/de90f9c1-7173-4878-b0c5-6b8734353119-dns-svc\") pod \"dnsmasq-dns-666b6646f7-p5rwg\" (UID: \"de90f9c1-7173-4878-b0c5-6b8734353119\") " pod="openstack/dnsmasq-dns-666b6646f7-p5rwg" Oct 08 22:06:08 crc kubenswrapper[4739]: I1008 22:06:08.143524 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs8th\" (UniqueName: \"kubernetes.io/projected/de90f9c1-7173-4878-b0c5-6b8734353119-kube-api-access-hs8th\") pod \"dnsmasq-dns-666b6646f7-p5rwg\" (UID: \"de90f9c1-7173-4878-b0c5-6b8734353119\") " pod="openstack/dnsmasq-dns-666b6646f7-p5rwg" Oct 08 22:06:08 crc kubenswrapper[4739]: I1008 22:06:08.144443 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de90f9c1-7173-4878-b0c5-6b8734353119-config\") pod \"dnsmasq-dns-666b6646f7-p5rwg\" (UID: \"de90f9c1-7173-4878-b0c5-6b8734353119\") " pod="openstack/dnsmasq-dns-666b6646f7-p5rwg" Oct 08 22:06:08 crc kubenswrapper[4739]: I1008 22:06:08.144482 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de90f9c1-7173-4878-b0c5-6b8734353119-dns-svc\") pod \"dnsmasq-dns-666b6646f7-p5rwg\" (UID: \"de90f9c1-7173-4878-b0c5-6b8734353119\") " pod="openstack/dnsmasq-dns-666b6646f7-p5rwg" Oct 08 22:06:08 crc kubenswrapper[4739]: I1008 22:06:08.180237 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs8th\" (UniqueName: \"kubernetes.io/projected/de90f9c1-7173-4878-b0c5-6b8734353119-kube-api-access-hs8th\") pod \"dnsmasq-dns-666b6646f7-p5rwg\" (UID: \"de90f9c1-7173-4878-b0c5-6b8734353119\") " pod="openstack/dnsmasq-dns-666b6646f7-p5rwg" Oct 08 22:06:08 crc kubenswrapper[4739]: I1008 22:06:08.213876 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lrvj7"] Oct 08 22:06:08 crc kubenswrapper[4739]: I1008 22:06:08.237224 4739 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-tknt8"] Oct 08 22:06:08 crc kubenswrapper[4739]: I1008 22:06:08.239719 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-tknt8" Oct 08 22:06:08 crc kubenswrapper[4739]: I1008 22:06:08.252205 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-tknt8"] Oct 08 22:06:08 crc kubenswrapper[4739]: I1008 22:06:08.252893 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-p5rwg" Oct 08 22:06:08 crc kubenswrapper[4739]: I1008 22:06:08.347167 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4bkg\" (UniqueName: \"kubernetes.io/projected/56d84fd1-be79-439a-a63e-349430395229-kube-api-access-m4bkg\") pod \"dnsmasq-dns-57d769cc4f-tknt8\" (UID: \"56d84fd1-be79-439a-a63e-349430395229\") " pod="openstack/dnsmasq-dns-57d769cc4f-tknt8" Oct 08 22:06:08 crc kubenswrapper[4739]: I1008 22:06:08.347237 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56d84fd1-be79-439a-a63e-349430395229-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-tknt8\" (UID: \"56d84fd1-be79-439a-a63e-349430395229\") " pod="openstack/dnsmasq-dns-57d769cc4f-tknt8" Oct 08 22:06:08 crc kubenswrapper[4739]: I1008 22:06:08.347274 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56d84fd1-be79-439a-a63e-349430395229-config\") pod \"dnsmasq-dns-57d769cc4f-tknt8\" (UID: \"56d84fd1-be79-439a-a63e-349430395229\") " pod="openstack/dnsmasq-dns-57d769cc4f-tknt8" Oct 08 22:06:08 crc kubenswrapper[4739]: I1008 22:06:08.454861 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4bkg\" (UniqueName: 
\"kubernetes.io/projected/56d84fd1-be79-439a-a63e-349430395229-kube-api-access-m4bkg\") pod \"dnsmasq-dns-57d769cc4f-tknt8\" (UID: \"56d84fd1-be79-439a-a63e-349430395229\") " pod="openstack/dnsmasq-dns-57d769cc4f-tknt8" Oct 08 22:06:08 crc kubenswrapper[4739]: I1008 22:06:08.455164 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56d84fd1-be79-439a-a63e-349430395229-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-tknt8\" (UID: \"56d84fd1-be79-439a-a63e-349430395229\") " pod="openstack/dnsmasq-dns-57d769cc4f-tknt8" Oct 08 22:06:08 crc kubenswrapper[4739]: I1008 22:06:08.455195 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56d84fd1-be79-439a-a63e-349430395229-config\") pod \"dnsmasq-dns-57d769cc4f-tknt8\" (UID: \"56d84fd1-be79-439a-a63e-349430395229\") " pod="openstack/dnsmasq-dns-57d769cc4f-tknt8" Oct 08 22:06:08 crc kubenswrapper[4739]: I1008 22:06:08.456036 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56d84fd1-be79-439a-a63e-349430395229-config\") pod \"dnsmasq-dns-57d769cc4f-tknt8\" (UID: \"56d84fd1-be79-439a-a63e-349430395229\") " pod="openstack/dnsmasq-dns-57d769cc4f-tknt8" Oct 08 22:06:08 crc kubenswrapper[4739]: I1008 22:06:08.464337 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56d84fd1-be79-439a-a63e-349430395229-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-tknt8\" (UID: \"56d84fd1-be79-439a-a63e-349430395229\") " pod="openstack/dnsmasq-dns-57d769cc4f-tknt8" Oct 08 22:06:08 crc kubenswrapper[4739]: I1008 22:06:08.498778 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4bkg\" (UniqueName: \"kubernetes.io/projected/56d84fd1-be79-439a-a63e-349430395229-kube-api-access-m4bkg\") pod \"dnsmasq-dns-57d769cc4f-tknt8\" 
(UID: \"56d84fd1-be79-439a-a63e-349430395229\") " pod="openstack/dnsmasq-dns-57d769cc4f-tknt8" Oct 08 22:06:08 crc kubenswrapper[4739]: I1008 22:06:08.573084 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-tknt8" Oct 08 22:06:08 crc kubenswrapper[4739]: I1008 22:06:08.809396 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-p5rwg"] Oct 08 22:06:08 crc kubenswrapper[4739]: W1008 22:06:08.830277 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde90f9c1_7173_4878_b0c5_6b8734353119.slice/crio-8eb0afdbd469cc4329747b6116003ed7593943692706cdaf7b4efaa38d9126da WatchSource:0}: Error finding container 8eb0afdbd469cc4329747b6116003ed7593943692706cdaf7b4efaa38d9126da: Status 404 returned error can't find the container with id 8eb0afdbd469cc4329747b6116003ed7593943692706cdaf7b4efaa38d9126da Oct 08 22:06:08 crc kubenswrapper[4739]: I1008 22:06:08.857677 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-tknt8"] Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.005643 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-tknt8" event={"ID":"56d84fd1-be79-439a-a63e-349430395229","Type":"ContainerStarted","Data":"c4756d0264b1c2a2d903a746ad335a2379c72ed3204f7c6225232bb69d482c14"} Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.006491 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-p5rwg" event={"ID":"de90f9c1-7173-4878-b0c5-6b8734353119","Type":"ContainerStarted","Data":"8eb0afdbd469cc4329747b6116003ed7593943692706cdaf7b4efaa38d9126da"} Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.089104 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.090273 4739 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.093580 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.094490 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-9w9vm" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.094496 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.097494 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.097494 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.097556 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.097679 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.100902 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.274930 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2909f95b-c276-43d0-93c0-18a78dbb974f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.275300 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/2909f95b-c276-43d0-93c0-18a78dbb974f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.275323 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2909f95b-c276-43d0-93c0-18a78dbb974f-config-data\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.275347 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2909f95b-c276-43d0-93c0-18a78dbb974f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.275372 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2909f95b-c276-43d0-93c0-18a78dbb974f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.275392 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2909f95b-c276-43d0-93c0-18a78dbb974f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.275427 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/2909f95b-c276-43d0-93c0-18a78dbb974f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.275464 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.275487 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bfk9\" (UniqueName: \"kubernetes.io/projected/2909f95b-c276-43d0-93c0-18a78dbb974f-kube-api-access-6bfk9\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.275521 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2909f95b-c276-43d0-93c0-18a78dbb974f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.275543 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2909f95b-c276-43d0-93c0-18a78dbb974f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.376551 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2909f95b-c276-43d0-93c0-18a78dbb974f-pod-info\") pod 
\"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.376615 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2909f95b-c276-43d0-93c0-18a78dbb974f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.376652 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2909f95b-c276-43d0-93c0-18a78dbb974f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.376677 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2909f95b-c276-43d0-93c0-18a78dbb974f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.376702 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2909f95b-c276-43d0-93c0-18a78dbb974f-config-data\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.376731 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2909f95b-c276-43d0-93c0-18a78dbb974f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: 
I1008 22:06:09.376758 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2909f95b-c276-43d0-93c0-18a78dbb974f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.376781 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2909f95b-c276-43d0-93c0-18a78dbb974f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.376826 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2909f95b-c276-43d0-93c0-18a78dbb974f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.376870 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.376900 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bfk9\" (UniqueName: \"kubernetes.io/projected/2909f95b-c276-43d0-93c0-18a78dbb974f-kube-api-access-6bfk9\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.377996 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.378340 
4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2909f95b-c276-43d0-93c0-18a78dbb974f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.381109 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.393959 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.394717 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.394932 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.395088 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-97fhr" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.395320 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.395534 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.395805 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.396012 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.397171 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2909f95b-c276-43d0-93c0-18a78dbb974f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.397707 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2909f95b-c276-43d0-93c0-18a78dbb974f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.398054 4739 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.400233 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2909f95b-c276-43d0-93c0-18a78dbb974f-config-data\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.400782 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2909f95b-c276-43d0-93c0-18a78dbb974f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.400794 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2909f95b-c276-43d0-93c0-18a78dbb974f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.401661 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2909f95b-c276-43d0-93c0-18a78dbb974f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.402101 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bfk9\" (UniqueName: \"kubernetes.io/projected/2909f95b-c276-43d0-93c0-18a78dbb974f-kube-api-access-6bfk9\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.402941 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2909f95b-c276-43d0-93c0-18a78dbb974f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.403772 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2909f95b-c276-43d0-93c0-18a78dbb974f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.422110 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.478668 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/17a6aba1-44fd-4b83-95b2-002a60e2291b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.478842 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/17a6aba1-44fd-4b83-95b2-002a60e2291b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.479001 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/17a6aba1-44fd-4b83-95b2-002a60e2291b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.479224 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/17a6aba1-44fd-4b83-95b2-002a60e2291b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.479261 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/17a6aba1-44fd-4b83-95b2-002a60e2291b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.479280 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17a6aba1-44fd-4b83-95b2-002a60e2291b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.479302 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slb5l\" (UniqueName: \"kubernetes.io/projected/17a6aba1-44fd-4b83-95b2-002a60e2291b-kube-api-access-slb5l\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.479401 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/17a6aba1-44fd-4b83-95b2-002a60e2291b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.479424 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/17a6aba1-44fd-4b83-95b2-002a60e2291b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.479472 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.479496 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/17a6aba1-44fd-4b83-95b2-002a60e2291b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.598661 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.598759 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/17a6aba1-44fd-4b83-95b2-002a60e2291b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.598800 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/17a6aba1-44fd-4b83-95b2-002a60e2291b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.598883 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/17a6aba1-44fd-4b83-95b2-002a60e2291b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.598947 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/17a6aba1-44fd-4b83-95b2-002a60e2291b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.598981 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/17a6aba1-44fd-4b83-95b2-002a60e2291b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.599021 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/17a6aba1-44fd-4b83-95b2-002a60e2291b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.599057 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17a6aba1-44fd-4b83-95b2-002a60e2291b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.599090 4739 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.599393 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slb5l\" (UniqueName: \"kubernetes.io/projected/17a6aba1-44fd-4b83-95b2-002a60e2291b-kube-api-access-slb5l\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.599437 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/17a6aba1-44fd-4b83-95b2-002a60e2291b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.599467 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/17a6aba1-44fd-4b83-95b2-002a60e2291b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.599748 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/17a6aba1-44fd-4b83-95b2-002a60e2291b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.600810 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/17a6aba1-44fd-4b83-95b2-002a60e2291b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.600890 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17a6aba1-44fd-4b83-95b2-002a60e2291b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" 
Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.601026 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/17a6aba1-44fd-4b83-95b2-002a60e2291b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.601090 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/17a6aba1-44fd-4b83-95b2-002a60e2291b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.604199 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/17a6aba1-44fd-4b83-95b2-002a60e2291b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.605441 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/17a6aba1-44fd-4b83-95b2-002a60e2291b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.609868 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/17a6aba1-44fd-4b83-95b2-002a60e2291b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.613081 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/17a6aba1-44fd-4b83-95b2-002a60e2291b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.615878 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slb5l\" (UniqueName: \"kubernetes.io/projected/17a6aba1-44fd-4b83-95b2-002a60e2291b-kube-api-access-slb5l\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.627337 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.720932 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 22:06:09 crc kubenswrapper[4739]: I1008 22:06:09.783982 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.432003 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.434176 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.437263 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.438857 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.438973 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-77b7q" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.440258 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.441224 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.443787 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.444177 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.530523 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3c6fc5d3-c48a-4d83-97f8-38d56264d769-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3c6fc5d3-c48a-4d83-97f8-38d56264d769\") " pod="openstack/openstack-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.530577 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4fvt\" (UniqueName: \"kubernetes.io/projected/3c6fc5d3-c48a-4d83-97f8-38d56264d769-kube-api-access-w4fvt\") pod \"openstack-galera-0\" (UID: \"3c6fc5d3-c48a-4d83-97f8-38d56264d769\") " 
pod="openstack/openstack-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.530597 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3c6fc5d3-c48a-4d83-97f8-38d56264d769-kolla-config\") pod \"openstack-galera-0\" (UID: \"3c6fc5d3-c48a-4d83-97f8-38d56264d769\") " pod="openstack/openstack-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.530613 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c6fc5d3-c48a-4d83-97f8-38d56264d769-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3c6fc5d3-c48a-4d83-97f8-38d56264d769\") " pod="openstack/openstack-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.530714 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"3c6fc5d3-c48a-4d83-97f8-38d56264d769\") " pod="openstack/openstack-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.530912 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c6fc5d3-c48a-4d83-97f8-38d56264d769-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3c6fc5d3-c48a-4d83-97f8-38d56264d769\") " pod="openstack/openstack-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.530934 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c6fc5d3-c48a-4d83-97f8-38d56264d769-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3c6fc5d3-c48a-4d83-97f8-38d56264d769\") " pod="openstack/openstack-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: 
I1008 22:06:11.530955 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/3c6fc5d3-c48a-4d83-97f8-38d56264d769-secrets\") pod \"openstack-galera-0\" (UID: \"3c6fc5d3-c48a-4d83-97f8-38d56264d769\") " pod="openstack/openstack-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.530979 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3c6fc5d3-c48a-4d83-97f8-38d56264d769-config-data-default\") pod \"openstack-galera-0\" (UID: \"3c6fc5d3-c48a-4d83-97f8-38d56264d769\") " pod="openstack/openstack-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.632593 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3c6fc5d3-c48a-4d83-97f8-38d56264d769-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3c6fc5d3-c48a-4d83-97f8-38d56264d769\") " pod="openstack/openstack-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.632631 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4fvt\" (UniqueName: \"kubernetes.io/projected/3c6fc5d3-c48a-4d83-97f8-38d56264d769-kube-api-access-w4fvt\") pod \"openstack-galera-0\" (UID: \"3c6fc5d3-c48a-4d83-97f8-38d56264d769\") " pod="openstack/openstack-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.632654 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3c6fc5d3-c48a-4d83-97f8-38d56264d769-kolla-config\") pod \"openstack-galera-0\" (UID: \"3c6fc5d3-c48a-4d83-97f8-38d56264d769\") " pod="openstack/openstack-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.632670 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c6fc5d3-c48a-4d83-97f8-38d56264d769-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3c6fc5d3-c48a-4d83-97f8-38d56264d769\") " pod="openstack/openstack-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.632691 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"3c6fc5d3-c48a-4d83-97f8-38d56264d769\") " pod="openstack/openstack-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.632725 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c6fc5d3-c48a-4d83-97f8-38d56264d769-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3c6fc5d3-c48a-4d83-97f8-38d56264d769\") " pod="openstack/openstack-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.632744 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c6fc5d3-c48a-4d83-97f8-38d56264d769-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3c6fc5d3-c48a-4d83-97f8-38d56264d769\") " pod="openstack/openstack-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.632765 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/3c6fc5d3-c48a-4d83-97f8-38d56264d769-secrets\") pod \"openstack-galera-0\" (UID: \"3c6fc5d3-c48a-4d83-97f8-38d56264d769\") " pod="openstack/openstack-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.632823 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3c6fc5d3-c48a-4d83-97f8-38d56264d769-config-data-default\") pod \"openstack-galera-0\" (UID: 
\"3c6fc5d3-c48a-4d83-97f8-38d56264d769\") " pod="openstack/openstack-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.633355 4739 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"3c6fc5d3-c48a-4d83-97f8-38d56264d769\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.633903 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3c6fc5d3-c48a-4d83-97f8-38d56264d769-kolla-config\") pod \"openstack-galera-0\" (UID: \"3c6fc5d3-c48a-4d83-97f8-38d56264d769\") " pod="openstack/openstack-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.634288 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3c6fc5d3-c48a-4d83-97f8-38d56264d769-config-data-default\") pod \"openstack-galera-0\" (UID: \"3c6fc5d3-c48a-4d83-97f8-38d56264d769\") " pod="openstack/openstack-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.634296 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3c6fc5d3-c48a-4d83-97f8-38d56264d769-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3c6fc5d3-c48a-4d83-97f8-38d56264d769\") " pod="openstack/openstack-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.635320 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c6fc5d3-c48a-4d83-97f8-38d56264d769-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3c6fc5d3-c48a-4d83-97f8-38d56264d769\") " pod="openstack/openstack-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.639722 4739 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c6fc5d3-c48a-4d83-97f8-38d56264d769-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3c6fc5d3-c48a-4d83-97f8-38d56264d769\") " pod="openstack/openstack-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.650975 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4fvt\" (UniqueName: \"kubernetes.io/projected/3c6fc5d3-c48a-4d83-97f8-38d56264d769-kube-api-access-w4fvt\") pod \"openstack-galera-0\" (UID: \"3c6fc5d3-c48a-4d83-97f8-38d56264d769\") " pod="openstack/openstack-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.651020 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c6fc5d3-c48a-4d83-97f8-38d56264d769-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3c6fc5d3-c48a-4d83-97f8-38d56264d769\") " pod="openstack/openstack-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.652116 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/3c6fc5d3-c48a-4d83-97f8-38d56264d769-secrets\") pod \"openstack-galera-0\" (UID: \"3c6fc5d3-c48a-4d83-97f8-38d56264d769\") " pod="openstack/openstack-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.669242 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"3c6fc5d3-c48a-4d83-97f8-38d56264d769\") " pod="openstack/openstack-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.755583 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.840378 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.842925 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.845352 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.845703 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-4f64g" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.845897 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.846079 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.857785 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.937049 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e2532902-2058-4c79-b612-fd2737190f3e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.937118 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e2532902-2058-4c79-b612-fd2737190f3e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: 
\"e2532902-2058-4c79-b612-fd2737190f3e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.937169 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2532902-2058-4c79-b612-fd2737190f3e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e2532902-2058-4c79-b612-fd2737190f3e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.937213 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws7hb\" (UniqueName: \"kubernetes.io/projected/e2532902-2058-4c79-b612-fd2737190f3e-kube-api-access-ws7hb\") pod \"openstack-cell1-galera-0\" (UID: \"e2532902-2058-4c79-b612-fd2737190f3e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.937255 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e2532902-2058-4c79-b612-fd2737190f3e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e2532902-2058-4c79-b612-fd2737190f3e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.937277 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2532902-2058-4c79-b612-fd2737190f3e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e2532902-2058-4c79-b612-fd2737190f3e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.937299 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e2532902-2058-4c79-b612-fd2737190f3e-kolla-config\") pod 
\"openstack-cell1-galera-0\" (UID: \"e2532902-2058-4c79-b612-fd2737190f3e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.937335 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e2532902-2058-4c79-b612-fd2737190f3e-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"e2532902-2058-4c79-b612-fd2737190f3e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:11 crc kubenswrapper[4739]: I1008 22:06:11.937376 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2532902-2058-4c79-b612-fd2737190f3e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e2532902-2058-4c79-b612-fd2737190f3e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.039110 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2532902-2058-4c79-b612-fd2737190f3e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e2532902-2058-4c79-b612-fd2737190f3e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.039170 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e2532902-2058-4c79-b612-fd2737190f3e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e2532902-2058-4c79-b612-fd2737190f3e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.039207 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e2532902-2058-4c79-b612-fd2737190f3e-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"e2532902-2058-4c79-b612-fd2737190f3e\") " 
pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.039239 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2532902-2058-4c79-b612-fd2737190f3e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e2532902-2058-4c79-b612-fd2737190f3e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.039279 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e2532902-2058-4c79-b612-fd2737190f3e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.039309 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e2532902-2058-4c79-b612-fd2737190f3e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e2532902-2058-4c79-b612-fd2737190f3e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.039331 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2532902-2058-4c79-b612-fd2737190f3e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e2532902-2058-4c79-b612-fd2737190f3e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.039363 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws7hb\" (UniqueName: \"kubernetes.io/projected/e2532902-2058-4c79-b612-fd2737190f3e-kube-api-access-ws7hb\") pod \"openstack-cell1-galera-0\" (UID: \"e2532902-2058-4c79-b612-fd2737190f3e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:12 crc kubenswrapper[4739]: 
I1008 22:06:12.039393 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e2532902-2058-4c79-b612-fd2737190f3e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e2532902-2058-4c79-b612-fd2737190f3e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.039607 4739 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e2532902-2058-4c79-b612-fd2737190f3e\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.039814 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e2532902-2058-4c79-b612-fd2737190f3e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e2532902-2058-4c79-b612-fd2737190f3e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.040514 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2532902-2058-4c79-b612-fd2737190f3e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e2532902-2058-4c79-b612-fd2737190f3e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.040547 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e2532902-2058-4c79-b612-fd2737190f3e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e2532902-2058-4c79-b612-fd2737190f3e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.040915 4739 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e2532902-2058-4c79-b612-fd2737190f3e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e2532902-2058-4c79-b612-fd2737190f3e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.042915 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e2532902-2058-4c79-b612-fd2737190f3e-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"e2532902-2058-4c79-b612-fd2737190f3e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.044556 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2532902-2058-4c79-b612-fd2737190f3e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e2532902-2058-4c79-b612-fd2737190f3e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.044790 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2532902-2058-4c79-b612-fd2737190f3e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e2532902-2058-4c79-b612-fd2737190f3e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.057063 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws7hb\" (UniqueName: \"kubernetes.io/projected/e2532902-2058-4c79-b612-fd2737190f3e-kube-api-access-ws7hb\") pod \"openstack-cell1-galera-0\" (UID: \"e2532902-2058-4c79-b612-fd2737190f3e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.064575 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod 
\"openstack-cell1-galera-0\" (UID: \"e2532902-2058-4c79-b612-fd2737190f3e\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.168938 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.475402 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.477068 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.490126 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-fc8q4" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.490317 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.490524 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.527663 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.545820 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b6da1726-555b-4905-b565-611392fb8e67-kolla-config\") pod \"memcached-0\" (UID: \"b6da1726-555b-4905-b565-611392fb8e67\") " pod="openstack/memcached-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.545892 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6da1726-555b-4905-b565-611392fb8e67-memcached-tls-certs\") pod \"memcached-0\" (UID: \"b6da1726-555b-4905-b565-611392fb8e67\") " 
pod="openstack/memcached-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.545917 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b6da1726-555b-4905-b565-611392fb8e67-config-data\") pod \"memcached-0\" (UID: \"b6da1726-555b-4905-b565-611392fb8e67\") " pod="openstack/memcached-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.545938 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdntx\" (UniqueName: \"kubernetes.io/projected/b6da1726-555b-4905-b565-611392fb8e67-kube-api-access-zdntx\") pod \"memcached-0\" (UID: \"b6da1726-555b-4905-b565-611392fb8e67\") " pod="openstack/memcached-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.545984 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6da1726-555b-4905-b565-611392fb8e67-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b6da1726-555b-4905-b565-611392fb8e67\") " pod="openstack/memcached-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.647235 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6da1726-555b-4905-b565-611392fb8e67-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b6da1726-555b-4905-b565-611392fb8e67\") " pod="openstack/memcached-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.647306 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b6da1726-555b-4905-b565-611392fb8e67-kolla-config\") pod \"memcached-0\" (UID: \"b6da1726-555b-4905-b565-611392fb8e67\") " pod="openstack/memcached-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.647352 4739 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6da1726-555b-4905-b565-611392fb8e67-memcached-tls-certs\") pod \"memcached-0\" (UID: \"b6da1726-555b-4905-b565-611392fb8e67\") " pod="openstack/memcached-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.647379 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b6da1726-555b-4905-b565-611392fb8e67-config-data\") pod \"memcached-0\" (UID: \"b6da1726-555b-4905-b565-611392fb8e67\") " pod="openstack/memcached-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.647401 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdntx\" (UniqueName: \"kubernetes.io/projected/b6da1726-555b-4905-b565-611392fb8e67-kube-api-access-zdntx\") pod \"memcached-0\" (UID: \"b6da1726-555b-4905-b565-611392fb8e67\") " pod="openstack/memcached-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.648464 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b6da1726-555b-4905-b565-611392fb8e67-config-data\") pod \"memcached-0\" (UID: \"b6da1726-555b-4905-b565-611392fb8e67\") " pod="openstack/memcached-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.648502 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b6da1726-555b-4905-b565-611392fb8e67-kolla-config\") pod \"memcached-0\" (UID: \"b6da1726-555b-4905-b565-611392fb8e67\") " pod="openstack/memcached-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.650301 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6da1726-555b-4905-b565-611392fb8e67-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b6da1726-555b-4905-b565-611392fb8e67\") " pod="openstack/memcached-0" Oct 08 
22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.654810 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6da1726-555b-4905-b565-611392fb8e67-memcached-tls-certs\") pod \"memcached-0\" (UID: \"b6da1726-555b-4905-b565-611392fb8e67\") " pod="openstack/memcached-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.663828 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdntx\" (UniqueName: \"kubernetes.io/projected/b6da1726-555b-4905-b565-611392fb8e67-kube-api-access-zdntx\") pod \"memcached-0\" (UID: \"b6da1726-555b-4905-b565-611392fb8e67\") " pod="openstack/memcached-0" Oct 08 22:06:12 crc kubenswrapper[4739]: I1008 22:06:12.816999 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 08 22:06:14 crc kubenswrapper[4739]: I1008 22:06:14.113491 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 22:06:14 crc kubenswrapper[4739]: I1008 22:06:14.114380 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 22:06:14 crc kubenswrapper[4739]: I1008 22:06:14.116639 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-jnz8q" Oct 08 22:06:14 crc kubenswrapper[4739]: I1008 22:06:14.128944 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 22:06:14 crc kubenswrapper[4739]: I1008 22:06:14.170739 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmwfm\" (UniqueName: \"kubernetes.io/projected/868268e8-4f60-4c9b-aa4d-7239fae44090-kube-api-access-bmwfm\") pod \"kube-state-metrics-0\" (UID: \"868268e8-4f60-4c9b-aa4d-7239fae44090\") " pod="openstack/kube-state-metrics-0" Oct 08 22:06:14 crc kubenswrapper[4739]: I1008 22:06:14.272750 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmwfm\" (UniqueName: \"kubernetes.io/projected/868268e8-4f60-4c9b-aa4d-7239fae44090-kube-api-access-bmwfm\") pod \"kube-state-metrics-0\" (UID: \"868268e8-4f60-4c9b-aa4d-7239fae44090\") " pod="openstack/kube-state-metrics-0" Oct 08 22:06:14 crc kubenswrapper[4739]: I1008 22:06:14.292186 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmwfm\" (UniqueName: \"kubernetes.io/projected/868268e8-4f60-4c9b-aa4d-7239fae44090-kube-api-access-bmwfm\") pod \"kube-state-metrics-0\" (UID: \"868268e8-4f60-4c9b-aa4d-7239fae44090\") " pod="openstack/kube-state-metrics-0" Oct 08 22:06:14 crc kubenswrapper[4739]: I1008 22:06:14.433452 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.225031 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-mj9gb"] Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.226623 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mj9gb" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.229088 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-sv89p" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.229425 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.229593 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.234500 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mj9gb"] Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.287127 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-92kk7"] Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.289215 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-92kk7" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.297033 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-92kk7"] Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.340568 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eacfa01f-eb31-40c2-a163-3356c30772e3-var-run\") pod \"ovn-controller-ovs-92kk7\" (UID: \"eacfa01f-eb31-40c2-a163-3356c30772e3\") " pod="openstack/ovn-controller-ovs-92kk7" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.340628 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa-combined-ca-bundle\") pod \"ovn-controller-mj9gb\" (UID: \"0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa\") " pod="openstack/ovn-controller-mj9gb" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.340672 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/eacfa01f-eb31-40c2-a163-3356c30772e3-etc-ovs\") pod \"ovn-controller-ovs-92kk7\" (UID: \"eacfa01f-eb31-40c2-a163-3356c30772e3\") " pod="openstack/ovn-controller-ovs-92kk7" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.340708 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eacfa01f-eb31-40c2-a163-3356c30772e3-scripts\") pod \"ovn-controller-ovs-92kk7\" (UID: \"eacfa01f-eb31-40c2-a163-3356c30772e3\") " pod="openstack/ovn-controller-ovs-92kk7" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.340730 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa-var-run\") pod \"ovn-controller-mj9gb\" (UID: \"0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa\") " pod="openstack/ovn-controller-mj9gb" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.340757 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa-var-log-ovn\") pod \"ovn-controller-mj9gb\" (UID: \"0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa\") " pod="openstack/ovn-controller-mj9gb" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.340794 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cln8z\" (UniqueName: \"kubernetes.io/projected/0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa-kube-api-access-cln8z\") pod \"ovn-controller-mj9gb\" (UID: \"0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa\") " pod="openstack/ovn-controller-mj9gb" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.340855 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa-ovn-controller-tls-certs\") pod \"ovn-controller-mj9gb\" (UID: \"0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa\") " pod="openstack/ovn-controller-mj9gb" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.340884 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/eacfa01f-eb31-40c2-a163-3356c30772e3-var-lib\") pod \"ovn-controller-ovs-92kk7\" (UID: \"eacfa01f-eb31-40c2-a163-3356c30772e3\") " pod="openstack/ovn-controller-ovs-92kk7" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.340919 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/eacfa01f-eb31-40c2-a163-3356c30772e3-var-log\") pod \"ovn-controller-ovs-92kk7\" (UID: \"eacfa01f-eb31-40c2-a163-3356c30772e3\") " pod="openstack/ovn-controller-ovs-92kk7" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.340940 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfmt7\" (UniqueName: \"kubernetes.io/projected/eacfa01f-eb31-40c2-a163-3356c30772e3-kube-api-access-hfmt7\") pod \"ovn-controller-ovs-92kk7\" (UID: \"eacfa01f-eb31-40c2-a163-3356c30772e3\") " pod="openstack/ovn-controller-ovs-92kk7" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.340962 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa-var-run-ovn\") pod \"ovn-controller-mj9gb\" (UID: \"0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa\") " pod="openstack/ovn-controller-mj9gb" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.340985 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa-scripts\") pod \"ovn-controller-mj9gb\" (UID: \"0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa\") " pod="openstack/ovn-controller-mj9gb" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.442467 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa-ovn-controller-tls-certs\") pod \"ovn-controller-mj9gb\" (UID: \"0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa\") " pod="openstack/ovn-controller-mj9gb" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.444003 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/eacfa01f-eb31-40c2-a163-3356c30772e3-var-lib\") pod \"ovn-controller-ovs-92kk7\" (UID: \"eacfa01f-eb31-40c2-a163-3356c30772e3\") " pod="openstack/ovn-controller-ovs-92kk7" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.444049 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/eacfa01f-eb31-40c2-a163-3356c30772e3-var-log\") pod \"ovn-controller-ovs-92kk7\" (UID: \"eacfa01f-eb31-40c2-a163-3356c30772e3\") " pod="openstack/ovn-controller-ovs-92kk7" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.444064 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfmt7\" (UniqueName: \"kubernetes.io/projected/eacfa01f-eb31-40c2-a163-3356c30772e3-kube-api-access-hfmt7\") pod \"ovn-controller-ovs-92kk7\" (UID: \"eacfa01f-eb31-40c2-a163-3356c30772e3\") " pod="openstack/ovn-controller-ovs-92kk7" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.444083 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa-var-run-ovn\") pod \"ovn-controller-mj9gb\" (UID: \"0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa\") " pod="openstack/ovn-controller-mj9gb" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.444103 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa-scripts\") pod \"ovn-controller-mj9gb\" (UID: \"0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa\") " pod="openstack/ovn-controller-mj9gb" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.444160 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eacfa01f-eb31-40c2-a163-3356c30772e3-var-run\") pod \"ovn-controller-ovs-92kk7\" (UID: 
\"eacfa01f-eb31-40c2-a163-3356c30772e3\") " pod="openstack/ovn-controller-ovs-92kk7" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.444189 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa-combined-ca-bundle\") pod \"ovn-controller-mj9gb\" (UID: \"0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa\") " pod="openstack/ovn-controller-mj9gb" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.444229 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/eacfa01f-eb31-40c2-a163-3356c30772e3-etc-ovs\") pod \"ovn-controller-ovs-92kk7\" (UID: \"eacfa01f-eb31-40c2-a163-3356c30772e3\") " pod="openstack/ovn-controller-ovs-92kk7" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.444263 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eacfa01f-eb31-40c2-a163-3356c30772e3-scripts\") pod \"ovn-controller-ovs-92kk7\" (UID: \"eacfa01f-eb31-40c2-a163-3356c30772e3\") " pod="openstack/ovn-controller-ovs-92kk7" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.444277 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa-var-run\") pod \"ovn-controller-mj9gb\" (UID: \"0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa\") " pod="openstack/ovn-controller-mj9gb" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.444336 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa-var-log-ovn\") pod \"ovn-controller-mj9gb\" (UID: \"0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa\") " pod="openstack/ovn-controller-mj9gb" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.444363 4739 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cln8z\" (UniqueName: \"kubernetes.io/projected/0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa-kube-api-access-cln8z\") pod \"ovn-controller-mj9gb\" (UID: \"0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa\") " pod="openstack/ovn-controller-mj9gb" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.444497 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa-var-run-ovn\") pod \"ovn-controller-mj9gb\" (UID: \"0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa\") " pod="openstack/ovn-controller-mj9gb" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.444618 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/eacfa01f-eb31-40c2-a163-3356c30772e3-etc-ovs\") pod \"ovn-controller-ovs-92kk7\" (UID: \"eacfa01f-eb31-40c2-a163-3356c30772e3\") " pod="openstack/ovn-controller-ovs-92kk7" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.444640 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/eacfa01f-eb31-40c2-a163-3356c30772e3-var-lib\") pod \"ovn-controller-ovs-92kk7\" (UID: \"eacfa01f-eb31-40c2-a163-3356c30772e3\") " pod="openstack/ovn-controller-ovs-92kk7" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.444668 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa-var-run\") pod \"ovn-controller-mj9gb\" (UID: \"0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa\") " pod="openstack/ovn-controller-mj9gb" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.444655 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eacfa01f-eb31-40c2-a163-3356c30772e3-var-run\") pod 
\"ovn-controller-ovs-92kk7\" (UID: \"eacfa01f-eb31-40c2-a163-3356c30772e3\") " pod="openstack/ovn-controller-ovs-92kk7" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.444739 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa-var-log-ovn\") pod \"ovn-controller-mj9gb\" (UID: \"0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa\") " pod="openstack/ovn-controller-mj9gb" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.444875 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/eacfa01f-eb31-40c2-a163-3356c30772e3-var-log\") pod \"ovn-controller-ovs-92kk7\" (UID: \"eacfa01f-eb31-40c2-a163-3356c30772e3\") " pod="openstack/ovn-controller-ovs-92kk7" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.446276 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa-scripts\") pod \"ovn-controller-mj9gb\" (UID: \"0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa\") " pod="openstack/ovn-controller-mj9gb" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.446475 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eacfa01f-eb31-40c2-a163-3356c30772e3-scripts\") pod \"ovn-controller-ovs-92kk7\" (UID: \"eacfa01f-eb31-40c2-a163-3356c30772e3\") " pod="openstack/ovn-controller-ovs-92kk7" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.449544 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa-ovn-controller-tls-certs\") pod \"ovn-controller-mj9gb\" (UID: \"0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa\") " pod="openstack/ovn-controller-mj9gb" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.450160 4739 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa-combined-ca-bundle\") pod \"ovn-controller-mj9gb\" (UID: \"0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa\") " pod="openstack/ovn-controller-mj9gb" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.462526 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cln8z\" (UniqueName: \"kubernetes.io/projected/0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa-kube-api-access-cln8z\") pod \"ovn-controller-mj9gb\" (UID: \"0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa\") " pod="openstack/ovn-controller-mj9gb" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.463634 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfmt7\" (UniqueName: \"kubernetes.io/projected/eacfa01f-eb31-40c2-a163-3356c30772e3-kube-api-access-hfmt7\") pod \"ovn-controller-ovs-92kk7\" (UID: \"eacfa01f-eb31-40c2-a163-3356c30772e3\") " pod="openstack/ovn-controller-ovs-92kk7" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.545730 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mj9gb" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.607834 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-92kk7" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.747891 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.749205 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.751514 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.751527 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-dpxwt" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.751620 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.751790 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.751857 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.772945 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.851754 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/00537745-c30b-4fa9-be09-0edb09ff7138-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"00537745-c30b-4fa9-be09-0edb09ff7138\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.851865 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"00537745-c30b-4fa9-be09-0edb09ff7138\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.852059 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/00537745-c30b-4fa9-be09-0edb09ff7138-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"00537745-c30b-4fa9-be09-0edb09ff7138\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.852212 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00537745-c30b-4fa9-be09-0edb09ff7138-config\") pod \"ovsdbserver-nb-0\" (UID: \"00537745-c30b-4fa9-be09-0edb09ff7138\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.852386 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00537745-c30b-4fa9-be09-0edb09ff7138-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"00537745-c30b-4fa9-be09-0edb09ff7138\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.852502 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00537745-c30b-4fa9-be09-0edb09ff7138-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"00537745-c30b-4fa9-be09-0edb09ff7138\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.852622 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/00537745-c30b-4fa9-be09-0edb09ff7138-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"00537745-c30b-4fa9-be09-0edb09ff7138\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.852694 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj8jh\" (UniqueName: 
\"kubernetes.io/projected/00537745-c30b-4fa9-be09-0edb09ff7138-kube-api-access-tj8jh\") pod \"ovsdbserver-nb-0\" (UID: \"00537745-c30b-4fa9-be09-0edb09ff7138\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.954578 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/00537745-c30b-4fa9-be09-0edb09ff7138-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"00537745-c30b-4fa9-be09-0edb09ff7138\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.954628 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"00537745-c30b-4fa9-be09-0edb09ff7138\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.954664 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/00537745-c30b-4fa9-be09-0edb09ff7138-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"00537745-c30b-4fa9-be09-0edb09ff7138\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.954694 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00537745-c30b-4fa9-be09-0edb09ff7138-config\") pod \"ovsdbserver-nb-0\" (UID: \"00537745-c30b-4fa9-be09-0edb09ff7138\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.954729 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00537745-c30b-4fa9-be09-0edb09ff7138-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"00537745-c30b-4fa9-be09-0edb09ff7138\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:06:18 crc 
kubenswrapper[4739]: I1008 22:06:18.954753 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00537745-c30b-4fa9-be09-0edb09ff7138-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"00537745-c30b-4fa9-be09-0edb09ff7138\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.954791 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/00537745-c30b-4fa9-be09-0edb09ff7138-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"00537745-c30b-4fa9-be09-0edb09ff7138\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.954837 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj8jh\" (UniqueName: \"kubernetes.io/projected/00537745-c30b-4fa9-be09-0edb09ff7138-kube-api-access-tj8jh\") pod \"ovsdbserver-nb-0\" (UID: \"00537745-c30b-4fa9-be09-0edb09ff7138\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.955518 4739 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"00537745-c30b-4fa9-be09-0edb09ff7138\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-nb-0" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.958928 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/00537745-c30b-4fa9-be09-0edb09ff7138-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"00537745-c30b-4fa9-be09-0edb09ff7138\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.961741 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/00537745-c30b-4fa9-be09-0edb09ff7138-config\") pod \"ovsdbserver-nb-0\" (UID: \"00537745-c30b-4fa9-be09-0edb09ff7138\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.962806 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00537745-c30b-4fa9-be09-0edb09ff7138-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"00537745-c30b-4fa9-be09-0edb09ff7138\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.963294 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00537745-c30b-4fa9-be09-0edb09ff7138-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"00537745-c30b-4fa9-be09-0edb09ff7138\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.964814 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/00537745-c30b-4fa9-be09-0edb09ff7138-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"00537745-c30b-4fa9-be09-0edb09ff7138\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.971862 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/00537745-c30b-4fa9-be09-0edb09ff7138-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"00537745-c30b-4fa9-be09-0edb09ff7138\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.977277 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj8jh\" (UniqueName: \"kubernetes.io/projected/00537745-c30b-4fa9-be09-0edb09ff7138-kube-api-access-tj8jh\") pod \"ovsdbserver-nb-0\" (UID: \"00537745-c30b-4fa9-be09-0edb09ff7138\") " 
pod="openstack/ovsdbserver-nb-0" Oct 08 22:06:18 crc kubenswrapper[4739]: I1008 22:06:18.994560 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"00537745-c30b-4fa9-be09-0edb09ff7138\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:06:19 crc kubenswrapper[4739]: I1008 22:06:19.072458 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.169697 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.171316 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.173317 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.174186 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-w7s5z" Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.174501 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.176329 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.181409 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.277541 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15d5a814-0c23-4e0f-b750-9f886dc130b6-combined-ca-bundle\") pod 
\"ovsdbserver-sb-0\" (UID: \"15d5a814-0c23-4e0f-b750-9f886dc130b6\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.277844 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/15d5a814-0c23-4e0f-b750-9f886dc130b6-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"15d5a814-0c23-4e0f-b750-9f886dc130b6\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.278110 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/15d5a814-0c23-4e0f-b750-9f886dc130b6-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"15d5a814-0c23-4e0f-b750-9f886dc130b6\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.278342 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15d5a814-0c23-4e0f-b750-9f886dc130b6-config\") pod \"ovsdbserver-sb-0\" (UID: \"15d5a814-0c23-4e0f-b750-9f886dc130b6\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.278619 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15d5a814-0c23-4e0f-b750-9f886dc130b6-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"15d5a814-0c23-4e0f-b750-9f886dc130b6\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.278673 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkrr2\" (UniqueName: \"kubernetes.io/projected/15d5a814-0c23-4e0f-b750-9f886dc130b6-kube-api-access-vkrr2\") pod \"ovsdbserver-sb-0\" (UID: \"15d5a814-0c23-4e0f-b750-9f886dc130b6\") " 
pod="openstack/ovsdbserver-sb-0" Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.278868 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"15d5a814-0c23-4e0f-b750-9f886dc130b6\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.278989 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/15d5a814-0c23-4e0f-b750-9f886dc130b6-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"15d5a814-0c23-4e0f-b750-9f886dc130b6\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.380454 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15d5a814-0c23-4e0f-b750-9f886dc130b6-config\") pod \"ovsdbserver-sb-0\" (UID: \"15d5a814-0c23-4e0f-b750-9f886dc130b6\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.380541 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15d5a814-0c23-4e0f-b750-9f886dc130b6-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"15d5a814-0c23-4e0f-b750-9f886dc130b6\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.380579 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkrr2\" (UniqueName: \"kubernetes.io/projected/15d5a814-0c23-4e0f-b750-9f886dc130b6-kube-api-access-vkrr2\") pod \"ovsdbserver-sb-0\" (UID: \"15d5a814-0c23-4e0f-b750-9f886dc130b6\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.380631 4739 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"15d5a814-0c23-4e0f-b750-9f886dc130b6\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.380676 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/15d5a814-0c23-4e0f-b750-9f886dc130b6-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"15d5a814-0c23-4e0f-b750-9f886dc130b6\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.380723 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15d5a814-0c23-4e0f-b750-9f886dc130b6-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"15d5a814-0c23-4e0f-b750-9f886dc130b6\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.380755 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/15d5a814-0c23-4e0f-b750-9f886dc130b6-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"15d5a814-0c23-4e0f-b750-9f886dc130b6\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.380778 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/15d5a814-0c23-4e0f-b750-9f886dc130b6-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"15d5a814-0c23-4e0f-b750-9f886dc130b6\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.381748 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15d5a814-0c23-4e0f-b750-9f886dc130b6-config\") pod \"ovsdbserver-sb-0\" (UID: 
\"15d5a814-0c23-4e0f-b750-9f886dc130b6\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.381945 4739 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"15d5a814-0c23-4e0f-b750-9f886dc130b6\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-sb-0" Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.382216 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/15d5a814-0c23-4e0f-b750-9f886dc130b6-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"15d5a814-0c23-4e0f-b750-9f886dc130b6\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.382834 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15d5a814-0c23-4e0f-b750-9f886dc130b6-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"15d5a814-0c23-4e0f-b750-9f886dc130b6\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.386964 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/15d5a814-0c23-4e0f-b750-9f886dc130b6-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"15d5a814-0c23-4e0f-b750-9f886dc130b6\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.386964 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15d5a814-0c23-4e0f-b750-9f886dc130b6-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"15d5a814-0c23-4e0f-b750-9f886dc130b6\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.387034 4739 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/15d5a814-0c23-4e0f-b750-9f886dc130b6-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"15d5a814-0c23-4e0f-b750-9f886dc130b6\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.399547 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkrr2\" (UniqueName: \"kubernetes.io/projected/15d5a814-0c23-4e0f-b750-9f886dc130b6-kube-api-access-vkrr2\") pod \"ovsdbserver-sb-0\" (UID: \"15d5a814-0c23-4e0f-b750-9f886dc130b6\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.399823 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"15d5a814-0c23-4e0f-b750-9f886dc130b6\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:06:20 crc kubenswrapper[4739]: I1008 22:06:20.494733 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 08 22:06:22 crc kubenswrapper[4739]: I1008 22:06:22.380050 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 08 22:06:22 crc kubenswrapper[4739]: I1008 22:06:22.741086 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 08 22:06:22 crc kubenswrapper[4739]: I1008 22:06:22.753396 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 22:06:22 crc kubenswrapper[4739]: I1008 22:06:22.764679 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 22:06:22 crc kubenswrapper[4739]: I1008 22:06:22.774281 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 08 22:06:22 crc kubenswrapper[4739]: I1008 22:06:22.892684 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mj9gb"] Oct 08 22:06:22 crc kubenswrapper[4739]: W1008 22:06:22.912981 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e7dd9f9_b0ec_4795_9ad5_d4787becd6fa.slice/crio-ea521d7a2e91e4086e7f087192f01973bf47a6ea261f777b99f641a1df9dc740 WatchSource:0}: Error finding container ea521d7a2e91e4086e7f087192f01973bf47a6ea261f777b99f641a1df9dc740: Status 404 returned error can't find the container with id ea521d7a2e91e4086e7f087192f01973bf47a6ea261f777b99f641a1df9dc740 Oct 08 22:06:22 crc kubenswrapper[4739]: I1008 22:06:22.939344 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 22:06:23 crc kubenswrapper[4739]: I1008 22:06:23.019851 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-92kk7"] Oct 08 22:06:23 crc kubenswrapper[4739]: I1008 22:06:23.164769 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mj9gb" 
event={"ID":"0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa","Type":"ContainerStarted","Data":"ea521d7a2e91e4086e7f087192f01973bf47a6ea261f777b99f641a1df9dc740"} Oct 08 22:06:23 crc kubenswrapper[4739]: I1008 22:06:23.165943 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"868268e8-4f60-4c9b-aa4d-7239fae44090","Type":"ContainerStarted","Data":"d6d1426ca0fe6c96a2a0dada67ba6c0b9858396587e5a243320c190e73a01306"} Oct 08 22:06:23 crc kubenswrapper[4739]: I1008 22:06:23.167117 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3c6fc5d3-c48a-4d83-97f8-38d56264d769","Type":"ContainerStarted","Data":"df9933b0756336c102afa05cd42cffc58936178ab75bb7d6e5231c62051ac6d7"} Oct 08 22:06:23 crc kubenswrapper[4739]: I1008 22:06:23.168080 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"b6da1726-555b-4905-b565-611392fb8e67","Type":"ContainerStarted","Data":"429cb5ce43f5e9307b6084d33e38f82105e651e45ab19cc6451c9dac80fea5b0"} Oct 08 22:06:23 crc kubenswrapper[4739]: I1008 22:06:23.169338 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2909f95b-c276-43d0-93c0-18a78dbb974f","Type":"ContainerStarted","Data":"5539d51bb745d52a372bfc1118ae17f031ceb2e6ab59d8d875a373818e393084"} Oct 08 22:06:23 crc kubenswrapper[4739]: I1008 22:06:23.170547 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"17a6aba1-44fd-4b83-95b2-002a60e2291b","Type":"ContainerStarted","Data":"aef8fed2608376526a808c7ad0f04aa2f9be775c3745e15b3ec84e8a42755479"} Oct 08 22:06:23 crc kubenswrapper[4739]: I1008 22:06:23.171308 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-92kk7" event={"ID":"eacfa01f-eb31-40c2-a163-3356c30772e3","Type":"ContainerStarted","Data":"84e15f61ba70cd54bfb802a487058173638b7cffa4db029598e78efe9d21717e"} Oct 08 
22:06:23 crc kubenswrapper[4739]: I1008 22:06:23.172071 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e2532902-2058-4c79-b612-fd2737190f3e","Type":"ContainerStarted","Data":"9a194a30318fb1faa726ce05951e6780dd92fa7ac95097381923e64182927791"} Oct 08 22:06:23 crc kubenswrapper[4739]: I1008 22:06:23.703198 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 08 22:06:23 crc kubenswrapper[4739]: I1008 22:06:23.853706 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 08 22:06:23 crc kubenswrapper[4739]: W1008 22:06:23.856752 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15d5a814_0c23_4e0f_b750_9f886dc130b6.slice/crio-e2586feacec1bca3a6dea3254fdae5bc4d6a7fc5b66367aa6bf9a2662180dd5e WatchSource:0}: Error finding container e2586feacec1bca3a6dea3254fdae5bc4d6a7fc5b66367aa6bf9a2662180dd5e: Status 404 returned error can't find the container with id e2586feacec1bca3a6dea3254fdae5bc4d6a7fc5b66367aa6bf9a2662180dd5e Oct 08 22:06:24 crc kubenswrapper[4739]: I1008 22:06:24.183728 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"15d5a814-0c23-4e0f-b750-9f886dc130b6","Type":"ContainerStarted","Data":"e2586feacec1bca3a6dea3254fdae5bc4d6a7fc5b66367aa6bf9a2662180dd5e"} Oct 08 22:06:24 crc kubenswrapper[4739]: I1008 22:06:24.185505 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"00537745-c30b-4fa9-be09-0edb09ff7138","Type":"ContainerStarted","Data":"cf28e01160a1040fa68ddef323bfd7d31f4df070357815dbbd9ead2e1102dcc8"} Oct 08 22:06:26 crc kubenswrapper[4739]: E1008 22:06:26.094589 4739 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 08 22:06:26 crc kubenswrapper[4739]: E1008 22:06:26.095355 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-66lt6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSour
ce{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-nw242_openstack(dd9d9636-02eb-4252-a3d2-0d4e4ecea80d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 22:06:26 crc kubenswrapper[4739]: E1008 22:06:26.096664 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-nw242" podUID="dd9d9636-02eb-4252-a3d2-0d4e4ecea80d" Oct 08 22:06:26 crc kubenswrapper[4739]: E1008 22:06:26.377099 4739 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 08 22:06:26 crc kubenswrapper[4739]: E1008 22:06:26.377294 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zqtxs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-lrvj7_openstack(f18cfb5c-4d80-494f-aeed-a923bf9b49f3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 22:06:26 crc kubenswrapper[4739]: E1008 22:06:26.378554 4739 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-lrvj7" podUID="f18cfb5c-4d80-494f-aeed-a923bf9b49f3" Oct 08 22:06:26 crc kubenswrapper[4739]: I1008 22:06:26.587178 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-nw242" Oct 08 22:06:26 crc kubenswrapper[4739]: I1008 22:06:26.712673 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd9d9636-02eb-4252-a3d2-0d4e4ecea80d-config\") pod \"dd9d9636-02eb-4252-a3d2-0d4e4ecea80d\" (UID: \"dd9d9636-02eb-4252-a3d2-0d4e4ecea80d\") " Oct 08 22:06:26 crc kubenswrapper[4739]: I1008 22:06:26.712745 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66lt6\" (UniqueName: \"kubernetes.io/projected/dd9d9636-02eb-4252-a3d2-0d4e4ecea80d-kube-api-access-66lt6\") pod \"dd9d9636-02eb-4252-a3d2-0d4e4ecea80d\" (UID: \"dd9d9636-02eb-4252-a3d2-0d4e4ecea80d\") " Oct 08 22:06:26 crc kubenswrapper[4739]: I1008 22:06:26.713303 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd9d9636-02eb-4252-a3d2-0d4e4ecea80d-config" (OuterVolumeSpecName: "config") pod "dd9d9636-02eb-4252-a3d2-0d4e4ecea80d" (UID: "dd9d9636-02eb-4252-a3d2-0d4e4ecea80d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:06:26 crc kubenswrapper[4739]: I1008 22:06:26.713421 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd9d9636-02eb-4252-a3d2-0d4e4ecea80d-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:06:26 crc kubenswrapper[4739]: I1008 22:06:26.718454 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd9d9636-02eb-4252-a3d2-0d4e4ecea80d-kube-api-access-66lt6" (OuterVolumeSpecName: "kube-api-access-66lt6") pod "dd9d9636-02eb-4252-a3d2-0d4e4ecea80d" (UID: "dd9d9636-02eb-4252-a3d2-0d4e4ecea80d"). InnerVolumeSpecName "kube-api-access-66lt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:06:26 crc kubenswrapper[4739]: I1008 22:06:26.815104 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66lt6\" (UniqueName: \"kubernetes.io/projected/dd9d9636-02eb-4252-a3d2-0d4e4ecea80d-kube-api-access-66lt6\") on node \"crc\" DevicePath \"\"" Oct 08 22:06:27 crc kubenswrapper[4739]: I1008 22:06:27.226215 4739 generic.go:334] "Generic (PLEG): container finished" podID="56d84fd1-be79-439a-a63e-349430395229" containerID="a686208cacdcadf8924b2ee3fdd53fa07cc995aca1dfa4d2739d7712d029564a" exitCode=0 Oct 08 22:06:27 crc kubenswrapper[4739]: I1008 22:06:27.226267 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-tknt8" event={"ID":"56d84fd1-be79-439a-a63e-349430395229","Type":"ContainerDied","Data":"a686208cacdcadf8924b2ee3fdd53fa07cc995aca1dfa4d2739d7712d029564a"} Oct 08 22:06:27 crc kubenswrapper[4739]: I1008 22:06:27.228286 4739 generic.go:334] "Generic (PLEG): container finished" podID="de90f9c1-7173-4878-b0c5-6b8734353119" containerID="e7a7bc13f41e1c259ca06242c223ae48fe46eae39e564c757d7bebd2ad69898a" exitCode=0 Oct 08 22:06:27 crc kubenswrapper[4739]: I1008 22:06:27.228337 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-666b6646f7-p5rwg" event={"ID":"de90f9c1-7173-4878-b0c5-6b8734353119","Type":"ContainerDied","Data":"e7a7bc13f41e1c259ca06242c223ae48fe46eae39e564c757d7bebd2ad69898a"} Oct 08 22:06:27 crc kubenswrapper[4739]: I1008 22:06:27.229531 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-nw242" event={"ID":"dd9d9636-02eb-4252-a3d2-0d4e4ecea80d","Type":"ContainerDied","Data":"e5533674163f53981182322c07e80641ea7230b37301a67c9deadc5604ae4100"} Oct 08 22:06:27 crc kubenswrapper[4739]: I1008 22:06:27.229559 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-nw242" Oct 08 22:06:27 crc kubenswrapper[4739]: I1008 22:06:27.304500 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-nw242"] Oct 08 22:06:27 crc kubenswrapper[4739]: I1008 22:06:27.309166 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-nw242"] Oct 08 22:06:27 crc kubenswrapper[4739]: I1008 22:06:27.572323 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-lrvj7" Oct 08 22:06:27 crc kubenswrapper[4739]: I1008 22:06:27.641373 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqtxs\" (UniqueName: \"kubernetes.io/projected/f18cfb5c-4d80-494f-aeed-a923bf9b49f3-kube-api-access-zqtxs\") pod \"f18cfb5c-4d80-494f-aeed-a923bf9b49f3\" (UID: \"f18cfb5c-4d80-494f-aeed-a923bf9b49f3\") " Oct 08 22:06:27 crc kubenswrapper[4739]: I1008 22:06:27.641708 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f18cfb5c-4d80-494f-aeed-a923bf9b49f3-dns-svc\") pod \"f18cfb5c-4d80-494f-aeed-a923bf9b49f3\" (UID: \"f18cfb5c-4d80-494f-aeed-a923bf9b49f3\") " Oct 08 22:06:27 crc kubenswrapper[4739]: I1008 22:06:27.641828 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f18cfb5c-4d80-494f-aeed-a923bf9b49f3-config\") pod \"f18cfb5c-4d80-494f-aeed-a923bf9b49f3\" (UID: \"f18cfb5c-4d80-494f-aeed-a923bf9b49f3\") " Oct 08 22:06:27 crc kubenswrapper[4739]: I1008 22:06:27.643585 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f18cfb5c-4d80-494f-aeed-a923bf9b49f3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f18cfb5c-4d80-494f-aeed-a923bf9b49f3" (UID: "f18cfb5c-4d80-494f-aeed-a923bf9b49f3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:06:27 crc kubenswrapper[4739]: I1008 22:06:27.644131 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f18cfb5c-4d80-494f-aeed-a923bf9b49f3-config" (OuterVolumeSpecName: "config") pod "f18cfb5c-4d80-494f-aeed-a923bf9b49f3" (UID: "f18cfb5c-4d80-494f-aeed-a923bf9b49f3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:06:27 crc kubenswrapper[4739]: I1008 22:06:27.663735 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f18cfb5c-4d80-494f-aeed-a923bf9b49f3-kube-api-access-zqtxs" (OuterVolumeSpecName: "kube-api-access-zqtxs") pod "f18cfb5c-4d80-494f-aeed-a923bf9b49f3" (UID: "f18cfb5c-4d80-494f-aeed-a923bf9b49f3"). InnerVolumeSpecName "kube-api-access-zqtxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:06:27 crc kubenswrapper[4739]: I1008 22:06:27.744037 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqtxs\" (UniqueName: \"kubernetes.io/projected/f18cfb5c-4d80-494f-aeed-a923bf9b49f3-kube-api-access-zqtxs\") on node \"crc\" DevicePath \"\"" Oct 08 22:06:27 crc kubenswrapper[4739]: I1008 22:06:27.744068 4739 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f18cfb5c-4d80-494f-aeed-a923bf9b49f3-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 22:06:27 crc kubenswrapper[4739]: I1008 22:06:27.744078 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f18cfb5c-4d80-494f-aeed-a923bf9b49f3-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:06:27 crc kubenswrapper[4739]: I1008 22:06:27.831129 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd9d9636-02eb-4252-a3d2-0d4e4ecea80d" path="/var/lib/kubelet/pods/dd9d9636-02eb-4252-a3d2-0d4e4ecea80d/volumes" Oct 08 22:06:28 crc kubenswrapper[4739]: I1008 22:06:28.239187 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-lrvj7" Oct 08 22:06:28 crc kubenswrapper[4739]: I1008 22:06:28.239138 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-lrvj7" event={"ID":"f18cfb5c-4d80-494f-aeed-a923bf9b49f3","Type":"ContainerDied","Data":"130d4d8e3728dc1a0706972815ae15739be46cc79b27bc5b38f66c9b8ed60c36"} Oct 08 22:06:28 crc kubenswrapper[4739]: I1008 22:06:28.279869 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lrvj7"] Oct 08 22:06:28 crc kubenswrapper[4739]: I1008 22:06:28.284402 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lrvj7"] Oct 08 22:06:29 crc kubenswrapper[4739]: I1008 22:06:29.831173 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f18cfb5c-4d80-494f-aeed-a923bf9b49f3" path="/var/lib/kubelet/pods/f18cfb5c-4d80-494f-aeed-a923bf9b49f3/volumes" Oct 08 22:06:29 crc kubenswrapper[4739]: I1008 22:06:29.976486 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-hld4g"] Oct 08 22:06:29 crc kubenswrapper[4739]: I1008 22:06:29.978034 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-hld4g" Oct 08 22:06:29 crc kubenswrapper[4739]: I1008 22:06:29.982002 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 08 22:06:29 crc kubenswrapper[4739]: I1008 22:06:29.990274 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-hld4g"] Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.089752 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-p5rwg"] Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.108320 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-njps4"] Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.110034 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-njps4" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.115045 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.115918 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc4ea068-4061-435b-8e62-11b14a3e1ec4-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hld4g\" (UID: \"bc4ea068-4061-435b-8e62-11b14a3e1ec4\") " pod="openstack/ovn-controller-metrics-hld4g" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.115992 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/bc4ea068-4061-435b-8e62-11b14a3e1ec4-ovs-rundir\") pod \"ovn-controller-metrics-hld4g\" (UID: \"bc4ea068-4061-435b-8e62-11b14a3e1ec4\") " pod="openstack/ovn-controller-metrics-hld4g" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.116097 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz6wr\" (UniqueName: \"kubernetes.io/projected/bc4ea068-4061-435b-8e62-11b14a3e1ec4-kube-api-access-kz6wr\") pod \"ovn-controller-metrics-hld4g\" (UID: \"bc4ea068-4061-435b-8e62-11b14a3e1ec4\") " pod="openstack/ovn-controller-metrics-hld4g" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.116212 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc4ea068-4061-435b-8e62-11b14a3e1ec4-combined-ca-bundle\") pod \"ovn-controller-metrics-hld4g\" (UID: \"bc4ea068-4061-435b-8e62-11b14a3e1ec4\") " pod="openstack/ovn-controller-metrics-hld4g" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.116264 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc4ea068-4061-435b-8e62-11b14a3e1ec4-config\") pod \"ovn-controller-metrics-hld4g\" (UID: \"bc4ea068-4061-435b-8e62-11b14a3e1ec4\") " pod="openstack/ovn-controller-metrics-hld4g" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.116317 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/bc4ea068-4061-435b-8e62-11b14a3e1ec4-ovn-rundir\") pod \"ovn-controller-metrics-hld4g\" (UID: \"bc4ea068-4061-435b-8e62-11b14a3e1ec4\") " pod="openstack/ovn-controller-metrics-hld4g" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.131623 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-njps4"] Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.213160 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-tknt8"] Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.219888 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/bc4ea068-4061-435b-8e62-11b14a3e1ec4-ovs-rundir\") pod \"ovn-controller-metrics-hld4g\" (UID: \"bc4ea068-4061-435b-8e62-11b14a3e1ec4\") " pod="openstack/ovn-controller-metrics-hld4g" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.219958 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9452bca1-f672-4a2f-b979-a6b47ea051b8-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-njps4\" (UID: \"9452bca1-f672-4a2f-b979-a6b47ea051b8\") " pod="openstack/dnsmasq-dns-7fd796d7df-njps4" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.219981 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9452bca1-f672-4a2f-b979-a6b47ea051b8-config\") pod \"dnsmasq-dns-7fd796d7df-njps4\" (UID: \"9452bca1-f672-4a2f-b979-a6b47ea051b8\") " pod="openstack/dnsmasq-dns-7fd796d7df-njps4" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.220050 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz6wr\" (UniqueName: \"kubernetes.io/projected/bc4ea068-4061-435b-8e62-11b14a3e1ec4-kube-api-access-kz6wr\") pod \"ovn-controller-metrics-hld4g\" (UID: \"bc4ea068-4061-435b-8e62-11b14a3e1ec4\") " pod="openstack/ovn-controller-metrics-hld4g" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.220083 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7lwn\" (UniqueName: \"kubernetes.io/projected/9452bca1-f672-4a2f-b979-a6b47ea051b8-kube-api-access-k7lwn\") pod \"dnsmasq-dns-7fd796d7df-njps4\" (UID: \"9452bca1-f672-4a2f-b979-a6b47ea051b8\") " pod="openstack/dnsmasq-dns-7fd796d7df-njps4" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.220127 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc4ea068-4061-435b-8e62-11b14a3e1ec4-combined-ca-bundle\") pod \"ovn-controller-metrics-hld4g\" (UID: \"bc4ea068-4061-435b-8e62-11b14a3e1ec4\") " pod="openstack/ovn-controller-metrics-hld4g" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.220182 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc4ea068-4061-435b-8e62-11b14a3e1ec4-config\") pod \"ovn-controller-metrics-hld4g\" (UID: \"bc4ea068-4061-435b-8e62-11b14a3e1ec4\") " pod="openstack/ovn-controller-metrics-hld4g" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.220256 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/bc4ea068-4061-435b-8e62-11b14a3e1ec4-ovn-rundir\") pod \"ovn-controller-metrics-hld4g\" (UID: \"bc4ea068-4061-435b-8e62-11b14a3e1ec4\") " pod="openstack/ovn-controller-metrics-hld4g" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.220672 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc4ea068-4061-435b-8e62-11b14a3e1ec4-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hld4g\" (UID: \"bc4ea068-4061-435b-8e62-11b14a3e1ec4\") " pod="openstack/ovn-controller-metrics-hld4g" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.220730 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9452bca1-f672-4a2f-b979-a6b47ea051b8-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-njps4\" (UID: \"9452bca1-f672-4a2f-b979-a6b47ea051b8\") " pod="openstack/dnsmasq-dns-7fd796d7df-njps4" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.221009 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/bc4ea068-4061-435b-8e62-11b14a3e1ec4-ovs-rundir\") pod \"ovn-controller-metrics-hld4g\" (UID: \"bc4ea068-4061-435b-8e62-11b14a3e1ec4\") " pod="openstack/ovn-controller-metrics-hld4g" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.221163 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/bc4ea068-4061-435b-8e62-11b14a3e1ec4-ovn-rundir\") pod \"ovn-controller-metrics-hld4g\" (UID: \"bc4ea068-4061-435b-8e62-11b14a3e1ec4\") " pod="openstack/ovn-controller-metrics-hld4g" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.222221 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc4ea068-4061-435b-8e62-11b14a3e1ec4-config\") pod \"ovn-controller-metrics-hld4g\" (UID: \"bc4ea068-4061-435b-8e62-11b14a3e1ec4\") " pod="openstack/ovn-controller-metrics-hld4g" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.230312 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc4ea068-4061-435b-8e62-11b14a3e1ec4-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hld4g\" (UID: \"bc4ea068-4061-435b-8e62-11b14a3e1ec4\") " pod="openstack/ovn-controller-metrics-hld4g" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.241391 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-pkhdx"] Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.242977 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-pkhdx" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.248027 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc4ea068-4061-435b-8e62-11b14a3e1ec4-combined-ca-bundle\") pod \"ovn-controller-metrics-hld4g\" (UID: \"bc4ea068-4061-435b-8e62-11b14a3e1ec4\") " pod="openstack/ovn-controller-metrics-hld4g" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.248459 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.255592 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz6wr\" (UniqueName: \"kubernetes.io/projected/bc4ea068-4061-435b-8e62-11b14a3e1ec4-kube-api-access-kz6wr\") pod \"ovn-controller-metrics-hld4g\" (UID: \"bc4ea068-4061-435b-8e62-11b14a3e1ec4\") " pod="openstack/ovn-controller-metrics-hld4g" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.260160 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-pkhdx"] Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.319203 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-hld4g" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.322547 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1233f9c5-10c7-4244-98ec-17d6eb4c1c15-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-pkhdx\" (UID: \"1233f9c5-10c7-4244-98ec-17d6eb4c1c15\") " pod="openstack/dnsmasq-dns-86db49b7ff-pkhdx" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.323697 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9452bca1-f672-4a2f-b979-a6b47ea051b8-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-njps4\" (UID: \"9452bca1-f672-4a2f-b979-a6b47ea051b8\") " pod="openstack/dnsmasq-dns-7fd796d7df-njps4" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.323737 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1233f9c5-10c7-4244-98ec-17d6eb4c1c15-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-pkhdx\" (UID: \"1233f9c5-10c7-4244-98ec-17d6eb4c1c15\") " pod="openstack/dnsmasq-dns-86db49b7ff-pkhdx" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.323774 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9452bca1-f672-4a2f-b979-a6b47ea051b8-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-njps4\" (UID: \"9452bca1-f672-4a2f-b979-a6b47ea051b8\") " pod="openstack/dnsmasq-dns-7fd796d7df-njps4" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.323792 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1233f9c5-10c7-4244-98ec-17d6eb4c1c15-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-pkhdx\" (UID: \"1233f9c5-10c7-4244-98ec-17d6eb4c1c15\") " 
pod="openstack/dnsmasq-dns-86db49b7ff-pkhdx" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.323814 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9452bca1-f672-4a2f-b979-a6b47ea051b8-config\") pod \"dnsmasq-dns-7fd796d7df-njps4\" (UID: \"9452bca1-f672-4a2f-b979-a6b47ea051b8\") " pod="openstack/dnsmasq-dns-7fd796d7df-njps4" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.323834 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zv6z\" (UniqueName: \"kubernetes.io/projected/1233f9c5-10c7-4244-98ec-17d6eb4c1c15-kube-api-access-8zv6z\") pod \"dnsmasq-dns-86db49b7ff-pkhdx\" (UID: \"1233f9c5-10c7-4244-98ec-17d6eb4c1c15\") " pod="openstack/dnsmasq-dns-86db49b7ff-pkhdx" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.323869 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1233f9c5-10c7-4244-98ec-17d6eb4c1c15-config\") pod \"dnsmasq-dns-86db49b7ff-pkhdx\" (UID: \"1233f9c5-10c7-4244-98ec-17d6eb4c1c15\") " pod="openstack/dnsmasq-dns-86db49b7ff-pkhdx" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.323891 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7lwn\" (UniqueName: \"kubernetes.io/projected/9452bca1-f672-4a2f-b979-a6b47ea051b8-kube-api-access-k7lwn\") pod \"dnsmasq-dns-7fd796d7df-njps4\" (UID: \"9452bca1-f672-4a2f-b979-a6b47ea051b8\") " pod="openstack/dnsmasq-dns-7fd796d7df-njps4" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.324597 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9452bca1-f672-4a2f-b979-a6b47ea051b8-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-njps4\" (UID: \"9452bca1-f672-4a2f-b979-a6b47ea051b8\") " 
pod="openstack/dnsmasq-dns-7fd796d7df-njps4" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.325701 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9452bca1-f672-4a2f-b979-a6b47ea051b8-config\") pod \"dnsmasq-dns-7fd796d7df-njps4\" (UID: \"9452bca1-f672-4a2f-b979-a6b47ea051b8\") " pod="openstack/dnsmasq-dns-7fd796d7df-njps4" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.325807 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9452bca1-f672-4a2f-b979-a6b47ea051b8-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-njps4\" (UID: \"9452bca1-f672-4a2f-b979-a6b47ea051b8\") " pod="openstack/dnsmasq-dns-7fd796d7df-njps4" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.341444 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7lwn\" (UniqueName: \"kubernetes.io/projected/9452bca1-f672-4a2f-b979-a6b47ea051b8-kube-api-access-k7lwn\") pod \"dnsmasq-dns-7fd796d7df-njps4\" (UID: \"9452bca1-f672-4a2f-b979-a6b47ea051b8\") " pod="openstack/dnsmasq-dns-7fd796d7df-njps4" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.425990 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1233f9c5-10c7-4244-98ec-17d6eb4c1c15-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-pkhdx\" (UID: \"1233f9c5-10c7-4244-98ec-17d6eb4c1c15\") " pod="openstack/dnsmasq-dns-86db49b7ff-pkhdx" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.426486 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1233f9c5-10c7-4244-98ec-17d6eb4c1c15-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-pkhdx\" (UID: \"1233f9c5-10c7-4244-98ec-17d6eb4c1c15\") " pod="openstack/dnsmasq-dns-86db49b7ff-pkhdx" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 
22:06:30.426519 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1233f9c5-10c7-4244-98ec-17d6eb4c1c15-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-pkhdx\" (UID: \"1233f9c5-10c7-4244-98ec-17d6eb4c1c15\") " pod="openstack/dnsmasq-dns-86db49b7ff-pkhdx" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.426545 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zv6z\" (UniqueName: \"kubernetes.io/projected/1233f9c5-10c7-4244-98ec-17d6eb4c1c15-kube-api-access-8zv6z\") pod \"dnsmasq-dns-86db49b7ff-pkhdx\" (UID: \"1233f9c5-10c7-4244-98ec-17d6eb4c1c15\") " pod="openstack/dnsmasq-dns-86db49b7ff-pkhdx" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.426574 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1233f9c5-10c7-4244-98ec-17d6eb4c1c15-config\") pod \"dnsmasq-dns-86db49b7ff-pkhdx\" (UID: \"1233f9c5-10c7-4244-98ec-17d6eb4c1c15\") " pod="openstack/dnsmasq-dns-86db49b7ff-pkhdx" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.426878 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1233f9c5-10c7-4244-98ec-17d6eb4c1c15-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-pkhdx\" (UID: \"1233f9c5-10c7-4244-98ec-17d6eb4c1c15\") " pod="openstack/dnsmasq-dns-86db49b7ff-pkhdx" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.427225 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1233f9c5-10c7-4244-98ec-17d6eb4c1c15-config\") pod \"dnsmasq-dns-86db49b7ff-pkhdx\" (UID: \"1233f9c5-10c7-4244-98ec-17d6eb4c1c15\") " pod="openstack/dnsmasq-dns-86db49b7ff-pkhdx" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.427651 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1233f9c5-10c7-4244-98ec-17d6eb4c1c15-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-pkhdx\" (UID: \"1233f9c5-10c7-4244-98ec-17d6eb4c1c15\") " pod="openstack/dnsmasq-dns-86db49b7ff-pkhdx" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.427781 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1233f9c5-10c7-4244-98ec-17d6eb4c1c15-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-pkhdx\" (UID: \"1233f9c5-10c7-4244-98ec-17d6eb4c1c15\") " pod="openstack/dnsmasq-dns-86db49b7ff-pkhdx" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.432973 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-njps4" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.458335 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zv6z\" (UniqueName: \"kubernetes.io/projected/1233f9c5-10c7-4244-98ec-17d6eb4c1c15-kube-api-access-8zv6z\") pod \"dnsmasq-dns-86db49b7ff-pkhdx\" (UID: \"1233f9c5-10c7-4244-98ec-17d6eb4c1c15\") " pod="openstack/dnsmasq-dns-86db49b7ff-pkhdx" Oct 08 22:06:30 crc kubenswrapper[4739]: I1008 22:06:30.605853 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-pkhdx" Oct 08 22:06:36 crc kubenswrapper[4739]: I1008 22:06:36.197230 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-hld4g"] Oct 08 22:06:36 crc kubenswrapper[4739]: W1008 22:06:36.249576 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc4ea068_4061_435b_8e62_11b14a3e1ec4.slice/crio-2d81055a760421e4244c70b2e49332d2a1b2545c4403e0525d6ea370ae7b0117 WatchSource:0}: Error finding container 2d81055a760421e4244c70b2e49332d2a1b2545c4403e0525d6ea370ae7b0117: Status 404 returned error can't find the container with id 2d81055a760421e4244c70b2e49332d2a1b2545c4403e0525d6ea370ae7b0117 Oct 08 22:06:36 crc kubenswrapper[4739]: I1008 22:06:36.299876 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-njps4"] Oct 08 22:06:36 crc kubenswrapper[4739]: I1008 22:06:36.311290 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-pkhdx"] Oct 08 22:06:36 crc kubenswrapper[4739]: W1008 22:06:36.324469 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9452bca1_f672_4a2f_b979_a6b47ea051b8.slice/crio-770515c4be3b4ccff1034bec2cb8fb32091fff738624cda5a827dad914dd84f4 WatchSource:0}: Error finding container 770515c4be3b4ccff1034bec2cb8fb32091fff738624cda5a827dad914dd84f4: Status 404 returned error can't find the container with id 770515c4be3b4ccff1034bec2cb8fb32091fff738624cda5a827dad914dd84f4 Oct 08 22:06:36 crc kubenswrapper[4739]: I1008 22:06:36.333452 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"b6da1726-555b-4905-b565-611392fb8e67","Type":"ContainerStarted","Data":"3d84430deebd10672f670e2aa64733112b38ebcb891592e40b14f803a743bffe"} Oct 08 22:06:36 crc kubenswrapper[4739]: I1008 
22:06:36.333982 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 08 22:06:36 crc kubenswrapper[4739]: I1008 22:06:36.342628 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"15d5a814-0c23-4e0f-b750-9f886dc130b6","Type":"ContainerStarted","Data":"88bf587c54c3a551852edf4b218bc7fa8aac244267ee0e4e84749f9742da1732"} Oct 08 22:06:36 crc kubenswrapper[4739]: I1008 22:06:36.347748 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3c6fc5d3-c48a-4d83-97f8-38d56264d769","Type":"ContainerStarted","Data":"1966a0619f0b59e96377e0358cf458a7f122837bf1f71947f935b544162c4c5a"} Oct 08 22:06:36 crc kubenswrapper[4739]: I1008 22:06:36.350745 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"00537745-c30b-4fa9-be09-0edb09ff7138","Type":"ContainerStarted","Data":"ef763e4964a42b9e8f5225f60648b07a1805630c9986acbf725c383f7ce8ef45"} Oct 08 22:06:36 crc kubenswrapper[4739]: I1008 22:06:36.358726 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=11.524781158 podStartE2EDuration="24.358704523s" podCreationTimestamp="2025-10-08 22:06:12 +0000 UTC" firstStartedPulling="2025-10-08 22:06:22.733675458 +0000 UTC m=+1082.559061218" lastFinishedPulling="2025-10-08 22:06:35.567598843 +0000 UTC m=+1095.392984583" observedRunningTime="2025-10-08 22:06:36.35372841 +0000 UTC m=+1096.179114160" watchObservedRunningTime="2025-10-08 22:06:36.358704523 +0000 UTC m=+1096.184090273" Oct 08 22:06:36 crc kubenswrapper[4739]: I1008 22:06:36.359587 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-p5rwg" event={"ID":"de90f9c1-7173-4878-b0c5-6b8734353119","Type":"ContainerStarted","Data":"0bd4436ddd8d8e8f5c258ac9d24d0d825c51dc11bfe7121e11219c9f4ad9002a"} Oct 08 22:06:36 crc kubenswrapper[4739]: I1008 
22:06:36.359712 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-p5rwg" podUID="de90f9c1-7173-4878-b0c5-6b8734353119" containerName="dnsmasq-dns" containerID="cri-o://0bd4436ddd8d8e8f5c258ac9d24d0d825c51dc11bfe7121e11219c9f4ad9002a" gracePeriod=10 Oct 08 22:06:36 crc kubenswrapper[4739]: I1008 22:06:36.359838 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-p5rwg" Oct 08 22:06:36 crc kubenswrapper[4739]: I1008 22:06:36.373639 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"868268e8-4f60-4c9b-aa4d-7239fae44090","Type":"ContainerStarted","Data":"576b22ec8a8136b4908a088523224d9593320cdbbc2c82c7ef5afd65a040a8fc"} Oct 08 22:06:36 crc kubenswrapper[4739]: I1008 22:06:36.373714 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 08 22:06:36 crc kubenswrapper[4739]: I1008 22:06:36.382706 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-tknt8" event={"ID":"56d84fd1-be79-439a-a63e-349430395229","Type":"ContainerStarted","Data":"c21a90e6a1bce26e5a3d934e3c723c88d3e1d7b52a5257413ac2fb37d409b071"} Oct 08 22:06:36 crc kubenswrapper[4739]: I1008 22:06:36.382761 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-tknt8" podUID="56d84fd1-be79-439a-a63e-349430395229" containerName="dnsmasq-dns" containerID="cri-o://c21a90e6a1bce26e5a3d934e3c723c88d3e1d7b52a5257413ac2fb37d409b071" gracePeriod=10 Oct 08 22:06:36 crc kubenswrapper[4739]: I1008 22:06:36.382791 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-tknt8" Oct 08 22:06:36 crc kubenswrapper[4739]: I1008 22:06:36.387089 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-hld4g" 
event={"ID":"bc4ea068-4061-435b-8e62-11b14a3e1ec4","Type":"ContainerStarted","Data":"2d81055a760421e4244c70b2e49332d2a1b2545c4403e0525d6ea370ae7b0117"} Oct 08 22:06:36 crc kubenswrapper[4739]: I1008 22:06:36.395160 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-92kk7" event={"ID":"eacfa01f-eb31-40c2-a163-3356c30772e3","Type":"ContainerStarted","Data":"fbf4f5ae95674ab7bfe15f1bdb8d3a0c8cacc259f4936911ebf43aeb50fb1af7"} Oct 08 22:06:36 crc kubenswrapper[4739]: I1008 22:06:36.431881 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-p5rwg" podStartSLOduration=11.802661667 podStartE2EDuration="29.431865645s" podCreationTimestamp="2025-10-08 22:06:07 +0000 UTC" firstStartedPulling="2025-10-08 22:06:08.832595211 +0000 UTC m=+1068.657980961" lastFinishedPulling="2025-10-08 22:06:26.461799179 +0000 UTC m=+1086.287184939" observedRunningTime="2025-10-08 22:06:36.429361943 +0000 UTC m=+1096.254747693" watchObservedRunningTime="2025-10-08 22:06:36.431865645 +0000 UTC m=+1096.257251395" Oct 08 22:06:36 crc kubenswrapper[4739]: I1008 22:06:36.449527 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=9.448612554 podStartE2EDuration="22.44950995s" podCreationTimestamp="2025-10-08 22:06:14 +0000 UTC" firstStartedPulling="2025-10-08 22:06:22.736856576 +0000 UTC m=+1082.562242356" lastFinishedPulling="2025-10-08 22:06:35.737753992 +0000 UTC m=+1095.563139752" observedRunningTime="2025-10-08 22:06:36.447322705 +0000 UTC m=+1096.272708455" watchObservedRunningTime="2025-10-08 22:06:36.44950995 +0000 UTC m=+1096.274895700" Oct 08 22:06:36 crc kubenswrapper[4739]: I1008 22:06:36.462192 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-tknt8" podStartSLOduration=11.00293742 podStartE2EDuration="28.462175401s" podCreationTimestamp="2025-10-08 22:06:08 
+0000 UTC" firstStartedPulling="2025-10-08 22:06:08.862507089 +0000 UTC m=+1068.687892839" lastFinishedPulling="2025-10-08 22:06:26.32174504 +0000 UTC m=+1086.147130820" observedRunningTime="2025-10-08 22:06:36.461248068 +0000 UTC m=+1096.286633818" watchObservedRunningTime="2025-10-08 22:06:36.462175401 +0000 UTC m=+1096.287561151" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.030735 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-tknt8" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.034050 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-p5rwg" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.185105 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de90f9c1-7173-4878-b0c5-6b8734353119-config\") pod \"de90f9c1-7173-4878-b0c5-6b8734353119\" (UID: \"de90f9c1-7173-4878-b0c5-6b8734353119\") " Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.185238 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hs8th\" (UniqueName: \"kubernetes.io/projected/de90f9c1-7173-4878-b0c5-6b8734353119-kube-api-access-hs8th\") pod \"de90f9c1-7173-4878-b0c5-6b8734353119\" (UID: \"de90f9c1-7173-4878-b0c5-6b8734353119\") " Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.185366 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4bkg\" (UniqueName: \"kubernetes.io/projected/56d84fd1-be79-439a-a63e-349430395229-kube-api-access-m4bkg\") pod \"56d84fd1-be79-439a-a63e-349430395229\" (UID: \"56d84fd1-be79-439a-a63e-349430395229\") " Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.185406 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/56d84fd1-be79-439a-a63e-349430395229-config\") pod \"56d84fd1-be79-439a-a63e-349430395229\" (UID: \"56d84fd1-be79-439a-a63e-349430395229\") " Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.185438 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de90f9c1-7173-4878-b0c5-6b8734353119-dns-svc\") pod \"de90f9c1-7173-4878-b0c5-6b8734353119\" (UID: \"de90f9c1-7173-4878-b0c5-6b8734353119\") " Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.185472 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56d84fd1-be79-439a-a63e-349430395229-dns-svc\") pod \"56d84fd1-be79-439a-a63e-349430395229\" (UID: \"56d84fd1-be79-439a-a63e-349430395229\") " Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.192303 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56d84fd1-be79-439a-a63e-349430395229-kube-api-access-m4bkg" (OuterVolumeSpecName: "kube-api-access-m4bkg") pod "56d84fd1-be79-439a-a63e-349430395229" (UID: "56d84fd1-be79-439a-a63e-349430395229"). InnerVolumeSpecName "kube-api-access-m4bkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.193351 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de90f9c1-7173-4878-b0c5-6b8734353119-kube-api-access-hs8th" (OuterVolumeSpecName: "kube-api-access-hs8th") pod "de90f9c1-7173-4878-b0c5-6b8734353119" (UID: "de90f9c1-7173-4878-b0c5-6b8734353119"). InnerVolumeSpecName "kube-api-access-hs8th". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.226368 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56d84fd1-be79-439a-a63e-349430395229-config" (OuterVolumeSpecName: "config") pod "56d84fd1-be79-439a-a63e-349430395229" (UID: "56d84fd1-be79-439a-a63e-349430395229"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.230155 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de90f9c1-7173-4878-b0c5-6b8734353119-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "de90f9c1-7173-4878-b0c5-6b8734353119" (UID: "de90f9c1-7173-4878-b0c5-6b8734353119"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.238199 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de90f9c1-7173-4878-b0c5-6b8734353119-config" (OuterVolumeSpecName: "config") pod "de90f9c1-7173-4878-b0c5-6b8734353119" (UID: "de90f9c1-7173-4878-b0c5-6b8734353119"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.240489 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56d84fd1-be79-439a-a63e-349430395229-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "56d84fd1-be79-439a-a63e-349430395229" (UID: "56d84fd1-be79-439a-a63e-349430395229"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.287487 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hs8th\" (UniqueName: \"kubernetes.io/projected/de90f9c1-7173-4878-b0c5-6b8734353119-kube-api-access-hs8th\") on node \"crc\" DevicePath \"\"" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.287516 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4bkg\" (UniqueName: \"kubernetes.io/projected/56d84fd1-be79-439a-a63e-349430395229-kube-api-access-m4bkg\") on node \"crc\" DevicePath \"\"" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.287527 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56d84fd1-be79-439a-a63e-349430395229-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.287537 4739 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de90f9c1-7173-4878-b0c5-6b8734353119-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.287547 4739 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56d84fd1-be79-439a-a63e-349430395229-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.287555 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de90f9c1-7173-4878-b0c5-6b8734353119-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.405689 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mj9gb" event={"ID":"0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa","Type":"ContainerStarted","Data":"b014d3a95294f87f7fdc66365fe96c942262b3b40a62f157f9080ff050ee6828"} Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 
22:06:37.406331 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-mj9gb" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.412935 4739 generic.go:334] "Generic (PLEG): container finished" podID="56d84fd1-be79-439a-a63e-349430395229" containerID="c21a90e6a1bce26e5a3d934e3c723c88d3e1d7b52a5257413ac2fb37d409b071" exitCode=0 Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.412998 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-tknt8" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.413091 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-tknt8" event={"ID":"56d84fd1-be79-439a-a63e-349430395229","Type":"ContainerDied","Data":"c21a90e6a1bce26e5a3d934e3c723c88d3e1d7b52a5257413ac2fb37d409b071"} Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.413134 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-tknt8" event={"ID":"56d84fd1-be79-439a-a63e-349430395229","Type":"ContainerDied","Data":"c4756d0264b1c2a2d903a746ad335a2379c72ed3204f7c6225232bb69d482c14"} Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.413305 4739 scope.go:117] "RemoveContainer" containerID="c21a90e6a1bce26e5a3d934e3c723c88d3e1d7b52a5257413ac2fb37d409b071" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.422540 4739 generic.go:334] "Generic (PLEG): container finished" podID="1233f9c5-10c7-4244-98ec-17d6eb4c1c15" containerID="b84b1476242c41664567a51365deac7cad53b42d80c5d682a46b4db4a12dbd4b" exitCode=0 Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.422632 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-pkhdx" event={"ID":"1233f9c5-10c7-4244-98ec-17d6eb4c1c15","Type":"ContainerDied","Data":"b84b1476242c41664567a51365deac7cad53b42d80c5d682a46b4db4a12dbd4b"} Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 
22:06:37.422661 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-pkhdx" event={"ID":"1233f9c5-10c7-4244-98ec-17d6eb4c1c15","Type":"ContainerStarted","Data":"66a1029b42cc6eb9f14e1939109414a943372b4c57f8e819cc854bf73eb99320"} Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.436103 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-mj9gb" podStartSLOduration=6.703194896 podStartE2EDuration="19.436079743s" podCreationTimestamp="2025-10-08 22:06:18 +0000 UTC" firstStartedPulling="2025-10-08 22:06:22.914780837 +0000 UTC m=+1082.740166587" lastFinishedPulling="2025-10-08 22:06:35.647665674 +0000 UTC m=+1095.473051434" observedRunningTime="2025-10-08 22:06:37.429807339 +0000 UTC m=+1097.255193099" watchObservedRunningTime="2025-10-08 22:06:37.436079743 +0000 UTC m=+1097.261465493" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.436733 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-92kk7" event={"ID":"eacfa01f-eb31-40c2-a163-3356c30772e3","Type":"ContainerDied","Data":"fbf4f5ae95674ab7bfe15f1bdb8d3a0c8cacc259f4936911ebf43aeb50fb1af7"} Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.436700 4739 generic.go:334] "Generic (PLEG): container finished" podID="eacfa01f-eb31-40c2-a163-3356c30772e3" containerID="fbf4f5ae95674ab7bfe15f1bdb8d3a0c8cacc259f4936911ebf43aeb50fb1af7" exitCode=0 Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.441471 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e2532902-2058-4c79-b612-fd2737190f3e","Type":"ContainerStarted","Data":"f12aa79e8afde836e91d3d6bbfd4caea2715e44b400ad6374236b55b13f565f7"} Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.444226 4739 generic.go:334] "Generic (PLEG): container finished" podID="9452bca1-f672-4a2f-b979-a6b47ea051b8" 
containerID="94056f24dd8270a6f5a831f77c299a582cdae0c5e8d9b0e0343e88d8f3c1514f" exitCode=0 Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.444283 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-njps4" event={"ID":"9452bca1-f672-4a2f-b979-a6b47ea051b8","Type":"ContainerDied","Data":"94056f24dd8270a6f5a831f77c299a582cdae0c5e8d9b0e0343e88d8f3c1514f"} Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.444306 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-njps4" event={"ID":"9452bca1-f672-4a2f-b979-a6b47ea051b8","Type":"ContainerStarted","Data":"770515c4be3b4ccff1034bec2cb8fb32091fff738624cda5a827dad914dd84f4"} Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.453677 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2909f95b-c276-43d0-93c0-18a78dbb974f","Type":"ContainerStarted","Data":"0f3d74649a0a14550547d1e1b433c90cc6ac77609fefff35fec33655f62131a6"} Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.455874 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"17a6aba1-44fd-4b83-95b2-002a60e2291b","Type":"ContainerStarted","Data":"8461e62c94cfe5fa13ef8ece05f1f0dd7b1bea2f3dd5a46cf4358cc585e53964"} Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.460322 4739 generic.go:334] "Generic (PLEG): container finished" podID="de90f9c1-7173-4878-b0c5-6b8734353119" containerID="0bd4436ddd8d8e8f5c258ac9d24d0d825c51dc11bfe7121e11219c9f4ad9002a" exitCode=0 Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.460376 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-p5rwg" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.460372 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-p5rwg" event={"ID":"de90f9c1-7173-4878-b0c5-6b8734353119","Type":"ContainerDied","Data":"0bd4436ddd8d8e8f5c258ac9d24d0d825c51dc11bfe7121e11219c9f4ad9002a"} Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.460554 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-p5rwg" event={"ID":"de90f9c1-7173-4878-b0c5-6b8734353119","Type":"ContainerDied","Data":"8eb0afdbd469cc4329747b6116003ed7593943692706cdaf7b4efaa38d9126da"} Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.492331 4739 scope.go:117] "RemoveContainer" containerID="a686208cacdcadf8924b2ee3fdd53fa07cc995aca1dfa4d2739d7712d029564a" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.496968 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-tknt8"] Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.502353 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-tknt8"] Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.570168 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-p5rwg"] Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.585364 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-p5rwg"] Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.654925 4739 scope.go:117] "RemoveContainer" containerID="c21a90e6a1bce26e5a3d934e3c723c88d3e1d7b52a5257413ac2fb37d409b071" Oct 08 22:06:37 crc kubenswrapper[4739]: E1008 22:06:37.655862 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c21a90e6a1bce26e5a3d934e3c723c88d3e1d7b52a5257413ac2fb37d409b071\": container with ID starting with 
c21a90e6a1bce26e5a3d934e3c723c88d3e1d7b52a5257413ac2fb37d409b071 not found: ID does not exist" containerID="c21a90e6a1bce26e5a3d934e3c723c88d3e1d7b52a5257413ac2fb37d409b071" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.655933 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c21a90e6a1bce26e5a3d934e3c723c88d3e1d7b52a5257413ac2fb37d409b071"} err="failed to get container status \"c21a90e6a1bce26e5a3d934e3c723c88d3e1d7b52a5257413ac2fb37d409b071\": rpc error: code = NotFound desc = could not find container \"c21a90e6a1bce26e5a3d934e3c723c88d3e1d7b52a5257413ac2fb37d409b071\": container with ID starting with c21a90e6a1bce26e5a3d934e3c723c88d3e1d7b52a5257413ac2fb37d409b071 not found: ID does not exist" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.655969 4739 scope.go:117] "RemoveContainer" containerID="a686208cacdcadf8924b2ee3fdd53fa07cc995aca1dfa4d2739d7712d029564a" Oct 08 22:06:37 crc kubenswrapper[4739]: E1008 22:06:37.658186 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a686208cacdcadf8924b2ee3fdd53fa07cc995aca1dfa4d2739d7712d029564a\": container with ID starting with a686208cacdcadf8924b2ee3fdd53fa07cc995aca1dfa4d2739d7712d029564a not found: ID does not exist" containerID="a686208cacdcadf8924b2ee3fdd53fa07cc995aca1dfa4d2739d7712d029564a" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.658218 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a686208cacdcadf8924b2ee3fdd53fa07cc995aca1dfa4d2739d7712d029564a"} err="failed to get container status \"a686208cacdcadf8924b2ee3fdd53fa07cc995aca1dfa4d2739d7712d029564a\": rpc error: code = NotFound desc = could not find container \"a686208cacdcadf8924b2ee3fdd53fa07cc995aca1dfa4d2739d7712d029564a\": container with ID starting with a686208cacdcadf8924b2ee3fdd53fa07cc995aca1dfa4d2739d7712d029564a not found: ID does not 
exist" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.658253 4739 scope.go:117] "RemoveContainer" containerID="0bd4436ddd8d8e8f5c258ac9d24d0d825c51dc11bfe7121e11219c9f4ad9002a" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.729115 4739 scope.go:117] "RemoveContainer" containerID="e7a7bc13f41e1c259ca06242c223ae48fe46eae39e564c757d7bebd2ad69898a" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.751491 4739 scope.go:117] "RemoveContainer" containerID="0bd4436ddd8d8e8f5c258ac9d24d0d825c51dc11bfe7121e11219c9f4ad9002a" Oct 08 22:06:37 crc kubenswrapper[4739]: E1008 22:06:37.751810 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bd4436ddd8d8e8f5c258ac9d24d0d825c51dc11bfe7121e11219c9f4ad9002a\": container with ID starting with 0bd4436ddd8d8e8f5c258ac9d24d0d825c51dc11bfe7121e11219c9f4ad9002a not found: ID does not exist" containerID="0bd4436ddd8d8e8f5c258ac9d24d0d825c51dc11bfe7121e11219c9f4ad9002a" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.751842 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bd4436ddd8d8e8f5c258ac9d24d0d825c51dc11bfe7121e11219c9f4ad9002a"} err="failed to get container status \"0bd4436ddd8d8e8f5c258ac9d24d0d825c51dc11bfe7121e11219c9f4ad9002a\": rpc error: code = NotFound desc = could not find container \"0bd4436ddd8d8e8f5c258ac9d24d0d825c51dc11bfe7121e11219c9f4ad9002a\": container with ID starting with 0bd4436ddd8d8e8f5c258ac9d24d0d825c51dc11bfe7121e11219c9f4ad9002a not found: ID does not exist" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.751871 4739 scope.go:117] "RemoveContainer" containerID="e7a7bc13f41e1c259ca06242c223ae48fe46eae39e564c757d7bebd2ad69898a" Oct 08 22:06:37 crc kubenswrapper[4739]: E1008 22:06:37.754262 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e7a7bc13f41e1c259ca06242c223ae48fe46eae39e564c757d7bebd2ad69898a\": container with ID starting with e7a7bc13f41e1c259ca06242c223ae48fe46eae39e564c757d7bebd2ad69898a not found: ID does not exist" containerID="e7a7bc13f41e1c259ca06242c223ae48fe46eae39e564c757d7bebd2ad69898a" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.754291 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7a7bc13f41e1c259ca06242c223ae48fe46eae39e564c757d7bebd2ad69898a"} err="failed to get container status \"e7a7bc13f41e1c259ca06242c223ae48fe46eae39e564c757d7bebd2ad69898a\": rpc error: code = NotFound desc = could not find container \"e7a7bc13f41e1c259ca06242c223ae48fe46eae39e564c757d7bebd2ad69898a\": container with ID starting with e7a7bc13f41e1c259ca06242c223ae48fe46eae39e564c757d7bebd2ad69898a not found: ID does not exist" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.839890 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56d84fd1-be79-439a-a63e-349430395229" path="/var/lib/kubelet/pods/56d84fd1-be79-439a-a63e-349430395229/volumes" Oct 08 22:06:37 crc kubenswrapper[4739]: I1008 22:06:37.840808 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de90f9c1-7173-4878-b0c5-6b8734353119" path="/var/lib/kubelet/pods/de90f9c1-7173-4878-b0c5-6b8734353119/volumes" Oct 08 22:06:38 crc kubenswrapper[4739]: I1008 22:06:38.476035 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-92kk7" event={"ID":"eacfa01f-eb31-40c2-a163-3356c30772e3","Type":"ContainerStarted","Data":"443ce73d12644c9b365de0d35a841bb78553a2968192bc933fe077dd5dd8e105"} Oct 08 22:06:38 crc kubenswrapper[4739]: I1008 22:06:38.476589 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-92kk7" event={"ID":"eacfa01f-eb31-40c2-a163-3356c30772e3","Type":"ContainerStarted","Data":"b04db4dc0a63cad6fde6fdbecccfdb547e8e2d64165df1658de7d978cef1be1c"} Oct 08 
22:06:38 crc kubenswrapper[4739]: I1008 22:06:38.476608 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-92kk7" Oct 08 22:06:38 crc kubenswrapper[4739]: I1008 22:06:38.483634 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-njps4" event={"ID":"9452bca1-f672-4a2f-b979-a6b47ea051b8","Type":"ContainerStarted","Data":"36b8d6260d16d61ed6cec4428c3517e33c8a43d0c31f6a9e1f48137946447ff8"} Oct 08 22:06:38 crc kubenswrapper[4739]: I1008 22:06:38.483739 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-njps4" Oct 08 22:06:38 crc kubenswrapper[4739]: I1008 22:06:38.485764 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-pkhdx" event={"ID":"1233f9c5-10c7-4244-98ec-17d6eb4c1c15","Type":"ContainerStarted","Data":"cdc62cd6e2f8a63b8ea2e4523f3a82618d85b3d6da312992da59a28adb3aa3d6"} Oct 08 22:06:38 crc kubenswrapper[4739]: I1008 22:06:38.499013 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-92kk7" podStartSLOduration=8.066372594 podStartE2EDuration="20.498994547s" podCreationTimestamp="2025-10-08 22:06:18 +0000 UTC" firstStartedPulling="2025-10-08 22:06:23.024855128 +0000 UTC m=+1082.850240878" lastFinishedPulling="2025-10-08 22:06:35.457477081 +0000 UTC m=+1095.282862831" observedRunningTime="2025-10-08 22:06:38.494627769 +0000 UTC m=+1098.320013589" watchObservedRunningTime="2025-10-08 22:06:38.498994547 +0000 UTC m=+1098.324380297" Oct 08 22:06:38 crc kubenswrapper[4739]: I1008 22:06:38.527292 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-pkhdx" podStartSLOduration=8.527274473 podStartE2EDuration="8.527274473s" podCreationTimestamp="2025-10-08 22:06:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-10-08 22:06:38.524661069 +0000 UTC m=+1098.350046859" watchObservedRunningTime="2025-10-08 22:06:38.527274473 +0000 UTC m=+1098.352660223" Oct 08 22:06:38 crc kubenswrapper[4739]: I1008 22:06:38.547006 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-njps4" podStartSLOduration=8.546995558999999 podStartE2EDuration="8.546995559s" podCreationTimestamp="2025-10-08 22:06:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:06:38.54581704 +0000 UTC m=+1098.371202820" watchObservedRunningTime="2025-10-08 22:06:38.546995559 +0000 UTC m=+1098.372381309" Oct 08 22:06:38 crc kubenswrapper[4739]: I1008 22:06:38.608622 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-92kk7" Oct 08 22:06:39 crc kubenswrapper[4739]: I1008 22:06:39.491996 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-pkhdx" Oct 08 22:06:40 crc kubenswrapper[4739]: I1008 22:06:40.502968 4739 generic.go:334] "Generic (PLEG): container finished" podID="e2532902-2058-4c79-b612-fd2737190f3e" containerID="f12aa79e8afde836e91d3d6bbfd4caea2715e44b400ad6374236b55b13f565f7" exitCode=0 Oct 08 22:06:40 crc kubenswrapper[4739]: I1008 22:06:40.503044 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e2532902-2058-4c79-b612-fd2737190f3e","Type":"ContainerDied","Data":"f12aa79e8afde836e91d3d6bbfd4caea2715e44b400ad6374236b55b13f565f7"} Oct 08 22:06:40 crc kubenswrapper[4739]: I1008 22:06:40.509397 4739 generic.go:334] "Generic (PLEG): container finished" podID="3c6fc5d3-c48a-4d83-97f8-38d56264d769" containerID="1966a0619f0b59e96377e0358cf458a7f122837bf1f71947f935b544162c4c5a" exitCode=0 Oct 08 22:06:40 crc kubenswrapper[4739]: I1008 22:06:40.509465 4739 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3c6fc5d3-c48a-4d83-97f8-38d56264d769","Type":"ContainerDied","Data":"1966a0619f0b59e96377e0358cf458a7f122837bf1f71947f935b544162c4c5a"} Oct 08 22:06:41 crc kubenswrapper[4739]: I1008 22:06:41.528685 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e2532902-2058-4c79-b612-fd2737190f3e","Type":"ContainerStarted","Data":"34d0c0b3578a92e782a5cbb016a3ab6bf1aa966d9c25c6c911a6a4c58b419fde"} Oct 08 22:06:41 crc kubenswrapper[4739]: I1008 22:06:41.533197 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3c6fc5d3-c48a-4d83-97f8-38d56264d769","Type":"ContainerStarted","Data":"20d34b006a37754334c68e6add249d1e1aff231d0f1d38c381e7f2fc99fca377"} Oct 08 22:06:41 crc kubenswrapper[4739]: I1008 22:06:41.537755 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"00537745-c30b-4fa9-be09-0edb09ff7138","Type":"ContainerStarted","Data":"0531f21c1985e5976fbd5f02039a3e6abfc97f5799f8ca76c9107aa294e08a30"} Oct 08 22:06:41 crc kubenswrapper[4739]: I1008 22:06:41.541104 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"15d5a814-0c23-4e0f-b750-9f886dc130b6","Type":"ContainerStarted","Data":"3077337ee1d97cd50da3be007ab0229e5f4fa3e40f309e814f6a6e1383e7ee1a"} Oct 08 22:06:41 crc kubenswrapper[4739]: I1008 22:06:41.543681 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-hld4g" event={"ID":"bc4ea068-4061-435b-8e62-11b14a3e1ec4","Type":"ContainerStarted","Data":"394594dfa2b0cab3d52bedaadcd230256e6840682c27103fc075453ec229179a"} Oct 08 22:06:41 crc kubenswrapper[4739]: I1008 22:06:41.569547 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=18.188906949 podStartE2EDuration="31.569517767s" 
podCreationTimestamp="2025-10-08 22:06:10 +0000 UTC" firstStartedPulling="2025-10-08 22:06:22.385424852 +0000 UTC m=+1082.210810602" lastFinishedPulling="2025-10-08 22:06:35.76603567 +0000 UTC m=+1095.591421420" observedRunningTime="2025-10-08 22:06:41.567124848 +0000 UTC m=+1101.392510638" watchObservedRunningTime="2025-10-08 22:06:41.569517767 +0000 UTC m=+1101.394903567" Oct 08 22:06:41 crc kubenswrapper[4739]: I1008 22:06:41.619189 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=8.007090549 podStartE2EDuration="24.619113018s" podCreationTimestamp="2025-10-08 22:06:17 +0000 UTC" firstStartedPulling="2025-10-08 22:06:23.723743447 +0000 UTC m=+1083.549129197" lastFinishedPulling="2025-10-08 22:06:40.335765886 +0000 UTC m=+1100.161151666" observedRunningTime="2025-10-08 22:06:41.611554322 +0000 UTC m=+1101.436940102" watchObservedRunningTime="2025-10-08 22:06:41.619113018 +0000 UTC m=+1101.444498808" Oct 08 22:06:41 crc kubenswrapper[4739]: I1008 22:06:41.652682 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=6.193445038 podStartE2EDuration="22.652637473s" podCreationTimestamp="2025-10-08 22:06:19 +0000 UTC" firstStartedPulling="2025-10-08 22:06:23.859775447 +0000 UTC m=+1083.685161197" lastFinishedPulling="2025-10-08 22:06:40.318967882 +0000 UTC m=+1100.144353632" observedRunningTime="2025-10-08 22:06:41.640465673 +0000 UTC m=+1101.465851443" watchObservedRunningTime="2025-10-08 22:06:41.652637473 +0000 UTC m=+1101.478023243" Oct 08 22:06:41 crc kubenswrapper[4739]: I1008 22:06:41.672456 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-hld4g" podStartSLOduration=8.606248983 podStartE2EDuration="12.672432681s" podCreationTimestamp="2025-10-08 22:06:29 +0000 UTC" firstStartedPulling="2025-10-08 22:06:36.252869717 +0000 UTC m=+1096.078255467" 
lastFinishedPulling="2025-10-08 22:06:40.319053375 +0000 UTC m=+1100.144439165" observedRunningTime="2025-10-08 22:06:41.66632094 +0000 UTC m=+1101.491706730" watchObservedRunningTime="2025-10-08 22:06:41.672432681 +0000 UTC m=+1101.497818461" Oct 08 22:06:41 crc kubenswrapper[4739]: I1008 22:06:41.697218 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=18.736427122 podStartE2EDuration="31.69718545s" podCreationTimestamp="2025-10-08 22:06:10 +0000 UTC" firstStartedPulling="2025-10-08 22:06:22.743222873 +0000 UTC m=+1082.568608643" lastFinishedPulling="2025-10-08 22:06:35.703981221 +0000 UTC m=+1095.529366971" observedRunningTime="2025-10-08 22:06:41.693400887 +0000 UTC m=+1101.518786657" watchObservedRunningTime="2025-10-08 22:06:41.69718545 +0000 UTC m=+1101.522571210" Oct 08 22:06:41 crc kubenswrapper[4739]: I1008 22:06:41.756104 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 08 22:06:41 crc kubenswrapper[4739]: I1008 22:06:41.760259 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 08 22:06:42 crc kubenswrapper[4739]: I1008 22:06:42.170021 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:42 crc kubenswrapper[4739]: I1008 22:06:42.170080 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:42 crc kubenswrapper[4739]: I1008 22:06:42.819370 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 08 22:06:43 crc kubenswrapper[4739]: I1008 22:06:43.072930 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 08 22:06:43 crc kubenswrapper[4739]: I1008 22:06:43.139557 4739 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 08 22:06:43 crc kubenswrapper[4739]: I1008 22:06:43.558322 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 08 22:06:43 crc kubenswrapper[4739]: I1008 22:06:43.597911 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.468056 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.492410 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-njps4"] Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.492627 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-njps4" podUID="9452bca1-f672-4a2f-b979-a6b47ea051b8" containerName="dnsmasq-dns" containerID="cri-o://36b8d6260d16d61ed6cec4428c3517e33c8a43d0c31f6a9e1f48137946447ff8" gracePeriod=10 Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.496377 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.497627 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-njps4" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.556199 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-mbp55"] Oct 08 22:06:44 crc kubenswrapper[4739]: E1008 22:06:44.556991 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de90f9c1-7173-4878-b0c5-6b8734353119" containerName="dnsmasq-dns" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.557011 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="de90f9c1-7173-4878-b0c5-6b8734353119" 
containerName="dnsmasq-dns" Oct 08 22:06:44 crc kubenswrapper[4739]: E1008 22:06:44.557022 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56d84fd1-be79-439a-a63e-349430395229" containerName="dnsmasq-dns" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.557029 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="56d84fd1-be79-439a-a63e-349430395229" containerName="dnsmasq-dns" Oct 08 22:06:44 crc kubenswrapper[4739]: E1008 22:06:44.557047 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56d84fd1-be79-439a-a63e-349430395229" containerName="init" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.557054 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="56d84fd1-be79-439a-a63e-349430395229" containerName="init" Oct 08 22:06:44 crc kubenswrapper[4739]: E1008 22:06:44.557070 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de90f9c1-7173-4878-b0c5-6b8734353119" containerName="init" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.557075 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="de90f9c1-7173-4878-b0c5-6b8734353119" containerName="init" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.557235 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="56d84fd1-be79-439a-a63e-349430395229" containerName="dnsmasq-dns" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.557255 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="de90f9c1-7173-4878-b0c5-6b8734353119" containerName="dnsmasq-dns" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.559068 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-mbp55" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.567760 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-mbp55"] Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.574281 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.574640 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.644763 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.659624 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad5158ad-8718-400f-9d89-a28f901a953f-config\") pod \"dnsmasq-dns-698758b865-mbp55\" (UID: \"ad5158ad-8718-400f-9d89-a28f901a953f\") " pod="openstack/dnsmasq-dns-698758b865-mbp55" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.659692 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad5158ad-8718-400f-9d89-a28f901a953f-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-mbp55\" (UID: \"ad5158ad-8718-400f-9d89-a28f901a953f\") " pod="openstack/dnsmasq-dns-698758b865-mbp55" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.659713 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad5158ad-8718-400f-9d89-a28f901a953f-dns-svc\") pod \"dnsmasq-dns-698758b865-mbp55\" (UID: \"ad5158ad-8718-400f-9d89-a28f901a953f\") " pod="openstack/dnsmasq-dns-698758b865-mbp55" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.659779 
4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6snc\" (UniqueName: \"kubernetes.io/projected/ad5158ad-8718-400f-9d89-a28f901a953f-kube-api-access-v6snc\") pod \"dnsmasq-dns-698758b865-mbp55\" (UID: \"ad5158ad-8718-400f-9d89-a28f901a953f\") " pod="openstack/dnsmasq-dns-698758b865-mbp55" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.659809 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad5158ad-8718-400f-9d89-a28f901a953f-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-mbp55\" (UID: \"ad5158ad-8718-400f-9d89-a28f901a953f\") " pod="openstack/dnsmasq-dns-698758b865-mbp55" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.762395 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad5158ad-8718-400f-9d89-a28f901a953f-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-mbp55\" (UID: \"ad5158ad-8718-400f-9d89-a28f901a953f\") " pod="openstack/dnsmasq-dns-698758b865-mbp55" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.763036 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad5158ad-8718-400f-9d89-a28f901a953f-dns-svc\") pod \"dnsmasq-dns-698758b865-mbp55\" (UID: \"ad5158ad-8718-400f-9d89-a28f901a953f\") " pod="openstack/dnsmasq-dns-698758b865-mbp55" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.763357 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6snc\" (UniqueName: \"kubernetes.io/projected/ad5158ad-8718-400f-9d89-a28f901a953f-kube-api-access-v6snc\") pod \"dnsmasq-dns-698758b865-mbp55\" (UID: \"ad5158ad-8718-400f-9d89-a28f901a953f\") " pod="openstack/dnsmasq-dns-698758b865-mbp55" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.764235 
4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad5158ad-8718-400f-9d89-a28f901a953f-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-mbp55\" (UID: \"ad5158ad-8718-400f-9d89-a28f901a953f\") " pod="openstack/dnsmasq-dns-698758b865-mbp55" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.764927 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad5158ad-8718-400f-9d89-a28f901a953f-config\") pod \"dnsmasq-dns-698758b865-mbp55\" (UID: \"ad5158ad-8718-400f-9d89-a28f901a953f\") " pod="openstack/dnsmasq-dns-698758b865-mbp55" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.763954 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad5158ad-8718-400f-9d89-a28f901a953f-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-mbp55\" (UID: \"ad5158ad-8718-400f-9d89-a28f901a953f\") " pod="openstack/dnsmasq-dns-698758b865-mbp55" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.764839 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad5158ad-8718-400f-9d89-a28f901a953f-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-mbp55\" (UID: \"ad5158ad-8718-400f-9d89-a28f901a953f\") " pod="openstack/dnsmasq-dns-698758b865-mbp55" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.764196 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad5158ad-8718-400f-9d89-a28f901a953f-dns-svc\") pod \"dnsmasq-dns-698758b865-mbp55\" (UID: \"ad5158ad-8718-400f-9d89-a28f901a953f\") " pod="openstack/dnsmasq-dns-698758b865-mbp55" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.765747 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ad5158ad-8718-400f-9d89-a28f901a953f-config\") pod \"dnsmasq-dns-698758b865-mbp55\" (UID: \"ad5158ad-8718-400f-9d89-a28f901a953f\") " pod="openstack/dnsmasq-dns-698758b865-mbp55" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.797992 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6snc\" (UniqueName: \"kubernetes.io/projected/ad5158ad-8718-400f-9d89-a28f901a953f-kube-api-access-v6snc\") pod \"dnsmasq-dns-698758b865-mbp55\" (UID: \"ad5158ad-8718-400f-9d89-a28f901a953f\") " pod="openstack/dnsmasq-dns-698758b865-mbp55" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.799387 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.800572 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.804024 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.804352 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.807358 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-hksp6" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.807718 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.814277 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.868092 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9c19423c-cec2-4fbf-b2bf-97a99db03043-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"9c19423c-cec2-4fbf-b2bf-97a99db03043\") " pod="openstack/ovn-northd-0" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.868165 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c19423c-cec2-4fbf-b2bf-97a99db03043-scripts\") pod \"ovn-northd-0\" (UID: \"9c19423c-cec2-4fbf-b2bf-97a99db03043\") " pod="openstack/ovn-northd-0" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.868188 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpkzh\" (UniqueName: \"kubernetes.io/projected/9c19423c-cec2-4fbf-b2bf-97a99db03043-kube-api-access-kpkzh\") pod \"ovn-northd-0\" (UID: \"9c19423c-cec2-4fbf-b2bf-97a99db03043\") " pod="openstack/ovn-northd-0" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.868213 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c19423c-cec2-4fbf-b2bf-97a99db03043-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"9c19423c-cec2-4fbf-b2bf-97a99db03043\") " pod="openstack/ovn-northd-0" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.868230 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c19423c-cec2-4fbf-b2bf-97a99db03043-config\") pod \"ovn-northd-0\" (UID: \"9c19423c-cec2-4fbf-b2bf-97a99db03043\") " pod="openstack/ovn-northd-0" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.868378 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9c19423c-cec2-4fbf-b2bf-97a99db03043-ovn-rundir\") pod \"ovn-northd-0\" (UID: 
\"9c19423c-cec2-4fbf-b2bf-97a99db03043\") " pod="openstack/ovn-northd-0" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.868404 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c19423c-cec2-4fbf-b2bf-97a99db03043-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"9c19423c-cec2-4fbf-b2bf-97a99db03043\") " pod="openstack/ovn-northd-0" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.962486 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-mbp55" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.969774 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c19423c-cec2-4fbf-b2bf-97a99db03043-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"9c19423c-cec2-4fbf-b2bf-97a99db03043\") " pod="openstack/ovn-northd-0" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.969818 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c19423c-cec2-4fbf-b2bf-97a99db03043-scripts\") pod \"ovn-northd-0\" (UID: \"9c19423c-cec2-4fbf-b2bf-97a99db03043\") " pod="openstack/ovn-northd-0" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.969835 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpkzh\" (UniqueName: \"kubernetes.io/projected/9c19423c-cec2-4fbf-b2bf-97a99db03043-kube-api-access-kpkzh\") pod \"ovn-northd-0\" (UID: \"9c19423c-cec2-4fbf-b2bf-97a99db03043\") " pod="openstack/ovn-northd-0" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.969859 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c19423c-cec2-4fbf-b2bf-97a99db03043-combined-ca-bundle\") pod 
\"ovn-northd-0\" (UID: \"9c19423c-cec2-4fbf-b2bf-97a99db03043\") " pod="openstack/ovn-northd-0" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.969877 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c19423c-cec2-4fbf-b2bf-97a99db03043-config\") pod \"ovn-northd-0\" (UID: \"9c19423c-cec2-4fbf-b2bf-97a99db03043\") " pod="openstack/ovn-northd-0" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.969953 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9c19423c-cec2-4fbf-b2bf-97a99db03043-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9c19423c-cec2-4fbf-b2bf-97a99db03043\") " pod="openstack/ovn-northd-0" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.969976 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c19423c-cec2-4fbf-b2bf-97a99db03043-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"9c19423c-cec2-4fbf-b2bf-97a99db03043\") " pod="openstack/ovn-northd-0" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.971494 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9c19423c-cec2-4fbf-b2bf-97a99db03043-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9c19423c-cec2-4fbf-b2bf-97a99db03043\") " pod="openstack/ovn-northd-0" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.971896 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c19423c-cec2-4fbf-b2bf-97a99db03043-config\") pod \"ovn-northd-0\" (UID: \"9c19423c-cec2-4fbf-b2bf-97a99db03043\") " pod="openstack/ovn-northd-0" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.972051 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/9c19423c-cec2-4fbf-b2bf-97a99db03043-scripts\") pod \"ovn-northd-0\" (UID: \"9c19423c-cec2-4fbf-b2bf-97a99db03043\") " pod="openstack/ovn-northd-0" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.974574 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c19423c-cec2-4fbf-b2bf-97a99db03043-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"9c19423c-cec2-4fbf-b2bf-97a99db03043\") " pod="openstack/ovn-northd-0" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.975537 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c19423c-cec2-4fbf-b2bf-97a99db03043-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"9c19423c-cec2-4fbf-b2bf-97a99db03043\") " pod="openstack/ovn-northd-0" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.978356 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c19423c-cec2-4fbf-b2bf-97a99db03043-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"9c19423c-cec2-4fbf-b2bf-97a99db03043\") " pod="openstack/ovn-northd-0" Oct 08 22:06:44 crc kubenswrapper[4739]: I1008 22:06:44.993123 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpkzh\" (UniqueName: \"kubernetes.io/projected/9c19423c-cec2-4fbf-b2bf-97a99db03043-kube-api-access-kpkzh\") pod \"ovn-northd-0\" (UID: \"9c19423c-cec2-4fbf-b2bf-97a99db03043\") " pod="openstack/ovn-northd-0" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.134845 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-njps4" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.154002 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.278454 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9452bca1-f672-4a2f-b979-a6b47ea051b8-dns-svc\") pod \"9452bca1-f672-4a2f-b979-a6b47ea051b8\" (UID: \"9452bca1-f672-4a2f-b979-a6b47ea051b8\") " Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.278844 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9452bca1-f672-4a2f-b979-a6b47ea051b8-ovsdbserver-nb\") pod \"9452bca1-f672-4a2f-b979-a6b47ea051b8\" (UID: \"9452bca1-f672-4a2f-b979-a6b47ea051b8\") " Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.278951 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9452bca1-f672-4a2f-b979-a6b47ea051b8-config\") pod \"9452bca1-f672-4a2f-b979-a6b47ea051b8\" (UID: \"9452bca1-f672-4a2f-b979-a6b47ea051b8\") " Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.279007 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7lwn\" (UniqueName: \"kubernetes.io/projected/9452bca1-f672-4a2f-b979-a6b47ea051b8-kube-api-access-k7lwn\") pod \"9452bca1-f672-4a2f-b979-a6b47ea051b8\" (UID: \"9452bca1-f672-4a2f-b979-a6b47ea051b8\") " Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.288719 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9452bca1-f672-4a2f-b979-a6b47ea051b8-kube-api-access-k7lwn" (OuterVolumeSpecName: "kube-api-access-k7lwn") pod "9452bca1-f672-4a2f-b979-a6b47ea051b8" (UID: "9452bca1-f672-4a2f-b979-a6b47ea051b8"). InnerVolumeSpecName "kube-api-access-k7lwn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.324274 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9452bca1-f672-4a2f-b979-a6b47ea051b8-config" (OuterVolumeSpecName: "config") pod "9452bca1-f672-4a2f-b979-a6b47ea051b8" (UID: "9452bca1-f672-4a2f-b979-a6b47ea051b8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.329128 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9452bca1-f672-4a2f-b979-a6b47ea051b8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9452bca1-f672-4a2f-b979-a6b47ea051b8" (UID: "9452bca1-f672-4a2f-b979-a6b47ea051b8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.332646 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9452bca1-f672-4a2f-b979-a6b47ea051b8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9452bca1-f672-4a2f-b979-a6b47ea051b8" (UID: "9452bca1-f672-4a2f-b979-a6b47ea051b8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.380372 4739 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9452bca1-f672-4a2f-b979-a6b47ea051b8-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.380404 4739 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9452bca1-f672-4a2f-b979-a6b47ea051b8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.380415 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9452bca1-f672-4a2f-b979-a6b47ea051b8-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.380424 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7lwn\" (UniqueName: \"kubernetes.io/projected/9452bca1-f672-4a2f-b979-a6b47ea051b8-kube-api-access-k7lwn\") on node \"crc\" DevicePath \"\"" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.449705 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-mbp55"] Oct 08 22:06:45 crc kubenswrapper[4739]: W1008 22:06:45.457843 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad5158ad_8718_400f_9d89_a28f901a953f.slice/crio-3cdb329208f223f940d2731ad98967539eeef1e85f9f9e613b7da9637ada0369 WatchSource:0}: Error finding container 3cdb329208f223f940d2731ad98967539eeef1e85f9f9e613b7da9637ada0369: Status 404 returned error can't find the container with id 3cdb329208f223f940d2731ad98967539eeef1e85f9f9e613b7da9637ada0369 Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.575404 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 08 22:06:45 crc 
kubenswrapper[4739]: I1008 22:06:45.580507 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-mbp55" event={"ID":"ad5158ad-8718-400f-9d89-a28f901a953f","Type":"ContainerStarted","Data":"3cdb329208f223f940d2731ad98967539eeef1e85f9f9e613b7da9637ada0369"} Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.584275 4739 generic.go:334] "Generic (PLEG): container finished" podID="9452bca1-f672-4a2f-b979-a6b47ea051b8" containerID="36b8d6260d16d61ed6cec4428c3517e33c8a43d0c31f6a9e1f48137946447ff8" exitCode=0 Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.584365 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-njps4" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.584470 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-njps4" event={"ID":"9452bca1-f672-4a2f-b979-a6b47ea051b8","Type":"ContainerDied","Data":"36b8d6260d16d61ed6cec4428c3517e33c8a43d0c31f6a9e1f48137946447ff8"} Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.584515 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-njps4" event={"ID":"9452bca1-f672-4a2f-b979-a6b47ea051b8","Type":"ContainerDied","Data":"770515c4be3b4ccff1034bec2cb8fb32091fff738624cda5a827dad914dd84f4"} Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.584539 4739 scope.go:117] "RemoveContainer" containerID="36b8d6260d16d61ed6cec4428c3517e33c8a43d0c31f6a9e1f48137946447ff8" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.606053 4739 scope.go:117] "RemoveContainer" containerID="94056f24dd8270a6f5a831f77c299a582cdae0c5e8d9b0e0343e88d8f3c1514f" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.607314 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-pkhdx" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.624653 4739 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-njps4"] Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.630212 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-njps4"] Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.647781 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 08 22:06:45 crc kubenswrapper[4739]: E1008 22:06:45.648114 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9452bca1-f672-4a2f-b979-a6b47ea051b8" containerName="init" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.648131 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="9452bca1-f672-4a2f-b979-a6b47ea051b8" containerName="init" Oct 08 22:06:45 crc kubenswrapper[4739]: E1008 22:06:45.648541 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9452bca1-f672-4a2f-b979-a6b47ea051b8" containerName="dnsmasq-dns" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.648577 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="9452bca1-f672-4a2f-b979-a6b47ea051b8" containerName="dnsmasq-dns" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.648786 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="9452bca1-f672-4a2f-b979-a6b47ea051b8" containerName="dnsmasq-dns" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.654888 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.656226 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-bsxnn" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.658047 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.658350 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.658481 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.698635 4739 scope.go:117] "RemoveContainer" containerID="36b8d6260d16d61ed6cec4428c3517e33c8a43d0c31f6a9e1f48137946447ff8" Oct 08 22:06:45 crc kubenswrapper[4739]: E1008 22:06:45.699261 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36b8d6260d16d61ed6cec4428c3517e33c8a43d0c31f6a9e1f48137946447ff8\": container with ID starting with 36b8d6260d16d61ed6cec4428c3517e33c8a43d0c31f6a9e1f48137946447ff8 not found: ID does not exist" containerID="36b8d6260d16d61ed6cec4428c3517e33c8a43d0c31f6a9e1f48137946447ff8" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.699331 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36b8d6260d16d61ed6cec4428c3517e33c8a43d0c31f6a9e1f48137946447ff8"} err="failed to get container status \"36b8d6260d16d61ed6cec4428c3517e33c8a43d0c31f6a9e1f48137946447ff8\": rpc error: code = NotFound desc = could not find container \"36b8d6260d16d61ed6cec4428c3517e33c8a43d0c31f6a9e1f48137946447ff8\": container with ID starting with 36b8d6260d16d61ed6cec4428c3517e33c8a43d0c31f6a9e1f48137946447ff8 not found: ID does not exist" Oct 08 22:06:45 crc kubenswrapper[4739]: 
I1008 22:06:45.699363 4739 scope.go:117] "RemoveContainer" containerID="94056f24dd8270a6f5a831f77c299a582cdae0c5e8d9b0e0343e88d8f3c1514f" Oct 08 22:06:45 crc kubenswrapper[4739]: E1008 22:06:45.699714 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94056f24dd8270a6f5a831f77c299a582cdae0c5e8d9b0e0343e88d8f3c1514f\": container with ID starting with 94056f24dd8270a6f5a831f77c299a582cdae0c5e8d9b0e0343e88d8f3c1514f not found: ID does not exist" containerID="94056f24dd8270a6f5a831f77c299a582cdae0c5e8d9b0e0343e88d8f3c1514f" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.699767 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94056f24dd8270a6f5a831f77c299a582cdae0c5e8d9b0e0343e88d8f3c1514f"} err="failed to get container status \"94056f24dd8270a6f5a831f77c299a582cdae0c5e8d9b0e0343e88d8f3c1514f\": rpc error: code = NotFound desc = could not find container \"94056f24dd8270a6f5a831f77c299a582cdae0c5e8d9b0e0343e88d8f3c1514f\": container with ID starting with 94056f24dd8270a6f5a831f77c299a582cdae0c5e8d9b0e0343e88d8f3c1514f not found: ID does not exist" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.719101 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.794672 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nphmx\" (UniqueName: \"kubernetes.io/projected/229f0c98-b6d6-415b-b34a-6ffcd2a0ed52-kube-api-access-nphmx\") pod \"swift-storage-0\" (UID: \"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52\") " pod="openstack/swift-storage-0" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.794768 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod 
\"swift-storage-0\" (UID: \"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52\") " pod="openstack/swift-storage-0" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.794994 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/229f0c98-b6d6-415b-b34a-6ffcd2a0ed52-lock\") pod \"swift-storage-0\" (UID: \"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52\") " pod="openstack/swift-storage-0" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.795278 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/229f0c98-b6d6-415b-b34a-6ffcd2a0ed52-etc-swift\") pod \"swift-storage-0\" (UID: \"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52\") " pod="openstack/swift-storage-0" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.795345 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/229f0c98-b6d6-415b-b34a-6ffcd2a0ed52-cache\") pod \"swift-storage-0\" (UID: \"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52\") " pod="openstack/swift-storage-0" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.832060 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9452bca1-f672-4a2f-b979-a6b47ea051b8" path="/var/lib/kubelet/pods/9452bca1-f672-4a2f-b979-a6b47ea051b8/volumes" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.837431 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.896953 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/229f0c98-b6d6-415b-b34a-6ffcd2a0ed52-cache\") pod \"swift-storage-0\" (UID: \"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52\") " pod="openstack/swift-storage-0" Oct 08 22:06:45 crc 
kubenswrapper[4739]: I1008 22:06:45.897072 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nphmx\" (UniqueName: \"kubernetes.io/projected/229f0c98-b6d6-415b-b34a-6ffcd2a0ed52-kube-api-access-nphmx\") pod \"swift-storage-0\" (UID: \"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52\") " pod="openstack/swift-storage-0" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.897216 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52\") " pod="openstack/swift-storage-0" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.897261 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/229f0c98-b6d6-415b-b34a-6ffcd2a0ed52-lock\") pod \"swift-storage-0\" (UID: \"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52\") " pod="openstack/swift-storage-0" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.897334 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/229f0c98-b6d6-415b-b34a-6ffcd2a0ed52-etc-swift\") pod \"swift-storage-0\" (UID: \"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52\") " pod="openstack/swift-storage-0" Oct 08 22:06:45 crc kubenswrapper[4739]: E1008 22:06:45.897461 4739 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 08 22:06:45 crc kubenswrapper[4739]: E1008 22:06:45.897480 4739 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 08 22:06:45 crc kubenswrapper[4739]: E1008 22:06:45.897532 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/229f0c98-b6d6-415b-b34a-6ffcd2a0ed52-etc-swift 
podName:229f0c98-b6d6-415b-b34a-6ffcd2a0ed52 nodeName:}" failed. No retries permitted until 2025-10-08 22:06:46.397514501 +0000 UTC m=+1106.222900251 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/229f0c98-b6d6-415b-b34a-6ffcd2a0ed52-etc-swift") pod "swift-storage-0" (UID: "229f0c98-b6d6-415b-b34a-6ffcd2a0ed52") : configmap "swift-ring-files" not found Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.897922 4739 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/swift-storage-0" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.898401 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/229f0c98-b6d6-415b-b34a-6ffcd2a0ed52-cache\") pod \"swift-storage-0\" (UID: \"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52\") " pod="openstack/swift-storage-0" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.898587 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/229f0c98-b6d6-415b-b34a-6ffcd2a0ed52-lock\") pod \"swift-storage-0\" (UID: \"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52\") " pod="openstack/swift-storage-0" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.899982 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 08 22:06:45 crc kubenswrapper[4739]: I1008 22:06:45.936420 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52\") " pod="openstack/swift-storage-0" Oct 08 22:06:45 
crc kubenswrapper[4739]: I1008 22:06:45.936838 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nphmx\" (UniqueName: \"kubernetes.io/projected/229f0c98-b6d6-415b-b34a-6ffcd2a0ed52-kube-api-access-nphmx\") pod \"swift-storage-0\" (UID: \"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52\") " pod="openstack/swift-storage-0" Oct 08 22:06:46 crc kubenswrapper[4739]: I1008 22:06:46.408535 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/229f0c98-b6d6-415b-b34a-6ffcd2a0ed52-etc-swift\") pod \"swift-storage-0\" (UID: \"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52\") " pod="openstack/swift-storage-0" Oct 08 22:06:46 crc kubenswrapper[4739]: E1008 22:06:46.408788 4739 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 08 22:06:46 crc kubenswrapper[4739]: E1008 22:06:46.408807 4739 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 08 22:06:46 crc kubenswrapper[4739]: E1008 22:06:46.408884 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/229f0c98-b6d6-415b-b34a-6ffcd2a0ed52-etc-swift podName:229f0c98-b6d6-415b-b34a-6ffcd2a0ed52 nodeName:}" failed. No retries permitted until 2025-10-08 22:06:47.408866803 +0000 UTC m=+1107.234252553 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/229f0c98-b6d6-415b-b34a-6ffcd2a0ed52-etc-swift") pod "swift-storage-0" (UID: "229f0c98-b6d6-415b-b34a-6ffcd2a0ed52") : configmap "swift-ring-files" not found Oct 08 22:06:46 crc kubenswrapper[4739]: I1008 22:06:46.596795 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"9c19423c-cec2-4fbf-b2bf-97a99db03043","Type":"ContainerStarted","Data":"1c128f4035bc1ffbabe6a7dfb079902b9c2375f9169cae09071cf192876ea896"} Oct 08 22:06:46 crc kubenswrapper[4739]: I1008 22:06:46.598580 4739 generic.go:334] "Generic (PLEG): container finished" podID="ad5158ad-8718-400f-9d89-a28f901a953f" containerID="b31876e524354f05a13e08ef5259abfe85908bb2e843d0e15c951f631624fb71" exitCode=0 Oct 08 22:06:46 crc kubenswrapper[4739]: I1008 22:06:46.598724 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-mbp55" event={"ID":"ad5158ad-8718-400f-9d89-a28f901a953f","Type":"ContainerDied","Data":"b31876e524354f05a13e08ef5259abfe85908bb2e843d0e15c951f631624fb71"} Oct 08 22:06:47 crc kubenswrapper[4739]: I1008 22:06:47.427346 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/229f0c98-b6d6-415b-b34a-6ffcd2a0ed52-etc-swift\") pod \"swift-storage-0\" (UID: \"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52\") " pod="openstack/swift-storage-0" Oct 08 22:06:47 crc kubenswrapper[4739]: E1008 22:06:47.427609 4739 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 08 22:06:47 crc kubenswrapper[4739]: E1008 22:06:47.427941 4739 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 08 22:06:47 crc kubenswrapper[4739]: E1008 22:06:47.428008 4739 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/229f0c98-b6d6-415b-b34a-6ffcd2a0ed52-etc-swift podName:229f0c98-b6d6-415b-b34a-6ffcd2a0ed52 nodeName:}" failed. No retries permitted until 2025-10-08 22:06:49.427983778 +0000 UTC m=+1109.253369548 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/229f0c98-b6d6-415b-b34a-6ffcd2a0ed52-etc-swift") pod "swift-storage-0" (UID: "229f0c98-b6d6-415b-b34a-6ffcd2a0ed52") : configmap "swift-ring-files" not found Oct 08 22:06:47 crc kubenswrapper[4739]: I1008 22:06:47.610738 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-mbp55" event={"ID":"ad5158ad-8718-400f-9d89-a28f901a953f","Type":"ContainerStarted","Data":"e840c7c629d74db15604a004ab22b819d662c8214fe1ff5ed314c53bb20c9867"} Oct 08 22:06:47 crc kubenswrapper[4739]: I1008 22:06:47.611941 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-mbp55" Oct 08 22:06:47 crc kubenswrapper[4739]: I1008 22:06:47.619390 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"9c19423c-cec2-4fbf-b2bf-97a99db03043","Type":"ContainerStarted","Data":"c900a6512a07a39190c3ea436f6ab1de6c2eccc703da08082f1c083c2d3bc0a1"} Oct 08 22:06:47 crc kubenswrapper[4739]: I1008 22:06:47.619421 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"9c19423c-cec2-4fbf-b2bf-97a99db03043","Type":"ContainerStarted","Data":"4e43e07be904fc541a9aba61c744557851ea00e70f6e0647f6f7b8c2f493945d"} Oct 08 22:06:47 crc kubenswrapper[4739]: I1008 22:06:47.619970 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 08 22:06:47 crc kubenswrapper[4739]: I1008 22:06:47.667886 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.419467273 podStartE2EDuration="3.667862684s" 
podCreationTimestamp="2025-10-08 22:06:44 +0000 UTC" firstStartedPulling="2025-10-08 22:06:45.589323922 +0000 UTC m=+1105.414709662" lastFinishedPulling="2025-10-08 22:06:46.837719323 +0000 UTC m=+1106.663105073" observedRunningTime="2025-10-08 22:06:47.666338537 +0000 UTC m=+1107.491724317" watchObservedRunningTime="2025-10-08 22:06:47.667862684 +0000 UTC m=+1107.493248444" Oct 08 22:06:47 crc kubenswrapper[4739]: I1008 22:06:47.671406 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-mbp55" podStartSLOduration=3.671392332 podStartE2EDuration="3.671392332s" podCreationTimestamp="2025-10-08 22:06:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:06:47.642582782 +0000 UTC m=+1107.467968542" watchObservedRunningTime="2025-10-08 22:06:47.671392332 +0000 UTC m=+1107.496778092" Oct 08 22:06:48 crc kubenswrapper[4739]: I1008 22:06:48.299395 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:48 crc kubenswrapper[4739]: I1008 22:06:48.367647 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.469575 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/229f0c98-b6d6-415b-b34a-6ffcd2a0ed52-etc-swift\") pod \"swift-storage-0\" (UID: \"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52\") " pod="openstack/swift-storage-0" Oct 08 22:06:49 crc kubenswrapper[4739]: E1008 22:06:49.469772 4739 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 08 22:06:49 crc kubenswrapper[4739]: E1008 22:06:49.471264 4739 projected.go:194] Error preparing data for projected volume etc-swift for pod 
openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 08 22:06:49 crc kubenswrapper[4739]: E1008 22:06:49.471329 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/229f0c98-b6d6-415b-b34a-6ffcd2a0ed52-etc-swift podName:229f0c98-b6d6-415b-b34a-6ffcd2a0ed52 nodeName:}" failed. No retries permitted until 2025-10-08 22:06:53.471308883 +0000 UTC m=+1113.296694633 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/229f0c98-b6d6-415b-b34a-6ffcd2a0ed52-etc-swift") pod "swift-storage-0" (UID: "229f0c98-b6d6-415b-b34a-6ffcd2a0ed52") : configmap "swift-ring-files" not found Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.610111 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-zwwls"] Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.611821 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-zwwls" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.614920 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.615905 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.621386 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.649941 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-zwwls"] Oct 08 22:06:49 crc kubenswrapper[4739]: E1008 22:06:49.650632 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-55dhk ring-data-devices scripts swiftconf], unattached volumes=[], failed to process 
volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-55dhk ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-zwwls" podUID="fbc97d91-e58a-4165-a6aa-12c65617d40a" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.659241 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-djtdv"] Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.660902 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-djtdv" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.671558 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-djtdv"] Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.677332 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-zwwls"] Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.678065 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fbc97d91-e58a-4165-a6aa-12c65617d40a-swiftconf\") pod \"swift-ring-rebalance-zwwls\" (UID: \"fbc97d91-e58a-4165-a6aa-12c65617d40a\") " pod="openstack/swift-ring-rebalance-zwwls" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.678100 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2b6bdd10-ace2-453a-b0c9-d89051620215-ring-data-devices\") pod \"swift-ring-rebalance-djtdv\" (UID: \"2b6bdd10-ace2-453a-b0c9-d89051620215\") " pod="openstack/swift-ring-rebalance-djtdv" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.678138 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55dhk\" (UniqueName: \"kubernetes.io/projected/fbc97d91-e58a-4165-a6aa-12c65617d40a-kube-api-access-55dhk\") pod 
\"swift-ring-rebalance-zwwls\" (UID: \"fbc97d91-e58a-4165-a6aa-12c65617d40a\") " pod="openstack/swift-ring-rebalance-zwwls" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.678195 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2b6bdd10-ace2-453a-b0c9-d89051620215-etc-swift\") pod \"swift-ring-rebalance-djtdv\" (UID: \"2b6bdd10-ace2-453a-b0c9-d89051620215\") " pod="openstack/swift-ring-rebalance-djtdv" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.678224 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6bdd10-ace2-453a-b0c9-d89051620215-combined-ca-bundle\") pod \"swift-ring-rebalance-djtdv\" (UID: \"2b6bdd10-ace2-453a-b0c9-d89051620215\") " pod="openstack/swift-ring-rebalance-djtdv" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.678267 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbc97d91-e58a-4165-a6aa-12c65617d40a-combined-ca-bundle\") pod \"swift-ring-rebalance-zwwls\" (UID: \"fbc97d91-e58a-4165-a6aa-12c65617d40a\") " pod="openstack/swift-ring-rebalance-zwwls" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.678284 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2b6bdd10-ace2-453a-b0c9-d89051620215-swiftconf\") pod \"swift-ring-rebalance-djtdv\" (UID: \"2b6bdd10-ace2-453a-b0c9-d89051620215\") " pod="openstack/swift-ring-rebalance-djtdv" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.678317 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fbc97d91-e58a-4165-a6aa-12c65617d40a-scripts\") pod 
\"swift-ring-rebalance-zwwls\" (UID: \"fbc97d91-e58a-4165-a6aa-12c65617d40a\") " pod="openstack/swift-ring-rebalance-zwwls" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.678333 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b6bdd10-ace2-453a-b0c9-d89051620215-scripts\") pod \"swift-ring-rebalance-djtdv\" (UID: \"2b6bdd10-ace2-453a-b0c9-d89051620215\") " pod="openstack/swift-ring-rebalance-djtdv" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.678369 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fbc97d91-e58a-4165-a6aa-12c65617d40a-dispersionconf\") pod \"swift-ring-rebalance-zwwls\" (UID: \"fbc97d91-e58a-4165-a6aa-12c65617d40a\") " pod="openstack/swift-ring-rebalance-zwwls" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.678427 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2b6bdd10-ace2-453a-b0c9-d89051620215-dispersionconf\") pod \"swift-ring-rebalance-djtdv\" (UID: \"2b6bdd10-ace2-453a-b0c9-d89051620215\") " pod="openstack/swift-ring-rebalance-djtdv" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.678443 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fbc97d91-e58a-4165-a6aa-12c65617d40a-etc-swift\") pod \"swift-ring-rebalance-zwwls\" (UID: \"fbc97d91-e58a-4165-a6aa-12c65617d40a\") " pod="openstack/swift-ring-rebalance-zwwls" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.678464 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fbc97d91-e58a-4165-a6aa-12c65617d40a-ring-data-devices\") pod 
\"swift-ring-rebalance-zwwls\" (UID: \"fbc97d91-e58a-4165-a6aa-12c65617d40a\") " pod="openstack/swift-ring-rebalance-zwwls" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.678498 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdpbq\" (UniqueName: \"kubernetes.io/projected/2b6bdd10-ace2-453a-b0c9-d89051620215-kube-api-access-xdpbq\") pod \"swift-ring-rebalance-djtdv\" (UID: \"2b6bdd10-ace2-453a-b0c9-d89051620215\") " pod="openstack/swift-ring-rebalance-djtdv" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.779640 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6bdd10-ace2-453a-b0c9-d89051620215-combined-ca-bundle\") pod \"swift-ring-rebalance-djtdv\" (UID: \"2b6bdd10-ace2-453a-b0c9-d89051620215\") " pod="openstack/swift-ring-rebalance-djtdv" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.779693 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbc97d91-e58a-4165-a6aa-12c65617d40a-combined-ca-bundle\") pod \"swift-ring-rebalance-zwwls\" (UID: \"fbc97d91-e58a-4165-a6aa-12c65617d40a\") " pod="openstack/swift-ring-rebalance-zwwls" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.779721 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2b6bdd10-ace2-453a-b0c9-d89051620215-swiftconf\") pod \"swift-ring-rebalance-djtdv\" (UID: \"2b6bdd10-ace2-453a-b0c9-d89051620215\") " pod="openstack/swift-ring-rebalance-djtdv" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.779748 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b6bdd10-ace2-453a-b0c9-d89051620215-scripts\") pod \"swift-ring-rebalance-djtdv\" (UID: 
\"2b6bdd10-ace2-453a-b0c9-d89051620215\") " pod="openstack/swift-ring-rebalance-djtdv" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.779768 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fbc97d91-e58a-4165-a6aa-12c65617d40a-scripts\") pod \"swift-ring-rebalance-zwwls\" (UID: \"fbc97d91-e58a-4165-a6aa-12c65617d40a\") " pod="openstack/swift-ring-rebalance-zwwls" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.779802 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fbc97d91-e58a-4165-a6aa-12c65617d40a-dispersionconf\") pod \"swift-ring-rebalance-zwwls\" (UID: \"fbc97d91-e58a-4165-a6aa-12c65617d40a\") " pod="openstack/swift-ring-rebalance-zwwls" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.779874 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2b6bdd10-ace2-453a-b0c9-d89051620215-dispersionconf\") pod \"swift-ring-rebalance-djtdv\" (UID: \"2b6bdd10-ace2-453a-b0c9-d89051620215\") " pod="openstack/swift-ring-rebalance-djtdv" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.779894 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fbc97d91-e58a-4165-a6aa-12c65617d40a-etc-swift\") pod \"swift-ring-rebalance-zwwls\" (UID: \"fbc97d91-e58a-4165-a6aa-12c65617d40a\") " pod="openstack/swift-ring-rebalance-zwwls" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.779923 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fbc97d91-e58a-4165-a6aa-12c65617d40a-ring-data-devices\") pod \"swift-ring-rebalance-zwwls\" (UID: \"fbc97d91-e58a-4165-a6aa-12c65617d40a\") " pod="openstack/swift-ring-rebalance-zwwls" Oct 08 
22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.779960 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdpbq\" (UniqueName: \"kubernetes.io/projected/2b6bdd10-ace2-453a-b0c9-d89051620215-kube-api-access-xdpbq\") pod \"swift-ring-rebalance-djtdv\" (UID: \"2b6bdd10-ace2-453a-b0c9-d89051620215\") " pod="openstack/swift-ring-rebalance-djtdv" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.779980 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fbc97d91-e58a-4165-a6aa-12c65617d40a-swiftconf\") pod \"swift-ring-rebalance-zwwls\" (UID: \"fbc97d91-e58a-4165-a6aa-12c65617d40a\") " pod="openstack/swift-ring-rebalance-zwwls" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.780008 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2b6bdd10-ace2-453a-b0c9-d89051620215-ring-data-devices\") pod \"swift-ring-rebalance-djtdv\" (UID: \"2b6bdd10-ace2-453a-b0c9-d89051620215\") " pod="openstack/swift-ring-rebalance-djtdv" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.780064 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55dhk\" (UniqueName: \"kubernetes.io/projected/fbc97d91-e58a-4165-a6aa-12c65617d40a-kube-api-access-55dhk\") pod \"swift-ring-rebalance-zwwls\" (UID: \"fbc97d91-e58a-4165-a6aa-12c65617d40a\") " pod="openstack/swift-ring-rebalance-zwwls" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.780098 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2b6bdd10-ace2-453a-b0c9-d89051620215-etc-swift\") pod \"swift-ring-rebalance-djtdv\" (UID: \"2b6bdd10-ace2-453a-b0c9-d89051620215\") " pod="openstack/swift-ring-rebalance-djtdv" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.780599 4739 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2b6bdd10-ace2-453a-b0c9-d89051620215-etc-swift\") pod \"swift-ring-rebalance-djtdv\" (UID: \"2b6bdd10-ace2-453a-b0c9-d89051620215\") " pod="openstack/swift-ring-rebalance-djtdv" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.781657 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fbc97d91-e58a-4165-a6aa-12c65617d40a-etc-swift\") pod \"swift-ring-rebalance-zwwls\" (UID: \"fbc97d91-e58a-4165-a6aa-12c65617d40a\") " pod="openstack/swift-ring-rebalance-zwwls" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.781964 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2b6bdd10-ace2-453a-b0c9-d89051620215-ring-data-devices\") pod \"swift-ring-rebalance-djtdv\" (UID: \"2b6bdd10-ace2-453a-b0c9-d89051620215\") " pod="openstack/swift-ring-rebalance-djtdv" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.782642 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b6bdd10-ace2-453a-b0c9-d89051620215-scripts\") pod \"swift-ring-rebalance-djtdv\" (UID: \"2b6bdd10-ace2-453a-b0c9-d89051620215\") " pod="openstack/swift-ring-rebalance-djtdv" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.782710 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fbc97d91-e58a-4165-a6aa-12c65617d40a-ring-data-devices\") pod \"swift-ring-rebalance-zwwls\" (UID: \"fbc97d91-e58a-4165-a6aa-12c65617d40a\") " pod="openstack/swift-ring-rebalance-zwwls" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.783032 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/fbc97d91-e58a-4165-a6aa-12c65617d40a-scripts\") pod \"swift-ring-rebalance-zwwls\" (UID: \"fbc97d91-e58a-4165-a6aa-12c65617d40a\") " pod="openstack/swift-ring-rebalance-zwwls" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.790530 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2b6bdd10-ace2-453a-b0c9-d89051620215-swiftconf\") pod \"swift-ring-rebalance-djtdv\" (UID: \"2b6bdd10-ace2-453a-b0c9-d89051620215\") " pod="openstack/swift-ring-rebalance-djtdv" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.790979 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbc97d91-e58a-4165-a6aa-12c65617d40a-combined-ca-bundle\") pod \"swift-ring-rebalance-zwwls\" (UID: \"fbc97d91-e58a-4165-a6aa-12c65617d40a\") " pod="openstack/swift-ring-rebalance-zwwls" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.791085 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2b6bdd10-ace2-453a-b0c9-d89051620215-dispersionconf\") pod \"swift-ring-rebalance-djtdv\" (UID: \"2b6bdd10-ace2-453a-b0c9-d89051620215\") " pod="openstack/swift-ring-rebalance-djtdv" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.794695 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fbc97d91-e58a-4165-a6aa-12c65617d40a-swiftconf\") pod \"swift-ring-rebalance-zwwls\" (UID: \"fbc97d91-e58a-4165-a6aa-12c65617d40a\") " pod="openstack/swift-ring-rebalance-zwwls" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.803588 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55dhk\" (UniqueName: \"kubernetes.io/projected/fbc97d91-e58a-4165-a6aa-12c65617d40a-kube-api-access-55dhk\") pod \"swift-ring-rebalance-zwwls\" (UID: 
\"fbc97d91-e58a-4165-a6aa-12c65617d40a\") " pod="openstack/swift-ring-rebalance-zwwls" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.803877 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fbc97d91-e58a-4165-a6aa-12c65617d40a-dispersionconf\") pod \"swift-ring-rebalance-zwwls\" (UID: \"fbc97d91-e58a-4165-a6aa-12c65617d40a\") " pod="openstack/swift-ring-rebalance-zwwls" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.804928 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6bdd10-ace2-453a-b0c9-d89051620215-combined-ca-bundle\") pod \"swift-ring-rebalance-djtdv\" (UID: \"2b6bdd10-ace2-453a-b0c9-d89051620215\") " pod="openstack/swift-ring-rebalance-djtdv" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.813328 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdpbq\" (UniqueName: \"kubernetes.io/projected/2b6bdd10-ace2-453a-b0c9-d89051620215-kube-api-access-xdpbq\") pod \"swift-ring-rebalance-djtdv\" (UID: \"2b6bdd10-ace2-453a-b0c9-d89051620215\") " pod="openstack/swift-ring-rebalance-djtdv" Oct 08 22:06:49 crc kubenswrapper[4739]: I1008 22:06:49.975735 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-djtdv" Oct 08 22:06:50 crc kubenswrapper[4739]: I1008 22:06:50.265725 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-djtdv"] Oct 08 22:06:50 crc kubenswrapper[4739]: I1008 22:06:50.645379 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-djtdv" event={"ID":"2b6bdd10-ace2-453a-b0c9-d89051620215","Type":"ContainerStarted","Data":"224e581941317e602978c0f4e8bcde5fb5ba9d4e80d06aefc06f586c310571b8"} Oct 08 22:06:50 crc kubenswrapper[4739]: I1008 22:06:50.645400 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-zwwls" Oct 08 22:06:50 crc kubenswrapper[4739]: I1008 22:06:50.673060 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-zwwls" Oct 08 22:06:50 crc kubenswrapper[4739]: I1008 22:06:50.703485 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbc97d91-e58a-4165-a6aa-12c65617d40a-combined-ca-bundle\") pod \"fbc97d91-e58a-4165-a6aa-12c65617d40a\" (UID: \"fbc97d91-e58a-4165-a6aa-12c65617d40a\") " Oct 08 22:06:50 crc kubenswrapper[4739]: I1008 22:06:50.703548 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55dhk\" (UniqueName: \"kubernetes.io/projected/fbc97d91-e58a-4165-a6aa-12c65617d40a-kube-api-access-55dhk\") pod \"fbc97d91-e58a-4165-a6aa-12c65617d40a\" (UID: \"fbc97d91-e58a-4165-a6aa-12c65617d40a\") " Oct 08 22:06:50 crc kubenswrapper[4739]: I1008 22:06:50.703599 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fbc97d91-e58a-4165-a6aa-12c65617d40a-etc-swift\") pod \"fbc97d91-e58a-4165-a6aa-12c65617d40a\" (UID: \"fbc97d91-e58a-4165-a6aa-12c65617d40a\") " Oct 08 22:06:50 crc 
kubenswrapper[4739]: I1008 22:06:50.703641 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fbc97d91-e58a-4165-a6aa-12c65617d40a-ring-data-devices\") pod \"fbc97d91-e58a-4165-a6aa-12c65617d40a\" (UID: \"fbc97d91-e58a-4165-a6aa-12c65617d40a\") " Oct 08 22:06:50 crc kubenswrapper[4739]: I1008 22:06:50.703686 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fbc97d91-e58a-4165-a6aa-12c65617d40a-swiftconf\") pod \"fbc97d91-e58a-4165-a6aa-12c65617d40a\" (UID: \"fbc97d91-e58a-4165-a6aa-12c65617d40a\") " Oct 08 22:06:50 crc kubenswrapper[4739]: I1008 22:06:50.703712 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fbc97d91-e58a-4165-a6aa-12c65617d40a-scripts\") pod \"fbc97d91-e58a-4165-a6aa-12c65617d40a\" (UID: \"fbc97d91-e58a-4165-a6aa-12c65617d40a\") " Oct 08 22:06:50 crc kubenswrapper[4739]: I1008 22:06:50.703781 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fbc97d91-e58a-4165-a6aa-12c65617d40a-dispersionconf\") pod \"fbc97d91-e58a-4165-a6aa-12c65617d40a\" (UID: \"fbc97d91-e58a-4165-a6aa-12c65617d40a\") " Oct 08 22:06:50 crc kubenswrapper[4739]: I1008 22:06:50.704112 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbc97d91-e58a-4165-a6aa-12c65617d40a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "fbc97d91-e58a-4165-a6aa-12c65617d40a" (UID: "fbc97d91-e58a-4165-a6aa-12c65617d40a"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:06:50 crc kubenswrapper[4739]: I1008 22:06:50.704466 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbc97d91-e58a-4165-a6aa-12c65617d40a-scripts" (OuterVolumeSpecName: "scripts") pod "fbc97d91-e58a-4165-a6aa-12c65617d40a" (UID: "fbc97d91-e58a-4165-a6aa-12c65617d40a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:06:50 crc kubenswrapper[4739]: I1008 22:06:50.704683 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbc97d91-e58a-4165-a6aa-12c65617d40a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "fbc97d91-e58a-4165-a6aa-12c65617d40a" (UID: "fbc97d91-e58a-4165-a6aa-12c65617d40a"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:06:50 crc kubenswrapper[4739]: I1008 22:06:50.709261 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbc97d91-e58a-4165-a6aa-12c65617d40a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "fbc97d91-e58a-4165-a6aa-12c65617d40a" (UID: "fbc97d91-e58a-4165-a6aa-12c65617d40a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:06:50 crc kubenswrapper[4739]: I1008 22:06:50.709303 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbc97d91-e58a-4165-a6aa-12c65617d40a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "fbc97d91-e58a-4165-a6aa-12c65617d40a" (UID: "fbc97d91-e58a-4165-a6aa-12c65617d40a"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:06:50 crc kubenswrapper[4739]: I1008 22:06:50.710190 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbc97d91-e58a-4165-a6aa-12c65617d40a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbc97d91-e58a-4165-a6aa-12c65617d40a" (UID: "fbc97d91-e58a-4165-a6aa-12c65617d40a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:06:50 crc kubenswrapper[4739]: I1008 22:06:50.710706 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbc97d91-e58a-4165-a6aa-12c65617d40a-kube-api-access-55dhk" (OuterVolumeSpecName: "kube-api-access-55dhk") pod "fbc97d91-e58a-4165-a6aa-12c65617d40a" (UID: "fbc97d91-e58a-4165-a6aa-12c65617d40a"). InnerVolumeSpecName "kube-api-access-55dhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:06:50 crc kubenswrapper[4739]: I1008 22:06:50.805370 4739 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fbc97d91-e58a-4165-a6aa-12c65617d40a-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 08 22:06:50 crc kubenswrapper[4739]: I1008 22:06:50.805405 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbc97d91-e58a-4165-a6aa-12c65617d40a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:06:50 crc kubenswrapper[4739]: I1008 22:06:50.805417 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55dhk\" (UniqueName: \"kubernetes.io/projected/fbc97d91-e58a-4165-a6aa-12c65617d40a-kube-api-access-55dhk\") on node \"crc\" DevicePath \"\"" Oct 08 22:06:50 crc kubenswrapper[4739]: I1008 22:06:50.805431 4739 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/fbc97d91-e58a-4165-a6aa-12c65617d40a-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 08 22:06:50 crc kubenswrapper[4739]: I1008 22:06:50.805439 4739 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fbc97d91-e58a-4165-a6aa-12c65617d40a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 08 22:06:50 crc kubenswrapper[4739]: I1008 22:06:50.805451 4739 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fbc97d91-e58a-4165-a6aa-12c65617d40a-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 08 22:06:50 crc kubenswrapper[4739]: I1008 22:06:50.805459 4739 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fbc97d91-e58a-4165-a6aa-12c65617d40a-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:06:51 crc kubenswrapper[4739]: I1008 22:06:51.653645 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-zwwls" Oct 08 22:06:51 crc kubenswrapper[4739]: I1008 22:06:51.700891 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-zwwls"] Oct 08 22:06:51 crc kubenswrapper[4739]: I1008 22:06:51.704205 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-zwwls"] Oct 08 22:06:51 crc kubenswrapper[4739]: I1008 22:06:51.837227 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbc97d91-e58a-4165-a6aa-12c65617d40a" path="/var/lib/kubelet/pods/fbc97d91-e58a-4165-a6aa-12c65617d40a/volumes" Oct 08 22:06:52 crc kubenswrapper[4739]: I1008 22:06:52.498695 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-lvj7x"] Oct 08 22:06:52 crc kubenswrapper[4739]: I1008 22:06:52.499645 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-lvj7x" Oct 08 22:06:52 crc kubenswrapper[4739]: I1008 22:06:52.510448 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-lvj7x"] Oct 08 22:06:52 crc kubenswrapper[4739]: I1008 22:06:52.639828 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nx9n\" (UniqueName: \"kubernetes.io/projected/88f9b47a-3b20-488b-93c6-ca8ca9beb2eb-kube-api-access-8nx9n\") pod \"keystone-db-create-lvj7x\" (UID: \"88f9b47a-3b20-488b-93c6-ca8ca9beb2eb\") " pod="openstack/keystone-db-create-lvj7x" Oct 08 22:06:52 crc kubenswrapper[4739]: I1008 22:06:52.712280 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-x9jwq"] Oct 08 22:06:52 crc kubenswrapper[4739]: I1008 22:06:52.716244 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-x9jwq" Oct 08 22:06:52 crc kubenswrapper[4739]: I1008 22:06:52.733634 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-x9jwq"] Oct 08 22:06:52 crc kubenswrapper[4739]: I1008 22:06:52.741678 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nx9n\" (UniqueName: \"kubernetes.io/projected/88f9b47a-3b20-488b-93c6-ca8ca9beb2eb-kube-api-access-8nx9n\") pod \"keystone-db-create-lvj7x\" (UID: \"88f9b47a-3b20-488b-93c6-ca8ca9beb2eb\") " pod="openstack/keystone-db-create-lvj7x" Oct 08 22:06:52 crc kubenswrapper[4739]: I1008 22:06:52.764723 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nx9n\" (UniqueName: \"kubernetes.io/projected/88f9b47a-3b20-488b-93c6-ca8ca9beb2eb-kube-api-access-8nx9n\") pod \"keystone-db-create-lvj7x\" (UID: \"88f9b47a-3b20-488b-93c6-ca8ca9beb2eb\") " pod="openstack/keystone-db-create-lvj7x" Oct 08 22:06:52 crc kubenswrapper[4739]: I1008 22:06:52.830822 4739 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-lvj7x" Oct 08 22:06:52 crc kubenswrapper[4739]: I1008 22:06:52.847443 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwrkf\" (UniqueName: \"kubernetes.io/projected/f6294838-8381-41e1-9384-0084edf1dac0-kube-api-access-kwrkf\") pod \"placement-db-create-x9jwq\" (UID: \"f6294838-8381-41e1-9384-0084edf1dac0\") " pod="openstack/placement-db-create-x9jwq" Oct 08 22:06:52 crc kubenswrapper[4739]: I1008 22:06:52.953915 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwrkf\" (UniqueName: \"kubernetes.io/projected/f6294838-8381-41e1-9384-0084edf1dac0-kube-api-access-kwrkf\") pod \"placement-db-create-x9jwq\" (UID: \"f6294838-8381-41e1-9384-0084edf1dac0\") " pod="openstack/placement-db-create-x9jwq" Oct 08 22:06:52 crc kubenswrapper[4739]: I1008 22:06:52.975432 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwrkf\" (UniqueName: \"kubernetes.io/projected/f6294838-8381-41e1-9384-0084edf1dac0-kube-api-access-kwrkf\") pod \"placement-db-create-x9jwq\" (UID: \"f6294838-8381-41e1-9384-0084edf1dac0\") " pod="openstack/placement-db-create-x9jwq" Oct 08 22:06:53 crc kubenswrapper[4739]: I1008 22:06:53.016087 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-wwm4z"] Oct 08 22:06:53 crc kubenswrapper[4739]: I1008 22:06:53.018458 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-wwm4z" Oct 08 22:06:53 crc kubenswrapper[4739]: I1008 22:06:53.023902 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-wwm4z"] Oct 08 22:06:53 crc kubenswrapper[4739]: I1008 22:06:53.041568 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-x9jwq" Oct 08 22:06:53 crc kubenswrapper[4739]: I1008 22:06:53.156381 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6wb7\" (UniqueName: \"kubernetes.io/projected/742f0e7b-0ccd-4f1e-83ae-027d75053522-kube-api-access-w6wb7\") pod \"glance-db-create-wwm4z\" (UID: \"742f0e7b-0ccd-4f1e-83ae-027d75053522\") " pod="openstack/glance-db-create-wwm4z" Oct 08 22:06:53 crc kubenswrapper[4739]: I1008 22:06:53.257701 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6wb7\" (UniqueName: \"kubernetes.io/projected/742f0e7b-0ccd-4f1e-83ae-027d75053522-kube-api-access-w6wb7\") pod \"glance-db-create-wwm4z\" (UID: \"742f0e7b-0ccd-4f1e-83ae-027d75053522\") " pod="openstack/glance-db-create-wwm4z" Oct 08 22:06:53 crc kubenswrapper[4739]: I1008 22:06:53.274917 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6wb7\" (UniqueName: \"kubernetes.io/projected/742f0e7b-0ccd-4f1e-83ae-027d75053522-kube-api-access-w6wb7\") pod \"glance-db-create-wwm4z\" (UID: \"742f0e7b-0ccd-4f1e-83ae-027d75053522\") " pod="openstack/glance-db-create-wwm4z" Oct 08 22:06:53 crc kubenswrapper[4739]: I1008 22:06:53.354809 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-wwm4z" Oct 08 22:06:53 crc kubenswrapper[4739]: I1008 22:06:53.561650 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/229f0c98-b6d6-415b-b34a-6ffcd2a0ed52-etc-swift\") pod \"swift-storage-0\" (UID: \"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52\") " pod="openstack/swift-storage-0" Oct 08 22:06:53 crc kubenswrapper[4739]: E1008 22:06:53.561858 4739 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 08 22:06:53 crc kubenswrapper[4739]: E1008 22:06:53.561885 4739 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 08 22:06:53 crc kubenswrapper[4739]: E1008 22:06:53.561946 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/229f0c98-b6d6-415b-b34a-6ffcd2a0ed52-etc-swift podName:229f0c98-b6d6-415b-b34a-6ffcd2a0ed52 nodeName:}" failed. No retries permitted until 2025-10-08 22:07:01.561924612 +0000 UTC m=+1121.387310362 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/229f0c98-b6d6-415b-b34a-6ffcd2a0ed52-etc-swift") pod "swift-storage-0" (UID: "229f0c98-b6d6-415b-b34a-6ffcd2a0ed52") : configmap "swift-ring-files" not found Oct 08 22:06:54 crc kubenswrapper[4739]: I1008 22:06:54.614244 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-wwm4z"] Oct 08 22:06:54 crc kubenswrapper[4739]: W1008 22:06:54.616990 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod742f0e7b_0ccd_4f1e_83ae_027d75053522.slice/crio-01974dece12340bf2ddbc33b60520131aa821785eadda0d72fff2ab6225ae57d WatchSource:0}: Error finding container 01974dece12340bf2ddbc33b60520131aa821785eadda0d72fff2ab6225ae57d: Status 404 returned error can't find the container with id 01974dece12340bf2ddbc33b60520131aa821785eadda0d72fff2ab6225ae57d Oct 08 22:06:54 crc kubenswrapper[4739]: I1008 22:06:54.680954 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-lvj7x"] Oct 08 22:06:54 crc kubenswrapper[4739]: I1008 22:06:54.700115 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-x9jwq"] Oct 08 22:06:54 crc kubenswrapper[4739]: I1008 22:06:54.712826 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-lvj7x" event={"ID":"88f9b47a-3b20-488b-93c6-ca8ca9beb2eb","Type":"ContainerStarted","Data":"3cc4e4588945997489dcde619540393ffd8d4aecf073ef36106cac25edefbe80"} Oct 08 22:06:54 crc kubenswrapper[4739]: I1008 22:06:54.715686 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-wwm4z" event={"ID":"742f0e7b-0ccd-4f1e-83ae-027d75053522","Type":"ContainerStarted","Data":"01974dece12340bf2ddbc33b60520131aa821785eadda0d72fff2ab6225ae57d"} Oct 08 22:06:54 crc kubenswrapper[4739]: I1008 22:06:54.723518 4739 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/swift-ring-rebalance-djtdv" event={"ID":"2b6bdd10-ace2-453a-b0c9-d89051620215","Type":"ContainerStarted","Data":"019005ed70752bc75cd44c6aa4a6f36e0c8e91e8c6f8824e9bd1436af9e846d3"} Oct 08 22:06:54 crc kubenswrapper[4739]: I1008 22:06:54.755641 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-djtdv" podStartSLOduration=1.880998447 podStartE2EDuration="5.755607856s" podCreationTimestamp="2025-10-08 22:06:49 +0000 UTC" firstStartedPulling="2025-10-08 22:06:50.276885631 +0000 UTC m=+1110.102271381" lastFinishedPulling="2025-10-08 22:06:54.15149504 +0000 UTC m=+1113.976880790" observedRunningTime="2025-10-08 22:06:54.745910397 +0000 UTC m=+1114.571296157" watchObservedRunningTime="2025-10-08 22:06:54.755607856 +0000 UTC m=+1114.580993606" Oct 08 22:06:54 crc kubenswrapper[4739]: I1008 22:06:54.964395 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-mbp55" Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.052844 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-pkhdx"] Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.053639 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-pkhdx" podUID="1233f9c5-10c7-4244-98ec-17d6eb4c1c15" containerName="dnsmasq-dns" containerID="cri-o://cdc62cd6e2f8a63b8ea2e4523f3a82618d85b3d6da312992da59a28adb3aa3d6" gracePeriod=10 Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.604958 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-pkhdx" Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.719430 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zv6z\" (UniqueName: \"kubernetes.io/projected/1233f9c5-10c7-4244-98ec-17d6eb4c1c15-kube-api-access-8zv6z\") pod \"1233f9c5-10c7-4244-98ec-17d6eb4c1c15\" (UID: \"1233f9c5-10c7-4244-98ec-17d6eb4c1c15\") " Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.720235 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1233f9c5-10c7-4244-98ec-17d6eb4c1c15-config\") pod \"1233f9c5-10c7-4244-98ec-17d6eb4c1c15\" (UID: \"1233f9c5-10c7-4244-98ec-17d6eb4c1c15\") " Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.720286 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1233f9c5-10c7-4244-98ec-17d6eb4c1c15-ovsdbserver-nb\") pod \"1233f9c5-10c7-4244-98ec-17d6eb4c1c15\" (UID: \"1233f9c5-10c7-4244-98ec-17d6eb4c1c15\") " Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.720354 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1233f9c5-10c7-4244-98ec-17d6eb4c1c15-ovsdbserver-sb\") pod \"1233f9c5-10c7-4244-98ec-17d6eb4c1c15\" (UID: \"1233f9c5-10c7-4244-98ec-17d6eb4c1c15\") " Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.720386 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1233f9c5-10c7-4244-98ec-17d6eb4c1c15-dns-svc\") pod \"1233f9c5-10c7-4244-98ec-17d6eb4c1c15\" (UID: \"1233f9c5-10c7-4244-98ec-17d6eb4c1c15\") " Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.749391 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/1233f9c5-10c7-4244-98ec-17d6eb4c1c15-kube-api-access-8zv6z" (OuterVolumeSpecName: "kube-api-access-8zv6z") pod "1233f9c5-10c7-4244-98ec-17d6eb4c1c15" (UID: "1233f9c5-10c7-4244-98ec-17d6eb4c1c15"). InnerVolumeSpecName "kube-api-access-8zv6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.788275 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1233f9c5-10c7-4244-98ec-17d6eb4c1c15-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1233f9c5-10c7-4244-98ec-17d6eb4c1c15" (UID: "1233f9c5-10c7-4244-98ec-17d6eb4c1c15"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.788862 4739 generic.go:334] "Generic (PLEG): container finished" podID="f6294838-8381-41e1-9384-0084edf1dac0" containerID="49d4dfd8262ce588a05f8cab369e73f96045ff2b6e128dc4fb7342a6486fc134" exitCode=0 Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.789002 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-x9jwq" event={"ID":"f6294838-8381-41e1-9384-0084edf1dac0","Type":"ContainerDied","Data":"49d4dfd8262ce588a05f8cab369e73f96045ff2b6e128dc4fb7342a6486fc134"} Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.789066 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-x9jwq" event={"ID":"f6294838-8381-41e1-9384-0084edf1dac0","Type":"ContainerStarted","Data":"9fe78b05655c0a8ab562dff3dafecd4d4038853dd807d360292cda65223f7cff"} Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.795376 4739 generic.go:334] "Generic (PLEG): container finished" podID="742f0e7b-0ccd-4f1e-83ae-027d75053522" containerID="c2edb8a23dd8e4a6d4d868aede065fdaf60e67f07924a6c44c27fdfd63ce97c1" exitCode=0 Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.795459 4739 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-db-create-wwm4z" event={"ID":"742f0e7b-0ccd-4f1e-83ae-027d75053522","Type":"ContainerDied","Data":"c2edb8a23dd8e4a6d4d868aede065fdaf60e67f07924a6c44c27fdfd63ce97c1"} Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.797734 4739 generic.go:334] "Generic (PLEG): container finished" podID="1233f9c5-10c7-4244-98ec-17d6eb4c1c15" containerID="cdc62cd6e2f8a63b8ea2e4523f3a82618d85b3d6da312992da59a28adb3aa3d6" exitCode=0 Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.797800 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-pkhdx" event={"ID":"1233f9c5-10c7-4244-98ec-17d6eb4c1c15","Type":"ContainerDied","Data":"cdc62cd6e2f8a63b8ea2e4523f3a82618d85b3d6da312992da59a28adb3aa3d6"} Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.797803 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-pkhdx" Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.797857 4739 scope.go:117] "RemoveContainer" containerID="cdc62cd6e2f8a63b8ea2e4523f3a82618d85b3d6da312992da59a28adb3aa3d6" Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.797844 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-pkhdx" event={"ID":"1233f9c5-10c7-4244-98ec-17d6eb4c1c15","Type":"ContainerDied","Data":"66a1029b42cc6eb9f14e1939109414a943372b4c57f8e819cc854bf73eb99320"} Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.801028 4739 generic.go:334] "Generic (PLEG): container finished" podID="88f9b47a-3b20-488b-93c6-ca8ca9beb2eb" containerID="a6c66b3dec97437d2e5a4f0be4dac9e3c4bb3e7627cc41666afaa825b2768bad" exitCode=0 Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.801920 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-lvj7x" 
event={"ID":"88f9b47a-3b20-488b-93c6-ca8ca9beb2eb","Type":"ContainerDied","Data":"a6c66b3dec97437d2e5a4f0be4dac9e3c4bb3e7627cc41666afaa825b2768bad"} Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.809014 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1233f9c5-10c7-4244-98ec-17d6eb4c1c15-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1233f9c5-10c7-4244-98ec-17d6eb4c1c15" (UID: "1233f9c5-10c7-4244-98ec-17d6eb4c1c15"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.828009 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zv6z\" (UniqueName: \"kubernetes.io/projected/1233f9c5-10c7-4244-98ec-17d6eb4c1c15-kube-api-access-8zv6z\") on node \"crc\" DevicePath \"\"" Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.828049 4739 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1233f9c5-10c7-4244-98ec-17d6eb4c1c15-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.828062 4739 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1233f9c5-10c7-4244-98ec-17d6eb4c1c15-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.836907 4739 scope.go:117] "RemoveContainer" containerID="b84b1476242c41664567a51365deac7cad53b42d80c5d682a46b4db4a12dbd4b" Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.848953 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1233f9c5-10c7-4244-98ec-17d6eb4c1c15-config" (OuterVolumeSpecName: "config") pod "1233f9c5-10c7-4244-98ec-17d6eb4c1c15" (UID: "1233f9c5-10c7-4244-98ec-17d6eb4c1c15"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.857488 4739 scope.go:117] "RemoveContainer" containerID="cdc62cd6e2f8a63b8ea2e4523f3a82618d85b3d6da312992da59a28adb3aa3d6" Oct 08 22:06:55 crc kubenswrapper[4739]: E1008 22:06:55.858026 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdc62cd6e2f8a63b8ea2e4523f3a82618d85b3d6da312992da59a28adb3aa3d6\": container with ID starting with cdc62cd6e2f8a63b8ea2e4523f3a82618d85b3d6da312992da59a28adb3aa3d6 not found: ID does not exist" containerID="cdc62cd6e2f8a63b8ea2e4523f3a82618d85b3d6da312992da59a28adb3aa3d6" Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.858080 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdc62cd6e2f8a63b8ea2e4523f3a82618d85b3d6da312992da59a28adb3aa3d6"} err="failed to get container status \"cdc62cd6e2f8a63b8ea2e4523f3a82618d85b3d6da312992da59a28adb3aa3d6\": rpc error: code = NotFound desc = could not find container \"cdc62cd6e2f8a63b8ea2e4523f3a82618d85b3d6da312992da59a28adb3aa3d6\": container with ID starting with cdc62cd6e2f8a63b8ea2e4523f3a82618d85b3d6da312992da59a28adb3aa3d6 not found: ID does not exist" Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.858119 4739 scope.go:117] "RemoveContainer" containerID="b84b1476242c41664567a51365deac7cad53b42d80c5d682a46b4db4a12dbd4b" Oct 08 22:06:55 crc kubenswrapper[4739]: E1008 22:06:55.858969 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b84b1476242c41664567a51365deac7cad53b42d80c5d682a46b4db4a12dbd4b\": container with ID starting with b84b1476242c41664567a51365deac7cad53b42d80c5d682a46b4db4a12dbd4b not found: ID does not exist" containerID="b84b1476242c41664567a51365deac7cad53b42d80c5d682a46b4db4a12dbd4b" Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.858999 
4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b84b1476242c41664567a51365deac7cad53b42d80c5d682a46b4db4a12dbd4b"} err="failed to get container status \"b84b1476242c41664567a51365deac7cad53b42d80c5d682a46b4db4a12dbd4b\": rpc error: code = NotFound desc = could not find container \"b84b1476242c41664567a51365deac7cad53b42d80c5d682a46b4db4a12dbd4b\": container with ID starting with b84b1476242c41664567a51365deac7cad53b42d80c5d682a46b4db4a12dbd4b not found: ID does not exist" Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.872496 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1233f9c5-10c7-4244-98ec-17d6eb4c1c15-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1233f9c5-10c7-4244-98ec-17d6eb4c1c15" (UID: "1233f9c5-10c7-4244-98ec-17d6eb4c1c15"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.930352 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1233f9c5-10c7-4244-98ec-17d6eb4c1c15-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:06:55 crc kubenswrapper[4739]: I1008 22:06:55.930601 4739 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1233f9c5-10c7-4244-98ec-17d6eb4c1c15-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 22:06:56 crc kubenswrapper[4739]: I1008 22:06:56.137542 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-pkhdx"] Oct 08 22:06:56 crc kubenswrapper[4739]: I1008 22:06:56.144726 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-pkhdx"] Oct 08 22:06:57 crc kubenswrapper[4739]: I1008 22:06:57.357880 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-wwm4z" Oct 08 22:06:57 crc kubenswrapper[4739]: I1008 22:06:57.369532 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-lvj7x" Oct 08 22:06:57 crc kubenswrapper[4739]: I1008 22:06:57.371576 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-x9jwq" Oct 08 22:06:57 crc kubenswrapper[4739]: I1008 22:06:57.463591 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6wb7\" (UniqueName: \"kubernetes.io/projected/742f0e7b-0ccd-4f1e-83ae-027d75053522-kube-api-access-w6wb7\") pod \"742f0e7b-0ccd-4f1e-83ae-027d75053522\" (UID: \"742f0e7b-0ccd-4f1e-83ae-027d75053522\") " Oct 08 22:06:57 crc kubenswrapper[4739]: I1008 22:06:57.463924 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwrkf\" (UniqueName: \"kubernetes.io/projected/f6294838-8381-41e1-9384-0084edf1dac0-kube-api-access-kwrkf\") pod \"f6294838-8381-41e1-9384-0084edf1dac0\" (UID: \"f6294838-8381-41e1-9384-0084edf1dac0\") " Oct 08 22:06:57 crc kubenswrapper[4739]: I1008 22:06:57.464093 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nx9n\" (UniqueName: \"kubernetes.io/projected/88f9b47a-3b20-488b-93c6-ca8ca9beb2eb-kube-api-access-8nx9n\") pod \"88f9b47a-3b20-488b-93c6-ca8ca9beb2eb\" (UID: \"88f9b47a-3b20-488b-93c6-ca8ca9beb2eb\") " Oct 08 22:06:57 crc kubenswrapper[4739]: I1008 22:06:57.471119 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/742f0e7b-0ccd-4f1e-83ae-027d75053522-kube-api-access-w6wb7" (OuterVolumeSpecName: "kube-api-access-w6wb7") pod "742f0e7b-0ccd-4f1e-83ae-027d75053522" (UID: "742f0e7b-0ccd-4f1e-83ae-027d75053522"). InnerVolumeSpecName "kube-api-access-w6wb7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:06:57 crc kubenswrapper[4739]: I1008 22:06:57.471619 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6294838-8381-41e1-9384-0084edf1dac0-kube-api-access-kwrkf" (OuterVolumeSpecName: "kube-api-access-kwrkf") pod "f6294838-8381-41e1-9384-0084edf1dac0" (UID: "f6294838-8381-41e1-9384-0084edf1dac0"). InnerVolumeSpecName "kube-api-access-kwrkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:06:57 crc kubenswrapper[4739]: I1008 22:06:57.478597 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88f9b47a-3b20-488b-93c6-ca8ca9beb2eb-kube-api-access-8nx9n" (OuterVolumeSpecName: "kube-api-access-8nx9n") pod "88f9b47a-3b20-488b-93c6-ca8ca9beb2eb" (UID: "88f9b47a-3b20-488b-93c6-ca8ca9beb2eb"). InnerVolumeSpecName "kube-api-access-8nx9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:06:57 crc kubenswrapper[4739]: I1008 22:06:57.565765 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nx9n\" (UniqueName: \"kubernetes.io/projected/88f9b47a-3b20-488b-93c6-ca8ca9beb2eb-kube-api-access-8nx9n\") on node \"crc\" DevicePath \"\"" Oct 08 22:06:57 crc kubenswrapper[4739]: I1008 22:06:57.565805 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6wb7\" (UniqueName: \"kubernetes.io/projected/742f0e7b-0ccd-4f1e-83ae-027d75053522-kube-api-access-w6wb7\") on node \"crc\" DevicePath \"\"" Oct 08 22:06:57 crc kubenswrapper[4739]: I1008 22:06:57.565818 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwrkf\" (UniqueName: \"kubernetes.io/projected/f6294838-8381-41e1-9384-0084edf1dac0-kube-api-access-kwrkf\") on node \"crc\" DevicePath \"\"" Oct 08 22:06:57 crc kubenswrapper[4739]: I1008 22:06:57.828898 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-wwm4z" Oct 08 22:06:57 crc kubenswrapper[4739]: I1008 22:06:57.832329 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1233f9c5-10c7-4244-98ec-17d6eb4c1c15" path="/var/lib/kubelet/pods/1233f9c5-10c7-4244-98ec-17d6eb4c1c15/volumes" Oct 08 22:06:57 crc kubenswrapper[4739]: I1008 22:06:57.833177 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-wwm4z" event={"ID":"742f0e7b-0ccd-4f1e-83ae-027d75053522","Type":"ContainerDied","Data":"01974dece12340bf2ddbc33b60520131aa821785eadda0d72fff2ab6225ae57d"} Oct 08 22:06:57 crc kubenswrapper[4739]: I1008 22:06:57.833206 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01974dece12340bf2ddbc33b60520131aa821785eadda0d72fff2ab6225ae57d" Oct 08 22:06:57 crc kubenswrapper[4739]: I1008 22:06:57.836725 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-lvj7x" event={"ID":"88f9b47a-3b20-488b-93c6-ca8ca9beb2eb","Type":"ContainerDied","Data":"3cc4e4588945997489dcde619540393ffd8d4aecf073ef36106cac25edefbe80"} Oct 08 22:06:57 crc kubenswrapper[4739]: I1008 22:06:57.836773 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cc4e4588945997489dcde619540393ffd8d4aecf073ef36106cac25edefbe80" Oct 08 22:06:57 crc kubenswrapper[4739]: I1008 22:06:57.836855 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-lvj7x" Oct 08 22:06:57 crc kubenswrapper[4739]: I1008 22:06:57.844229 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-x9jwq" Oct 08 22:06:57 crc kubenswrapper[4739]: I1008 22:06:57.844134 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-x9jwq" event={"ID":"f6294838-8381-41e1-9384-0084edf1dac0","Type":"ContainerDied","Data":"9fe78b05655c0a8ab562dff3dafecd4d4038853dd807d360292cda65223f7cff"} Oct 08 22:06:57 crc kubenswrapper[4739]: I1008 22:06:57.844368 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fe78b05655c0a8ab562dff3dafecd4d4038853dd807d360292cda65223f7cff" Oct 08 22:06:57 crc kubenswrapper[4739]: E1008 22:06:57.973713 4739 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod742f0e7b_0ccd_4f1e_83ae_027d75053522.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88f9b47a_3b20_488b_93c6_ca8ca9beb2eb.slice\": RecentStats: unable to find data in memory cache]" Oct 08 22:07:00 crc kubenswrapper[4739]: I1008 22:07:00.332642 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 08 22:07:01 crc kubenswrapper[4739]: I1008 22:07:01.643684 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/229f0c98-b6d6-415b-b34a-6ffcd2a0ed52-etc-swift\") pod \"swift-storage-0\" (UID: \"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52\") " pod="openstack/swift-storage-0" Oct 08 22:07:01 crc kubenswrapper[4739]: I1008 22:07:01.662247 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/229f0c98-b6d6-415b-b34a-6ffcd2a0ed52-etc-swift\") pod \"swift-storage-0\" (UID: \"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52\") " pod="openstack/swift-storage-0" Oct 08 22:07:01 crc 
kubenswrapper[4739]: I1008 22:07:01.913179 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 08 22:07:02 crc kubenswrapper[4739]: I1008 22:07:02.585551 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 08 22:07:02 crc kubenswrapper[4739]: I1008 22:07:02.900628 4739 generic.go:334] "Generic (PLEG): container finished" podID="2b6bdd10-ace2-453a-b0c9-d89051620215" containerID="019005ed70752bc75cd44c6aa4a6f36e0c8e91e8c6f8824e9bd1436af9e846d3" exitCode=0 Oct 08 22:07:02 crc kubenswrapper[4739]: I1008 22:07:02.900749 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-djtdv" event={"ID":"2b6bdd10-ace2-453a-b0c9-d89051620215","Type":"ContainerDied","Data":"019005ed70752bc75cd44c6aa4a6f36e0c8e91e8c6f8824e9bd1436af9e846d3"} Oct 08 22:07:02 crc kubenswrapper[4739]: I1008 22:07:02.902796 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52","Type":"ContainerStarted","Data":"fe88249a6fadf728aaa86f7501d53b60f61aa73b5400cc4d17abd4a45ceac70a"} Oct 08 22:07:03 crc kubenswrapper[4739]: I1008 22:07:03.145884 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-8be7-account-create-q4j26"] Oct 08 22:07:03 crc kubenswrapper[4739]: E1008 22:07:03.146240 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="742f0e7b-0ccd-4f1e-83ae-027d75053522" containerName="mariadb-database-create" Oct 08 22:07:03 crc kubenswrapper[4739]: I1008 22:07:03.146259 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="742f0e7b-0ccd-4f1e-83ae-027d75053522" containerName="mariadb-database-create" Oct 08 22:07:03 crc kubenswrapper[4739]: E1008 22:07:03.146274 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88f9b47a-3b20-488b-93c6-ca8ca9beb2eb" containerName="mariadb-database-create" Oct 08 22:07:03 crc 
kubenswrapper[4739]: I1008 22:07:03.146282 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="88f9b47a-3b20-488b-93c6-ca8ca9beb2eb" containerName="mariadb-database-create" Oct 08 22:07:03 crc kubenswrapper[4739]: E1008 22:07:03.146293 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6294838-8381-41e1-9384-0084edf1dac0" containerName="mariadb-database-create" Oct 08 22:07:03 crc kubenswrapper[4739]: I1008 22:07:03.146301 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6294838-8381-41e1-9384-0084edf1dac0" containerName="mariadb-database-create" Oct 08 22:07:03 crc kubenswrapper[4739]: E1008 22:07:03.146315 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1233f9c5-10c7-4244-98ec-17d6eb4c1c15" containerName="init" Oct 08 22:07:03 crc kubenswrapper[4739]: I1008 22:07:03.146320 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="1233f9c5-10c7-4244-98ec-17d6eb4c1c15" containerName="init" Oct 08 22:07:03 crc kubenswrapper[4739]: E1008 22:07:03.146336 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1233f9c5-10c7-4244-98ec-17d6eb4c1c15" containerName="dnsmasq-dns" Oct 08 22:07:03 crc kubenswrapper[4739]: I1008 22:07:03.146341 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="1233f9c5-10c7-4244-98ec-17d6eb4c1c15" containerName="dnsmasq-dns" Oct 08 22:07:03 crc kubenswrapper[4739]: I1008 22:07:03.146483 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="1233f9c5-10c7-4244-98ec-17d6eb4c1c15" containerName="dnsmasq-dns" Oct 08 22:07:03 crc kubenswrapper[4739]: I1008 22:07:03.146516 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="88f9b47a-3b20-488b-93c6-ca8ca9beb2eb" containerName="mariadb-database-create" Oct 08 22:07:03 crc kubenswrapper[4739]: I1008 22:07:03.146541 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6294838-8381-41e1-9384-0084edf1dac0" containerName="mariadb-database-create" Oct 08 22:07:03 crc 
kubenswrapper[4739]: I1008 22:07:03.146562 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="742f0e7b-0ccd-4f1e-83ae-027d75053522" containerName="mariadb-database-create" Oct 08 22:07:03 crc kubenswrapper[4739]: I1008 22:07:03.147100 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8be7-account-create-q4j26" Oct 08 22:07:03 crc kubenswrapper[4739]: I1008 22:07:03.150286 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 08 22:07:03 crc kubenswrapper[4739]: I1008 22:07:03.158816 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8be7-account-create-q4j26"] Oct 08 22:07:03 crc kubenswrapper[4739]: I1008 22:07:03.310702 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f77cg\" (UniqueName: \"kubernetes.io/projected/d4e2fed8-546a-4d0f-bb94-381bb1d4bd18-kube-api-access-f77cg\") pod \"glance-8be7-account-create-q4j26\" (UID: \"d4e2fed8-546a-4d0f-bb94-381bb1d4bd18\") " pod="openstack/glance-8be7-account-create-q4j26" Oct 08 22:07:03 crc kubenswrapper[4739]: I1008 22:07:03.413569 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f77cg\" (UniqueName: \"kubernetes.io/projected/d4e2fed8-546a-4d0f-bb94-381bb1d4bd18-kube-api-access-f77cg\") pod \"glance-8be7-account-create-q4j26\" (UID: \"d4e2fed8-546a-4d0f-bb94-381bb1d4bd18\") " pod="openstack/glance-8be7-account-create-q4j26" Oct 08 22:07:03 crc kubenswrapper[4739]: I1008 22:07:03.450724 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f77cg\" (UniqueName: \"kubernetes.io/projected/d4e2fed8-546a-4d0f-bb94-381bb1d4bd18-kube-api-access-f77cg\") pod \"glance-8be7-account-create-q4j26\" (UID: \"d4e2fed8-546a-4d0f-bb94-381bb1d4bd18\") " pod="openstack/glance-8be7-account-create-q4j26" Oct 08 22:07:03 crc kubenswrapper[4739]: I1008 
22:07:03.516994 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8be7-account-create-q4j26" Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.007505 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8be7-account-create-q4j26"] Oct 08 22:07:04 crc kubenswrapper[4739]: W1008 22:07:04.123366 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4e2fed8_546a_4d0f_bb94_381bb1d4bd18.slice/crio-7688ff79b4c4070d9840e679aeb620810ddd351049e87b5943724150351fc271 WatchSource:0}: Error finding container 7688ff79b4c4070d9840e679aeb620810ddd351049e87b5943724150351fc271: Status 404 returned error can't find the container with id 7688ff79b4c4070d9840e679aeb620810ddd351049e87b5943724150351fc271 Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.259599 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-djtdv" Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.432460 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2b6bdd10-ace2-453a-b0c9-d89051620215-dispersionconf\") pod \"2b6bdd10-ace2-453a-b0c9-d89051620215\" (UID: \"2b6bdd10-ace2-453a-b0c9-d89051620215\") " Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.432667 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2b6bdd10-ace2-453a-b0c9-d89051620215-etc-swift\") pod \"2b6bdd10-ace2-453a-b0c9-d89051620215\" (UID: \"2b6bdd10-ace2-453a-b0c9-d89051620215\") " Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.432730 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2b6bdd10-ace2-453a-b0c9-d89051620215-swiftconf\") pod 
\"2b6bdd10-ace2-453a-b0c9-d89051620215\" (UID: \"2b6bdd10-ace2-453a-b0c9-d89051620215\") " Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.432807 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdpbq\" (UniqueName: \"kubernetes.io/projected/2b6bdd10-ace2-453a-b0c9-d89051620215-kube-api-access-xdpbq\") pod \"2b6bdd10-ace2-453a-b0c9-d89051620215\" (UID: \"2b6bdd10-ace2-453a-b0c9-d89051620215\") " Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.432912 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6bdd10-ace2-453a-b0c9-d89051620215-combined-ca-bundle\") pod \"2b6bdd10-ace2-453a-b0c9-d89051620215\" (UID: \"2b6bdd10-ace2-453a-b0c9-d89051620215\") " Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.433030 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2b6bdd10-ace2-453a-b0c9-d89051620215-ring-data-devices\") pod \"2b6bdd10-ace2-453a-b0c9-d89051620215\" (UID: \"2b6bdd10-ace2-453a-b0c9-d89051620215\") " Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.433086 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b6bdd10-ace2-453a-b0c9-d89051620215-scripts\") pod \"2b6bdd10-ace2-453a-b0c9-d89051620215\" (UID: \"2b6bdd10-ace2-453a-b0c9-d89051620215\") " Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.433332 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b6bdd10-ace2-453a-b0c9-d89051620215-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2b6bdd10-ace2-453a-b0c9-d89051620215" (UID: "2b6bdd10-ace2-453a-b0c9-d89051620215"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.433640 4739 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2b6bdd10-ace2-453a-b0c9-d89051620215-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.434654 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b6bdd10-ace2-453a-b0c9-d89051620215-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "2b6bdd10-ace2-453a-b0c9-d89051620215" (UID: "2b6bdd10-ace2-453a-b0c9-d89051620215"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.440121 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b6bdd10-ace2-453a-b0c9-d89051620215-kube-api-access-xdpbq" (OuterVolumeSpecName: "kube-api-access-xdpbq") pod "2b6bdd10-ace2-453a-b0c9-d89051620215" (UID: "2b6bdd10-ace2-453a-b0c9-d89051620215"). InnerVolumeSpecName "kube-api-access-xdpbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.444446 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b6bdd10-ace2-453a-b0c9-d89051620215-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "2b6bdd10-ace2-453a-b0c9-d89051620215" (UID: "2b6bdd10-ace2-453a-b0c9-d89051620215"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.457303 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b6bdd10-ace2-453a-b0c9-d89051620215-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "2b6bdd10-ace2-453a-b0c9-d89051620215" (UID: "2b6bdd10-ace2-453a-b0c9-d89051620215"). 
InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.462650 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b6bdd10-ace2-453a-b0c9-d89051620215-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b6bdd10-ace2-453a-b0c9-d89051620215" (UID: "2b6bdd10-ace2-453a-b0c9-d89051620215"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.477920 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b6bdd10-ace2-453a-b0c9-d89051620215-scripts" (OuterVolumeSpecName: "scripts") pod "2b6bdd10-ace2-453a-b0c9-d89051620215" (UID: "2b6bdd10-ace2-453a-b0c9-d89051620215"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.535765 4739 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2b6bdd10-ace2-453a-b0c9-d89051620215-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.535816 4739 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b6bdd10-ace2-453a-b0c9-d89051620215-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.535835 4739 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2b6bdd10-ace2-453a-b0c9-d89051620215-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.535851 4739 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2b6bdd10-ace2-453a-b0c9-d89051620215-swiftconf\") on node \"crc\" DevicePath 
\"\"" Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.535870 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdpbq\" (UniqueName: \"kubernetes.io/projected/2b6bdd10-ace2-453a-b0c9-d89051620215-kube-api-access-xdpbq\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.535890 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6bdd10-ace2-453a-b0c9-d89051620215-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.926134 4739 generic.go:334] "Generic (PLEG): container finished" podID="d4e2fed8-546a-4d0f-bb94-381bb1d4bd18" containerID="3e85e74fdcce96ea3da43342900d7c79964611a77ff3ec1634dfe655fad9615c" exitCode=0 Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.926790 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8be7-account-create-q4j26" event={"ID":"d4e2fed8-546a-4d0f-bb94-381bb1d4bd18","Type":"ContainerDied","Data":"3e85e74fdcce96ea3da43342900d7c79964611a77ff3ec1634dfe655fad9615c"} Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.926871 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8be7-account-create-q4j26" event={"ID":"d4e2fed8-546a-4d0f-bb94-381bb1d4bd18","Type":"ContainerStarted","Data":"7688ff79b4c4070d9840e679aeb620810ddd351049e87b5943724150351fc271"} Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.930385 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-djtdv" Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.930528 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-djtdv" event={"ID":"2b6bdd10-ace2-453a-b0c9-d89051620215","Type":"ContainerDied","Data":"224e581941317e602978c0f4e8bcde5fb5ba9d4e80d06aefc06f586c310571b8"} Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.930696 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="224e581941317e602978c0f4e8bcde5fb5ba9d4e80d06aefc06f586c310571b8" Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.932177 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52","Type":"ContainerStarted","Data":"34cb90d15335f251f6a909895d6d2b33193007133186518f439db7e2acf13ded"} Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.932201 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52","Type":"ContainerStarted","Data":"915946d7175fb60bde6dcb633b1101cfde5894b720fee078c7e7e55b97f60635"} Oct 08 22:07:04 crc kubenswrapper[4739]: I1008 22:07:04.932212 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52","Type":"ContainerStarted","Data":"12621356a4b671a29b4b9bc56a4c1d98d1106a6409e8da09630b32a1e759a83a"} Oct 08 22:07:05 crc kubenswrapper[4739]: I1008 22:07:05.947994 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52","Type":"ContainerStarted","Data":"9c3f4c1d41abf1eb31e8a22b2c4178b114d916bb6101cc434e77371ad0b6935a"} Oct 08 22:07:06 crc kubenswrapper[4739]: I1008 22:07:06.313353 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-8be7-account-create-q4j26" Oct 08 22:07:06 crc kubenswrapper[4739]: I1008 22:07:06.469538 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f77cg\" (UniqueName: \"kubernetes.io/projected/d4e2fed8-546a-4d0f-bb94-381bb1d4bd18-kube-api-access-f77cg\") pod \"d4e2fed8-546a-4d0f-bb94-381bb1d4bd18\" (UID: \"d4e2fed8-546a-4d0f-bb94-381bb1d4bd18\") " Oct 08 22:07:06 crc kubenswrapper[4739]: I1008 22:07:06.478963 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4e2fed8-546a-4d0f-bb94-381bb1d4bd18-kube-api-access-f77cg" (OuterVolumeSpecName: "kube-api-access-f77cg") pod "d4e2fed8-546a-4d0f-bb94-381bb1d4bd18" (UID: "d4e2fed8-546a-4d0f-bb94-381bb1d4bd18"). InnerVolumeSpecName "kube-api-access-f77cg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:07:06 crc kubenswrapper[4739]: I1008 22:07:06.572035 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f77cg\" (UniqueName: \"kubernetes.io/projected/d4e2fed8-546a-4d0f-bb94-381bb1d4bd18-kube-api-access-f77cg\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:06 crc kubenswrapper[4739]: I1008 22:07:06.963215 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-8be7-account-create-q4j26" Oct 08 22:07:06 crc kubenswrapper[4739]: I1008 22:07:06.963204 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8be7-account-create-q4j26" event={"ID":"d4e2fed8-546a-4d0f-bb94-381bb1d4bd18","Type":"ContainerDied","Data":"7688ff79b4c4070d9840e679aeb620810ddd351049e87b5943724150351fc271"} Oct 08 22:07:06 crc kubenswrapper[4739]: I1008 22:07:06.963413 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7688ff79b4c4070d9840e679aeb620810ddd351049e87b5943724150351fc271" Oct 08 22:07:06 crc kubenswrapper[4739]: I1008 22:07:06.968134 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52","Type":"ContainerStarted","Data":"754e44aa57d4dced19fb63bd2bd72f5a45e61fc7dedfc6f208d94628e36f0344"} Oct 08 22:07:06 crc kubenswrapper[4739]: I1008 22:07:06.968205 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52","Type":"ContainerStarted","Data":"b417e87b0d29c0846bf394907396107715176b23577940d0fc62993f8c25b1c0"} Oct 08 22:07:06 crc kubenswrapper[4739]: I1008 22:07:06.968222 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52","Type":"ContainerStarted","Data":"52495300f6e12db949db1e58c318743423297501b903c72c7efa642b74c8e1f1"} Oct 08 22:07:07 crc kubenswrapper[4739]: I1008 22:07:07.982137 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52","Type":"ContainerStarted","Data":"8d1c4b2a32e4132071f7ef45fff34460ed0f8e2db1a72dfdf91236498570a903"} Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.231038 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-254hj"] Oct 08 22:07:08 crc 
kubenswrapper[4739]: E1008 22:07:08.231370 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4e2fed8-546a-4d0f-bb94-381bb1d4bd18" containerName="mariadb-account-create" Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.231386 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4e2fed8-546a-4d0f-bb94-381bb1d4bd18" containerName="mariadb-account-create" Oct 08 22:07:08 crc kubenswrapper[4739]: E1008 22:07:08.231400 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b6bdd10-ace2-453a-b0c9-d89051620215" containerName="swift-ring-rebalance" Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.231406 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6bdd10-ace2-453a-b0c9-d89051620215" containerName="swift-ring-rebalance" Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.231602 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4e2fed8-546a-4d0f-bb94-381bb1d4bd18" containerName="mariadb-account-create" Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.231615 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b6bdd10-ace2-453a-b0c9-d89051620215" containerName="swift-ring-rebalance" Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.232195 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-254hj" Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.236695 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.236805 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zhb2c" Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.253467 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-254hj"] Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.430561 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f94af6-8eda-46ff-be13-dc11b2f52790-combined-ca-bundle\") pod \"glance-db-sync-254hj\" (UID: \"b8f94af6-8eda-46ff-be13-dc11b2f52790\") " pod="openstack/glance-db-sync-254hj" Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.431278 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b8f94af6-8eda-46ff-be13-dc11b2f52790-db-sync-config-data\") pod \"glance-db-sync-254hj\" (UID: \"b8f94af6-8eda-46ff-be13-dc11b2f52790\") " pod="openstack/glance-db-sync-254hj" Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.431535 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f94af6-8eda-46ff-be13-dc11b2f52790-config-data\") pod \"glance-db-sync-254hj\" (UID: \"b8f94af6-8eda-46ff-be13-dc11b2f52790\") " pod="openstack/glance-db-sync-254hj" Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.431731 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j85fz\" (UniqueName: 
\"kubernetes.io/projected/b8f94af6-8eda-46ff-be13-dc11b2f52790-kube-api-access-j85fz\") pod \"glance-db-sync-254hj\" (UID: \"b8f94af6-8eda-46ff-be13-dc11b2f52790\") " pod="openstack/glance-db-sync-254hj" Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.536077 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f94af6-8eda-46ff-be13-dc11b2f52790-config-data\") pod \"glance-db-sync-254hj\" (UID: \"b8f94af6-8eda-46ff-be13-dc11b2f52790\") " pod="openstack/glance-db-sync-254hj" Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.536259 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j85fz\" (UniqueName: \"kubernetes.io/projected/b8f94af6-8eda-46ff-be13-dc11b2f52790-kube-api-access-j85fz\") pod \"glance-db-sync-254hj\" (UID: \"b8f94af6-8eda-46ff-be13-dc11b2f52790\") " pod="openstack/glance-db-sync-254hj" Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.536388 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f94af6-8eda-46ff-be13-dc11b2f52790-combined-ca-bundle\") pod \"glance-db-sync-254hj\" (UID: \"b8f94af6-8eda-46ff-be13-dc11b2f52790\") " pod="openstack/glance-db-sync-254hj" Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.536559 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b8f94af6-8eda-46ff-be13-dc11b2f52790-db-sync-config-data\") pod \"glance-db-sync-254hj\" (UID: \"b8f94af6-8eda-46ff-be13-dc11b2f52790\") " pod="openstack/glance-db-sync-254hj" Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.550991 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f94af6-8eda-46ff-be13-dc11b2f52790-combined-ca-bundle\") pod \"glance-db-sync-254hj\" 
(UID: \"b8f94af6-8eda-46ff-be13-dc11b2f52790\") " pod="openstack/glance-db-sync-254hj" Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.551234 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f94af6-8eda-46ff-be13-dc11b2f52790-config-data\") pod \"glance-db-sync-254hj\" (UID: \"b8f94af6-8eda-46ff-be13-dc11b2f52790\") " pod="openstack/glance-db-sync-254hj" Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.552612 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b8f94af6-8eda-46ff-be13-dc11b2f52790-db-sync-config-data\") pod \"glance-db-sync-254hj\" (UID: \"b8f94af6-8eda-46ff-be13-dc11b2f52790\") " pod="openstack/glance-db-sync-254hj" Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.561834 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j85fz\" (UniqueName: \"kubernetes.io/projected/b8f94af6-8eda-46ff-be13-dc11b2f52790-kube-api-access-j85fz\") pod \"glance-db-sync-254hj\" (UID: \"b8f94af6-8eda-46ff-be13-dc11b2f52790\") " pod="openstack/glance-db-sync-254hj" Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.636246 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-mj9gb" podUID="0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa" containerName="ovn-controller" probeResult="failure" output=< Oct 08 22:07:08 crc kubenswrapper[4739]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 08 22:07:08 crc kubenswrapper[4739]: > Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.662661 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-92kk7" Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.663975 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-92kk7" Oct 08 
22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.859175 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-254hj" Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.905244 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-mj9gb-config-sl8tr"] Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.906947 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mj9gb-config-sl8tr" Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.913887 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.916696 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mj9gb-config-sl8tr"] Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.948339 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-var-run\") pod \"ovn-controller-mj9gb-config-sl8tr\" (UID: \"c103a4a2-63a9-4c14-93da-78f82fa9fc3f\") " pod="openstack/ovn-controller-mj9gb-config-sl8tr" Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.948433 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-var-run-ovn\") pod \"ovn-controller-mj9gb-config-sl8tr\" (UID: \"c103a4a2-63a9-4c14-93da-78f82fa9fc3f\") " pod="openstack/ovn-controller-mj9gb-config-sl8tr" Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.948562 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-additional-scripts\") pod 
\"ovn-controller-mj9gb-config-sl8tr\" (UID: \"c103a4a2-63a9-4c14-93da-78f82fa9fc3f\") " pod="openstack/ovn-controller-mj9gb-config-sl8tr" Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.948691 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-scripts\") pod \"ovn-controller-mj9gb-config-sl8tr\" (UID: \"c103a4a2-63a9-4c14-93da-78f82fa9fc3f\") " pod="openstack/ovn-controller-mj9gb-config-sl8tr" Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.948731 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj6mc\" (UniqueName: \"kubernetes.io/projected/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-kube-api-access-mj6mc\") pod \"ovn-controller-mj9gb-config-sl8tr\" (UID: \"c103a4a2-63a9-4c14-93da-78f82fa9fc3f\") " pod="openstack/ovn-controller-mj9gb-config-sl8tr" Oct 08 22:07:08 crc kubenswrapper[4739]: I1008 22:07:08.948759 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-var-log-ovn\") pod \"ovn-controller-mj9gb-config-sl8tr\" (UID: \"c103a4a2-63a9-4c14-93da-78f82fa9fc3f\") " pod="openstack/ovn-controller-mj9gb-config-sl8tr" Oct 08 22:07:09 crc kubenswrapper[4739]: I1008 22:07:09.050778 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-scripts\") pod \"ovn-controller-mj9gb-config-sl8tr\" (UID: \"c103a4a2-63a9-4c14-93da-78f82fa9fc3f\") " pod="openstack/ovn-controller-mj9gb-config-sl8tr" Oct 08 22:07:09 crc kubenswrapper[4739]: I1008 22:07:09.051176 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj6mc\" (UniqueName: 
\"kubernetes.io/projected/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-kube-api-access-mj6mc\") pod \"ovn-controller-mj9gb-config-sl8tr\" (UID: \"c103a4a2-63a9-4c14-93da-78f82fa9fc3f\") " pod="openstack/ovn-controller-mj9gb-config-sl8tr" Oct 08 22:07:09 crc kubenswrapper[4739]: I1008 22:07:09.051199 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-var-log-ovn\") pod \"ovn-controller-mj9gb-config-sl8tr\" (UID: \"c103a4a2-63a9-4c14-93da-78f82fa9fc3f\") " pod="openstack/ovn-controller-mj9gb-config-sl8tr" Oct 08 22:07:09 crc kubenswrapper[4739]: I1008 22:07:09.051263 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-var-run\") pod \"ovn-controller-mj9gb-config-sl8tr\" (UID: \"c103a4a2-63a9-4c14-93da-78f82fa9fc3f\") " pod="openstack/ovn-controller-mj9gb-config-sl8tr" Oct 08 22:07:09 crc kubenswrapper[4739]: I1008 22:07:09.051305 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-var-run-ovn\") pod \"ovn-controller-mj9gb-config-sl8tr\" (UID: \"c103a4a2-63a9-4c14-93da-78f82fa9fc3f\") " pod="openstack/ovn-controller-mj9gb-config-sl8tr" Oct 08 22:07:09 crc kubenswrapper[4739]: I1008 22:07:09.051343 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-additional-scripts\") pod \"ovn-controller-mj9gb-config-sl8tr\" (UID: \"c103a4a2-63a9-4c14-93da-78f82fa9fc3f\") " pod="openstack/ovn-controller-mj9gb-config-sl8tr" Oct 08 22:07:09 crc kubenswrapper[4739]: I1008 22:07:09.052094 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-additional-scripts\") pod \"ovn-controller-mj9gb-config-sl8tr\" (UID: \"c103a4a2-63a9-4c14-93da-78f82fa9fc3f\") " pod="openstack/ovn-controller-mj9gb-config-sl8tr" Oct 08 22:07:09 crc kubenswrapper[4739]: I1008 22:07:09.052360 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-var-log-ovn\") pod \"ovn-controller-mj9gb-config-sl8tr\" (UID: \"c103a4a2-63a9-4c14-93da-78f82fa9fc3f\") " pod="openstack/ovn-controller-mj9gb-config-sl8tr" Oct 08 22:07:09 crc kubenswrapper[4739]: I1008 22:07:09.052481 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-var-run-ovn\") pod \"ovn-controller-mj9gb-config-sl8tr\" (UID: \"c103a4a2-63a9-4c14-93da-78f82fa9fc3f\") " pod="openstack/ovn-controller-mj9gb-config-sl8tr" Oct 08 22:07:09 crc kubenswrapper[4739]: I1008 22:07:09.052745 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-var-run\") pod \"ovn-controller-mj9gb-config-sl8tr\" (UID: \"c103a4a2-63a9-4c14-93da-78f82fa9fc3f\") " pod="openstack/ovn-controller-mj9gb-config-sl8tr" Oct 08 22:07:09 crc kubenswrapper[4739]: I1008 22:07:09.054109 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-scripts\") pod \"ovn-controller-mj9gb-config-sl8tr\" (UID: \"c103a4a2-63a9-4c14-93da-78f82fa9fc3f\") " pod="openstack/ovn-controller-mj9gb-config-sl8tr" Oct 08 22:07:09 crc kubenswrapper[4739]: I1008 22:07:09.083318 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj6mc\" (UniqueName: 
\"kubernetes.io/projected/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-kube-api-access-mj6mc\") pod \"ovn-controller-mj9gb-config-sl8tr\" (UID: \"c103a4a2-63a9-4c14-93da-78f82fa9fc3f\") " pod="openstack/ovn-controller-mj9gb-config-sl8tr" Oct 08 22:07:09 crc kubenswrapper[4739]: I1008 22:07:09.282958 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mj9gb-config-sl8tr" Oct 08 22:07:09 crc kubenswrapper[4739]: I1008 22:07:09.495290 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-254hj"] Oct 08 22:07:09 crc kubenswrapper[4739]: W1008 22:07:09.497835 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8f94af6_8eda_46ff_be13_dc11b2f52790.slice/crio-9888284d789833cb9f67c969b1527ffd5cdedf9801f571f80b5e75cd996c81a1 WatchSource:0}: Error finding container 9888284d789833cb9f67c969b1527ffd5cdedf9801f571f80b5e75cd996c81a1: Status 404 returned error can't find the container with id 9888284d789833cb9f67c969b1527ffd5cdedf9801f571f80b5e75cd996c81a1 Oct 08 22:07:09 crc kubenswrapper[4739]: I1008 22:07:09.815921 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mj9gb-config-sl8tr"] Oct 08 22:07:09 crc kubenswrapper[4739]: W1008 22:07:09.820354 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc103a4a2_63a9_4c14_93da_78f82fa9fc3f.slice/crio-2dbc2eaf9dd1a3f6723ce72c0295588b1641ad4b1d0c164b7517a1171e715ee2 WatchSource:0}: Error finding container 2dbc2eaf9dd1a3f6723ce72c0295588b1641ad4b1d0c164b7517a1171e715ee2: Status 404 returned error can't find the container with id 2dbc2eaf9dd1a3f6723ce72c0295588b1641ad4b1d0c164b7517a1171e715ee2 Oct 08 22:07:10 crc kubenswrapper[4739]: I1008 22:07:10.004521 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-254hj" 
event={"ID":"b8f94af6-8eda-46ff-be13-dc11b2f52790","Type":"ContainerStarted","Data":"9888284d789833cb9f67c969b1527ffd5cdedf9801f571f80b5e75cd996c81a1"} Oct 08 22:07:10 crc kubenswrapper[4739]: I1008 22:07:10.006661 4739 generic.go:334] "Generic (PLEG): container finished" podID="2909f95b-c276-43d0-93c0-18a78dbb974f" containerID="0f3d74649a0a14550547d1e1b433c90cc6ac77609fefff35fec33655f62131a6" exitCode=0 Oct 08 22:07:10 crc kubenswrapper[4739]: I1008 22:07:10.006714 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2909f95b-c276-43d0-93c0-18a78dbb974f","Type":"ContainerDied","Data":"0f3d74649a0a14550547d1e1b433c90cc6ac77609fefff35fec33655f62131a6"} Oct 08 22:07:10 crc kubenswrapper[4739]: I1008 22:07:10.016579 4739 generic.go:334] "Generic (PLEG): container finished" podID="17a6aba1-44fd-4b83-95b2-002a60e2291b" containerID="8461e62c94cfe5fa13ef8ece05f1f0dd7b1bea2f3dd5a46cf4358cc585e53964" exitCode=0 Oct 08 22:07:10 crc kubenswrapper[4739]: I1008 22:07:10.016724 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"17a6aba1-44fd-4b83-95b2-002a60e2291b","Type":"ContainerDied","Data":"8461e62c94cfe5fa13ef8ece05f1f0dd7b1bea2f3dd5a46cf4358cc585e53964"} Oct 08 22:07:10 crc kubenswrapper[4739]: I1008 22:07:10.018919 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mj9gb-config-sl8tr" event={"ID":"c103a4a2-63a9-4c14-93da-78f82fa9fc3f","Type":"ContainerStarted","Data":"2dbc2eaf9dd1a3f6723ce72c0295588b1641ad4b1d0c164b7517a1171e715ee2"} Oct 08 22:07:11 crc kubenswrapper[4739]: I1008 22:07:11.030095 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2909f95b-c276-43d0-93c0-18a78dbb974f","Type":"ContainerStarted","Data":"ed3bef55901da71ee502980a09f6088f4a2b5887918d42628f5e09c4667fbdf3"} Oct 08 22:07:11 crc kubenswrapper[4739]: I1008 22:07:11.031502 4739 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 08 22:07:11 crc kubenswrapper[4739]: I1008 22:07:11.032824 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"17a6aba1-44fd-4b83-95b2-002a60e2291b","Type":"ContainerStarted","Data":"01562c609d85e86bb08c30b79d913bf9f75b03afd4c54e6010518901b35c9807"} Oct 08 22:07:11 crc kubenswrapper[4739]: I1008 22:07:11.034590 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:07:11 crc kubenswrapper[4739]: I1008 22:07:11.045425 4739 generic.go:334] "Generic (PLEG): container finished" podID="c103a4a2-63a9-4c14-93da-78f82fa9fc3f" containerID="ab9a08a5cffaaaba12c20ad815ccf4ef630c59c414d9b5cd65dc8ce869144e59" exitCode=0 Oct 08 22:07:11 crc kubenswrapper[4739]: I1008 22:07:11.045463 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mj9gb-config-sl8tr" event={"ID":"c103a4a2-63a9-4c14-93da-78f82fa9fc3f","Type":"ContainerDied","Data":"ab9a08a5cffaaaba12c20ad815ccf4ef630c59c414d9b5cd65dc8ce869144e59"} Oct 08 22:07:11 crc kubenswrapper[4739]: I1008 22:07:11.055458 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=50.225771398 podStartE2EDuration="1m3.055441039s" podCreationTimestamp="2025-10-08 22:06:08 +0000 UTC" firstStartedPulling="2025-10-08 22:06:22.742062464 +0000 UTC m=+1082.567448224" lastFinishedPulling="2025-10-08 22:06:35.571732085 +0000 UTC m=+1095.397117865" observedRunningTime="2025-10-08 22:07:11.051705447 +0000 UTC m=+1130.877091197" watchObservedRunningTime="2025-10-08 22:07:11.055441039 +0000 UTC m=+1130.880826789" Oct 08 22:07:11 crc kubenswrapper[4739]: I1008 22:07:11.081013 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=50.329992105 podStartE2EDuration="1m3.080995598s" 
podCreationTimestamp="2025-10-08 22:06:08 +0000 UTC" firstStartedPulling="2025-10-08 22:06:22.954591578 +0000 UTC m=+1082.779977328" lastFinishedPulling="2025-10-08 22:06:35.705595031 +0000 UTC m=+1095.530980821" observedRunningTime="2025-10-08 22:07:11.080136778 +0000 UTC m=+1130.905522528" watchObservedRunningTime="2025-10-08 22:07:11.080995598 +0000 UTC m=+1130.906381338" Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.061256 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52","Type":"ContainerStarted","Data":"a3d9a8d7185a536abfdac5f19018dbe3e95d520a532fdd62ac21affd8525f318"} Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.061742 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52","Type":"ContainerStarted","Data":"f0d2b73cf43bfeaf82a5a303264efe3ae4a1bea84cbbc06b7750bd227667a638"} Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.061754 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52","Type":"ContainerStarted","Data":"a05d429409d6afe2b54866ca965b9a040d703bcd79b711aa82a61c0dc7adc95a"} Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.061862 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52","Type":"ContainerStarted","Data":"f22bc37503d113e329c5cd4fa14a779b871e42649966420c492fee5c64d3788e"} Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.455507 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-mj9gb-config-sl8tr" Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.544246 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-1291-account-create-zsqws"] Oct 08 22:07:12 crc kubenswrapper[4739]: E1008 22:07:12.544636 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c103a4a2-63a9-4c14-93da-78f82fa9fc3f" containerName="ovn-config" Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.544649 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="c103a4a2-63a9-4c14-93da-78f82fa9fc3f" containerName="ovn-config" Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.548285 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="c103a4a2-63a9-4c14-93da-78f82fa9fc3f" containerName="ovn-config" Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.548960 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1291-account-create-zsqws" Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.551956 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.558603 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1291-account-create-zsqws"] Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.618745 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-var-run-ovn\") pod \"c103a4a2-63a9-4c14-93da-78f82fa9fc3f\" (UID: \"c103a4a2-63a9-4c14-93da-78f82fa9fc3f\") " Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.618881 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-scripts\") pod \"c103a4a2-63a9-4c14-93da-78f82fa9fc3f\" (UID: 
\"c103a4a2-63a9-4c14-93da-78f82fa9fc3f\") " Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.618887 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c103a4a2-63a9-4c14-93da-78f82fa9fc3f" (UID: "c103a4a2-63a9-4c14-93da-78f82fa9fc3f"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.618992 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-var-run\") pod \"c103a4a2-63a9-4c14-93da-78f82fa9fc3f\" (UID: \"c103a4a2-63a9-4c14-93da-78f82fa9fc3f\") " Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.619024 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj6mc\" (UniqueName: \"kubernetes.io/projected/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-kube-api-access-mj6mc\") pod \"c103a4a2-63a9-4c14-93da-78f82fa9fc3f\" (UID: \"c103a4a2-63a9-4c14-93da-78f82fa9fc3f\") " Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.619088 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-additional-scripts\") pod \"c103a4a2-63a9-4c14-93da-78f82fa9fc3f\" (UID: \"c103a4a2-63a9-4c14-93da-78f82fa9fc3f\") " Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.619120 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-var-log-ovn\") pod \"c103a4a2-63a9-4c14-93da-78f82fa9fc3f\" (UID: \"c103a4a2-63a9-4c14-93da-78f82fa9fc3f\") " Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.619427 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk674\" (UniqueName: \"kubernetes.io/projected/7f84a6b5-5c1a-45a4-9b6b-5216f4fe5a6d-kube-api-access-mk674\") pod \"keystone-1291-account-create-zsqws\" (UID: \"7f84a6b5-5c1a-45a4-9b6b-5216f4fe5a6d\") " pod="openstack/keystone-1291-account-create-zsqws" Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.619550 4739 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.619606 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c103a4a2-63a9-4c14-93da-78f82fa9fc3f" (UID: "c103a4a2-63a9-4c14-93da-78f82fa9fc3f"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.619633 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-var-run" (OuterVolumeSpecName: "var-run") pod "c103a4a2-63a9-4c14-93da-78f82fa9fc3f" (UID: "c103a4a2-63a9-4c14-93da-78f82fa9fc3f"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.620074 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c103a4a2-63a9-4c14-93da-78f82fa9fc3f" (UID: "c103a4a2-63a9-4c14-93da-78f82fa9fc3f"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.620249 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-scripts" (OuterVolumeSpecName: "scripts") pod "c103a4a2-63a9-4c14-93da-78f82fa9fc3f" (UID: "c103a4a2-63a9-4c14-93da-78f82fa9fc3f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.624092 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-kube-api-access-mj6mc" (OuterVolumeSpecName: "kube-api-access-mj6mc") pod "c103a4a2-63a9-4c14-93da-78f82fa9fc3f" (UID: "c103a4a2-63a9-4c14-93da-78f82fa9fc3f"). InnerVolumeSpecName "kube-api-access-mj6mc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.720539 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk674\" (UniqueName: \"kubernetes.io/projected/7f84a6b5-5c1a-45a4-9b6b-5216f4fe5a6d-kube-api-access-mk674\") pod \"keystone-1291-account-create-zsqws\" (UID: \"7f84a6b5-5c1a-45a4-9b6b-5216f4fe5a6d\") " pod="openstack/keystone-1291-account-create-zsqws" Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.720726 4739 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.720737 4739 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-var-run\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.720746 4739 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-mj6mc\" (UniqueName: \"kubernetes.io/projected/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-kube-api-access-mj6mc\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.720756 4739 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.720764 4739 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c103a4a2-63a9-4c14-93da-78f82fa9fc3f-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.752677 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk674\" (UniqueName: \"kubernetes.io/projected/7f84a6b5-5c1a-45a4-9b6b-5216f4fe5a6d-kube-api-access-mk674\") pod \"keystone-1291-account-create-zsqws\" (UID: \"7f84a6b5-5c1a-45a4-9b6b-5216f4fe5a6d\") " pod="openstack/keystone-1291-account-create-zsqws" Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.840868 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-1026-account-create-p8z6l"] Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.842091 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1026-account-create-p8z6l" Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.845530 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.879553 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-1291-account-create-zsqws" Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.886005 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-1026-account-create-p8z6l"] Oct 08 22:07:12 crc kubenswrapper[4739]: I1008 22:07:12.927305 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk99z\" (UniqueName: \"kubernetes.io/projected/1ec4887c-3f0f-4ad2-a187-ecf79caa824f-kube-api-access-xk99z\") pod \"placement-1026-account-create-p8z6l\" (UID: \"1ec4887c-3f0f-4ad2-a187-ecf79caa824f\") " pod="openstack/placement-1026-account-create-p8z6l" Oct 08 22:07:13 crc kubenswrapper[4739]: I1008 22:07:13.029900 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk99z\" (UniqueName: \"kubernetes.io/projected/1ec4887c-3f0f-4ad2-a187-ecf79caa824f-kube-api-access-xk99z\") pod \"placement-1026-account-create-p8z6l\" (UID: \"1ec4887c-3f0f-4ad2-a187-ecf79caa824f\") " pod="openstack/placement-1026-account-create-p8z6l" Oct 08 22:07:13 crc kubenswrapper[4739]: I1008 22:07:13.105269 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk99z\" (UniqueName: \"kubernetes.io/projected/1ec4887c-3f0f-4ad2-a187-ecf79caa824f-kube-api-access-xk99z\") pod \"placement-1026-account-create-p8z6l\" (UID: \"1ec4887c-3f0f-4ad2-a187-ecf79caa824f\") " pod="openstack/placement-1026-account-create-p8z6l" Oct 08 22:07:13 crc kubenswrapper[4739]: I1008 22:07:13.153441 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52","Type":"ContainerStarted","Data":"89216a61547a184c9b96eb9915aa6def18d60391a797e0b3bffd64f05f24e0ca"} Oct 08 22:07:13 crc kubenswrapper[4739]: I1008 22:07:13.154987 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mj9gb-config-sl8tr" 
event={"ID":"c103a4a2-63a9-4c14-93da-78f82fa9fc3f","Type":"ContainerDied","Data":"2dbc2eaf9dd1a3f6723ce72c0295588b1641ad4b1d0c164b7517a1171e715ee2"} Oct 08 22:07:13 crc kubenswrapper[4739]: I1008 22:07:13.155028 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dbc2eaf9dd1a3f6723ce72c0295588b1641ad4b1d0c164b7517a1171e715ee2" Oct 08 22:07:13 crc kubenswrapper[4739]: I1008 22:07:13.155053 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mj9gb-config-sl8tr" Oct 08 22:07:13 crc kubenswrapper[4739]: I1008 22:07:13.159342 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1026-account-create-p8z6l" Oct 08 22:07:13 crc kubenswrapper[4739]: I1008 22:07:13.197343 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1291-account-create-zsqws"] Oct 08 22:07:13 crc kubenswrapper[4739]: I1008 22:07:13.561413 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-mj9gb-config-sl8tr"] Oct 08 22:07:13 crc kubenswrapper[4739]: I1008 22:07:13.565680 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-mj9gb-config-sl8tr"] Oct 08 22:07:13 crc kubenswrapper[4739]: I1008 22:07:13.606463 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-mj9gb" Oct 08 22:07:13 crc kubenswrapper[4739]: I1008 22:07:13.683580 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-1026-account-create-p8z6l"] Oct 08 22:07:13 crc kubenswrapper[4739]: W1008 22:07:13.694983 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ec4887c_3f0f_4ad2_a187_ecf79caa824f.slice/crio-59d1711f70b770b0a21020035ca607ee86279979e4f3d811c4b47562b6e28707 WatchSource:0}: Error finding container 
59d1711f70b770b0a21020035ca607ee86279979e4f3d811c4b47562b6e28707: Status 404 returned error can't find the container with id 59d1711f70b770b0a21020035ca607ee86279979e4f3d811c4b47562b6e28707 Oct 08 22:07:13 crc kubenswrapper[4739]: I1008 22:07:13.836228 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c103a4a2-63a9-4c14-93da-78f82fa9fc3f" path="/var/lib/kubelet/pods/c103a4a2-63a9-4c14-93da-78f82fa9fc3f/volumes" Oct 08 22:07:14 crc kubenswrapper[4739]: I1008 22:07:14.165794 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1026-account-create-p8z6l" event={"ID":"1ec4887c-3f0f-4ad2-a187-ecf79caa824f","Type":"ContainerStarted","Data":"59d1711f70b770b0a21020035ca607ee86279979e4f3d811c4b47562b6e28707"} Oct 08 22:07:14 crc kubenswrapper[4739]: I1008 22:07:14.169996 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1291-account-create-zsqws" event={"ID":"7f84a6b5-5c1a-45a4-9b6b-5216f4fe5a6d","Type":"ContainerStarted","Data":"7e831f61a6ade636ecafedbd0c557d257fa4b288ebdaf90b3d8ec6df2c961abb"} Oct 08 22:07:15 crc kubenswrapper[4739]: I1008 22:07:15.182569 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1291-account-create-zsqws" event={"ID":"7f84a6b5-5c1a-45a4-9b6b-5216f4fe5a6d","Type":"ContainerStarted","Data":"a79b6943eb60b6351feb24e91006cd341881c19d90644416102de4b20d764ffb"} Oct 08 22:07:15 crc kubenswrapper[4739]: I1008 22:07:15.191954 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52","Type":"ContainerStarted","Data":"3596c9c706737531727170699a31c77b23e442896da45f5acc59c6ebe803fe91"} Oct 08 22:07:15 crc kubenswrapper[4739]: I1008 22:07:15.194348 4739 generic.go:334] "Generic (PLEG): container finished" podID="1ec4887c-3f0f-4ad2-a187-ecf79caa824f" containerID="8e040d0fe1f33fac07a8d169396c70327f6a0603999eb0e13bdeb7e61e1240a6" exitCode=0 Oct 08 22:07:15 crc 
kubenswrapper[4739]: I1008 22:07:15.194409 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1026-account-create-p8z6l" event={"ID":"1ec4887c-3f0f-4ad2-a187-ecf79caa824f","Type":"ContainerDied","Data":"8e040d0fe1f33fac07a8d169396c70327f6a0603999eb0e13bdeb7e61e1240a6"} Oct 08 22:07:15 crc kubenswrapper[4739]: I1008 22:07:15.204680 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-1291-account-create-zsqws" podStartSLOduration=3.204661841 podStartE2EDuration="3.204661841s" podCreationTimestamp="2025-10-08 22:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:07:15.200433467 +0000 UTC m=+1135.025819217" watchObservedRunningTime="2025-10-08 22:07:15.204661841 +0000 UTC m=+1135.030047591" Oct 08 22:07:16 crc kubenswrapper[4739]: I1008 22:07:16.230587 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"229f0c98-b6d6-415b-b34a-6ffcd2a0ed52","Type":"ContainerStarted","Data":"2b8e3f4a57752ffb7086cd0a8a8a08d6d6ebb7d41a788ec5f1061fcb1bb1c9c2"} Oct 08 22:07:16 crc kubenswrapper[4739]: I1008 22:07:16.233910 4739 generic.go:334] "Generic (PLEG): container finished" podID="7f84a6b5-5c1a-45a4-9b6b-5216f4fe5a6d" containerID="a79b6943eb60b6351feb24e91006cd341881c19d90644416102de4b20d764ffb" exitCode=0 Oct 08 22:07:16 crc kubenswrapper[4739]: I1008 22:07:16.233976 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1291-account-create-zsqws" event={"ID":"7f84a6b5-5c1a-45a4-9b6b-5216f4fe5a6d","Type":"ContainerDied","Data":"a79b6943eb60b6351feb24e91006cd341881c19d90644416102de4b20d764ffb"} Oct 08 22:07:16 crc kubenswrapper[4739]: I1008 22:07:16.273955 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=23.790489992 podStartE2EDuration="32.273918081s" 
podCreationTimestamp="2025-10-08 22:06:44 +0000 UTC" firstStartedPulling="2025-10-08 22:07:02.589018299 +0000 UTC m=+1122.414404049" lastFinishedPulling="2025-10-08 22:07:11.072446388 +0000 UTC m=+1130.897832138" observedRunningTime="2025-10-08 22:07:16.273216833 +0000 UTC m=+1136.098602583" watchObservedRunningTime="2025-10-08 22:07:16.273918081 +0000 UTC m=+1136.099303831" Oct 08 22:07:16 crc kubenswrapper[4739]: I1008 22:07:16.535064 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-vvmw5"] Oct 08 22:07:16 crc kubenswrapper[4739]: I1008 22:07:16.536601 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" Oct 08 22:07:16 crc kubenswrapper[4739]: I1008 22:07:16.538224 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 08 22:07:16 crc kubenswrapper[4739]: I1008 22:07:16.555058 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-1026-account-create-p8z6l" Oct 08 22:07:16 crc kubenswrapper[4739]: I1008 22:07:16.557288 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-vvmw5"] Oct 08 22:07:16 crc kubenswrapper[4739]: I1008 22:07:16.606641 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xk99z\" (UniqueName: \"kubernetes.io/projected/1ec4887c-3f0f-4ad2-a187-ecf79caa824f-kube-api-access-xk99z\") pod \"1ec4887c-3f0f-4ad2-a187-ecf79caa824f\" (UID: \"1ec4887c-3f0f-4ad2-a187-ecf79caa824f\") " Oct 08 22:07:16 crc kubenswrapper[4739]: I1008 22:07:16.606803 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/841ddc2d-90ad-446e-a869-fe02ec8ce059-config\") pod \"dnsmasq-dns-77585f5f8c-vvmw5\" (UID: \"841ddc2d-90ad-446e-a869-fe02ec8ce059\") " pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" Oct 08 22:07:16 crc kubenswrapper[4739]: I1008 22:07:16.606839 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/841ddc2d-90ad-446e-a869-fe02ec8ce059-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-vvmw5\" (UID: \"841ddc2d-90ad-446e-a869-fe02ec8ce059\") " pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" Oct 08 22:07:16 crc kubenswrapper[4739]: I1008 22:07:16.606864 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/841ddc2d-90ad-446e-a869-fe02ec8ce059-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-vvmw5\" (UID: \"841ddc2d-90ad-446e-a869-fe02ec8ce059\") " pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" Oct 08 22:07:16 crc kubenswrapper[4739]: I1008 22:07:16.606897 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-jrzph\" (UniqueName: \"kubernetes.io/projected/841ddc2d-90ad-446e-a869-fe02ec8ce059-kube-api-access-jrzph\") pod \"dnsmasq-dns-77585f5f8c-vvmw5\" (UID: \"841ddc2d-90ad-446e-a869-fe02ec8ce059\") " pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" Oct 08 22:07:16 crc kubenswrapper[4739]: I1008 22:07:16.606936 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/841ddc2d-90ad-446e-a869-fe02ec8ce059-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-vvmw5\" (UID: \"841ddc2d-90ad-446e-a869-fe02ec8ce059\") " pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" Oct 08 22:07:16 crc kubenswrapper[4739]: I1008 22:07:16.606991 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/841ddc2d-90ad-446e-a869-fe02ec8ce059-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-vvmw5\" (UID: \"841ddc2d-90ad-446e-a869-fe02ec8ce059\") " pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" Oct 08 22:07:16 crc kubenswrapper[4739]: I1008 22:07:16.633839 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ec4887c-3f0f-4ad2-a187-ecf79caa824f-kube-api-access-xk99z" (OuterVolumeSpecName: "kube-api-access-xk99z") pod "1ec4887c-3f0f-4ad2-a187-ecf79caa824f" (UID: "1ec4887c-3f0f-4ad2-a187-ecf79caa824f"). InnerVolumeSpecName "kube-api-access-xk99z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:07:16 crc kubenswrapper[4739]: I1008 22:07:16.708784 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/841ddc2d-90ad-446e-a869-fe02ec8ce059-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-vvmw5\" (UID: \"841ddc2d-90ad-446e-a869-fe02ec8ce059\") " pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" Oct 08 22:07:16 crc kubenswrapper[4739]: I1008 22:07:16.708828 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/841ddc2d-90ad-446e-a869-fe02ec8ce059-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-vvmw5\" (UID: \"841ddc2d-90ad-446e-a869-fe02ec8ce059\") " pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" Oct 08 22:07:16 crc kubenswrapper[4739]: I1008 22:07:16.708862 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrzph\" (UniqueName: \"kubernetes.io/projected/841ddc2d-90ad-446e-a869-fe02ec8ce059-kube-api-access-jrzph\") pod \"dnsmasq-dns-77585f5f8c-vvmw5\" (UID: \"841ddc2d-90ad-446e-a869-fe02ec8ce059\") " pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" Oct 08 22:07:16 crc kubenswrapper[4739]: I1008 22:07:16.708905 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/841ddc2d-90ad-446e-a869-fe02ec8ce059-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-vvmw5\" (UID: \"841ddc2d-90ad-446e-a869-fe02ec8ce059\") " pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" Oct 08 22:07:16 crc kubenswrapper[4739]: I1008 22:07:16.708995 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/841ddc2d-90ad-446e-a869-fe02ec8ce059-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-vvmw5\" (UID: \"841ddc2d-90ad-446e-a869-fe02ec8ce059\") " 
pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" Oct 08 22:07:16 crc kubenswrapper[4739]: I1008 22:07:16.709023 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/841ddc2d-90ad-446e-a869-fe02ec8ce059-config\") pod \"dnsmasq-dns-77585f5f8c-vvmw5\" (UID: \"841ddc2d-90ad-446e-a869-fe02ec8ce059\") " pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" Oct 08 22:07:16 crc kubenswrapper[4739]: I1008 22:07:16.709120 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xk99z\" (UniqueName: \"kubernetes.io/projected/1ec4887c-3f0f-4ad2-a187-ecf79caa824f-kube-api-access-xk99z\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:16 crc kubenswrapper[4739]: I1008 22:07:16.709893 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/841ddc2d-90ad-446e-a869-fe02ec8ce059-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-vvmw5\" (UID: \"841ddc2d-90ad-446e-a869-fe02ec8ce059\") " pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" Oct 08 22:07:16 crc kubenswrapper[4739]: I1008 22:07:16.709922 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/841ddc2d-90ad-446e-a869-fe02ec8ce059-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-vvmw5\" (UID: \"841ddc2d-90ad-446e-a869-fe02ec8ce059\") " pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" Oct 08 22:07:16 crc kubenswrapper[4739]: I1008 22:07:16.710185 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/841ddc2d-90ad-446e-a869-fe02ec8ce059-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-vvmw5\" (UID: \"841ddc2d-90ad-446e-a869-fe02ec8ce059\") " pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" Oct 08 22:07:16 crc kubenswrapper[4739]: I1008 22:07:16.710328 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/841ddc2d-90ad-446e-a869-fe02ec8ce059-config\") pod \"dnsmasq-dns-77585f5f8c-vvmw5\" (UID: \"841ddc2d-90ad-446e-a869-fe02ec8ce059\") " pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" Oct 08 22:07:16 crc kubenswrapper[4739]: I1008 22:07:16.710540 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/841ddc2d-90ad-446e-a869-fe02ec8ce059-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-vvmw5\" (UID: \"841ddc2d-90ad-446e-a869-fe02ec8ce059\") " pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" Oct 08 22:07:16 crc kubenswrapper[4739]: I1008 22:07:16.727118 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrzph\" (UniqueName: \"kubernetes.io/projected/841ddc2d-90ad-446e-a869-fe02ec8ce059-kube-api-access-jrzph\") pod \"dnsmasq-dns-77585f5f8c-vvmw5\" (UID: \"841ddc2d-90ad-446e-a869-fe02ec8ce059\") " pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" Oct 08 22:07:16 crc kubenswrapper[4739]: I1008 22:07:16.871373 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" Oct 08 22:07:17 crc kubenswrapper[4739]: I1008 22:07:17.242123 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1026-account-create-p8z6l" event={"ID":"1ec4887c-3f0f-4ad2-a187-ecf79caa824f","Type":"ContainerDied","Data":"59d1711f70b770b0a21020035ca607ee86279979e4f3d811c4b47562b6e28707"} Oct 08 22:07:17 crc kubenswrapper[4739]: I1008 22:07:17.242223 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59d1711f70b770b0a21020035ca607ee86279979e4f3d811c4b47562b6e28707" Oct 08 22:07:17 crc kubenswrapper[4739]: I1008 22:07:17.243229 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-1026-account-create-p8z6l" Oct 08 22:07:23 crc kubenswrapper[4739]: I1008 22:07:23.451218 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1291-account-create-zsqws" Oct 08 22:07:23 crc kubenswrapper[4739]: I1008 22:07:23.546206 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mk674\" (UniqueName: \"kubernetes.io/projected/7f84a6b5-5c1a-45a4-9b6b-5216f4fe5a6d-kube-api-access-mk674\") pod \"7f84a6b5-5c1a-45a4-9b6b-5216f4fe5a6d\" (UID: \"7f84a6b5-5c1a-45a4-9b6b-5216f4fe5a6d\") " Oct 08 22:07:23 crc kubenswrapper[4739]: I1008 22:07:23.560979 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f84a6b5-5c1a-45a4-9b6b-5216f4fe5a6d-kube-api-access-mk674" (OuterVolumeSpecName: "kube-api-access-mk674") pod "7f84a6b5-5c1a-45a4-9b6b-5216f4fe5a6d" (UID: "7f84a6b5-5c1a-45a4-9b6b-5216f4fe5a6d"). InnerVolumeSpecName "kube-api-access-mk674". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:07:23 crc kubenswrapper[4739]: I1008 22:07:23.649602 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mk674\" (UniqueName: \"kubernetes.io/projected/7f84a6b5-5c1a-45a4-9b6b-5216f4fe5a6d-kube-api-access-mk674\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:23 crc kubenswrapper[4739]: I1008 22:07:23.907734 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-vvmw5"] Oct 08 22:07:23 crc kubenswrapper[4739]: W1008 22:07:23.915772 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod841ddc2d_90ad_446e_a869_fe02ec8ce059.slice/crio-002dee23d61a98ad749345d4d1997757afb7a079de688bd7529ca63ad40581cc WatchSource:0}: Error finding container 002dee23d61a98ad749345d4d1997757afb7a079de688bd7529ca63ad40581cc: Status 404 returned error can't find the container with id 002dee23d61a98ad749345d4d1997757afb7a079de688bd7529ca63ad40581cc Oct 08 22:07:24 crc kubenswrapper[4739]: I1008 22:07:24.341724 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-254hj" event={"ID":"b8f94af6-8eda-46ff-be13-dc11b2f52790","Type":"ContainerStarted","Data":"4fab8ceeef55c6fea097a95f4f116c2edba112b3163e825235f450e332f97f26"} Oct 08 22:07:24 crc kubenswrapper[4739]: I1008 22:07:24.344372 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-1291-account-create-zsqws" Oct 08 22:07:24 crc kubenswrapper[4739]: I1008 22:07:24.344462 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1291-account-create-zsqws" event={"ID":"7f84a6b5-5c1a-45a4-9b6b-5216f4fe5a6d","Type":"ContainerDied","Data":"7e831f61a6ade636ecafedbd0c557d257fa4b288ebdaf90b3d8ec6df2c961abb"} Oct 08 22:07:24 crc kubenswrapper[4739]: I1008 22:07:24.344560 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e831f61a6ade636ecafedbd0c557d257fa4b288ebdaf90b3d8ec6df2c961abb" Oct 08 22:07:24 crc kubenswrapper[4739]: I1008 22:07:24.348265 4739 generic.go:334] "Generic (PLEG): container finished" podID="841ddc2d-90ad-446e-a869-fe02ec8ce059" containerID="8905234b68165c745cb7a6cab6039f7ab0e8e784b6cd5887243944a9c991a92b" exitCode=0 Oct 08 22:07:24 crc kubenswrapper[4739]: I1008 22:07:24.348316 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" event={"ID":"841ddc2d-90ad-446e-a869-fe02ec8ce059","Type":"ContainerDied","Data":"8905234b68165c745cb7a6cab6039f7ab0e8e784b6cd5887243944a9c991a92b"} Oct 08 22:07:24 crc kubenswrapper[4739]: I1008 22:07:24.348336 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" event={"ID":"841ddc2d-90ad-446e-a869-fe02ec8ce059","Type":"ContainerStarted","Data":"002dee23d61a98ad749345d4d1997757afb7a079de688bd7529ca63ad40581cc"} Oct 08 22:07:24 crc kubenswrapper[4739]: I1008 22:07:24.384711 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-254hj" podStartSLOduration=2.332117568 podStartE2EDuration="16.384684154s" podCreationTimestamp="2025-10-08 22:07:08 +0000 UTC" firstStartedPulling="2025-10-08 22:07:09.502595281 +0000 UTC m=+1129.327981041" lastFinishedPulling="2025-10-08 22:07:23.555161877 +0000 UTC m=+1143.380547627" observedRunningTime="2025-10-08 22:07:24.372803751 
+0000 UTC m=+1144.198189511" watchObservedRunningTime="2025-10-08 22:07:24.384684154 +0000 UTC m=+1144.210069944" Oct 08 22:07:25 crc kubenswrapper[4739]: I1008 22:07:25.362757 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" event={"ID":"841ddc2d-90ad-446e-a869-fe02ec8ce059","Type":"ContainerStarted","Data":"325aa857c00b6f969fa4f65fed517b578e88c8f1204019a345845ffb1c7fd252"} Oct 08 22:07:25 crc kubenswrapper[4739]: I1008 22:07:25.393574 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" podStartSLOduration=9.393551356 podStartE2EDuration="9.393551356s" podCreationTimestamp="2025-10-08 22:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:07:25.387273822 +0000 UTC m=+1145.212659602" watchObservedRunningTime="2025-10-08 22:07:25.393551356 +0000 UTC m=+1145.218937116" Oct 08 22:07:26 crc kubenswrapper[4739]: I1008 22:07:26.375366 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" Oct 08 22:07:29 crc kubenswrapper[4739]: I1008 22:07:29.725527 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 08 22:07:29 crc kubenswrapper[4739]: I1008 22:07:29.789375 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.135182 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-h86jm"] Oct 08 22:07:30 crc kubenswrapper[4739]: E1008 22:07:30.135750 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f84a6b5-5c1a-45a4-9b6b-5216f4fe5a6d" containerName="mariadb-account-create" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.135766 4739 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="7f84a6b5-5c1a-45a4-9b6b-5216f4fe5a6d" containerName="mariadb-account-create" Oct 08 22:07:30 crc kubenswrapper[4739]: E1008 22:07:30.135799 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ec4887c-3f0f-4ad2-a187-ecf79caa824f" containerName="mariadb-account-create" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.135808 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ec4887c-3f0f-4ad2-a187-ecf79caa824f" containerName="mariadb-account-create" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.135981 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f84a6b5-5c1a-45a4-9b6b-5216f4fe5a6d" containerName="mariadb-account-create" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.135997 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ec4887c-3f0f-4ad2-a187-ecf79caa824f" containerName="mariadb-account-create" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.136525 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h86jm" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.155268 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-h86jm"] Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.246081 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-t45jh"] Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.247446 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-t45jh" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.272439 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-t45jh"] Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.288040 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v97qs\" (UniqueName: \"kubernetes.io/projected/25582fa4-e567-498f-ae5c-b05ebb260645-kube-api-access-v97qs\") pod \"cinder-db-create-h86jm\" (UID: \"25582fa4-e567-498f-ae5c-b05ebb260645\") " pod="openstack/cinder-db-create-h86jm" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.389127 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v97qs\" (UniqueName: \"kubernetes.io/projected/25582fa4-e567-498f-ae5c-b05ebb260645-kube-api-access-v97qs\") pod \"cinder-db-create-h86jm\" (UID: \"25582fa4-e567-498f-ae5c-b05ebb260645\") " pod="openstack/cinder-db-create-h86jm" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.389238 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65bhk\" (UniqueName: \"kubernetes.io/projected/06663718-278d-4ac8-b6a5-9a6141dc0f78-kube-api-access-65bhk\") pod \"barbican-db-create-t45jh\" (UID: \"06663718-278d-4ac8-b6a5-9a6141dc0f78\") " pod="openstack/barbican-db-create-t45jh" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.417665 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-xnxrs"] Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.419005 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-xnxrs" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.423830 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-q58bw" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.424118 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.424318 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.424484 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.427940 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v97qs\" (UniqueName: \"kubernetes.io/projected/25582fa4-e567-498f-ae5c-b05ebb260645-kube-api-access-v97qs\") pod \"cinder-db-create-h86jm\" (UID: \"25582fa4-e567-498f-ae5c-b05ebb260645\") " pod="openstack/cinder-db-create-h86jm" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.441498 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-lsg8k"] Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.442666 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-lsg8k" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.454121 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-h86jm" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.490984 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65bhk\" (UniqueName: \"kubernetes.io/projected/06663718-278d-4ac8-b6a5-9a6141dc0f78-kube-api-access-65bhk\") pod \"barbican-db-create-t45jh\" (UID: \"06663718-278d-4ac8-b6a5-9a6141dc0f78\") " pod="openstack/barbican-db-create-t45jh" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.519729 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-xnxrs"] Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.535474 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65bhk\" (UniqueName: \"kubernetes.io/projected/06663718-278d-4ac8-b6a5-9a6141dc0f78-kube-api-access-65bhk\") pod \"barbican-db-create-t45jh\" (UID: \"06663718-278d-4ac8-b6a5-9a6141dc0f78\") " pod="openstack/barbican-db-create-t45jh" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.575314 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-t45jh" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.580903 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-lsg8k"] Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.593903 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30b37b73-be70-445c-9515-c3c467ce9bf9-combined-ca-bundle\") pod \"keystone-db-sync-xnxrs\" (UID: \"30b37b73-be70-445c-9515-c3c467ce9bf9\") " pod="openstack/keystone-db-sync-xnxrs" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.593992 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rphnx\" (UniqueName: \"kubernetes.io/projected/33840e62-c56d-41a7-918d-377ce6e86ffe-kube-api-access-rphnx\") pod \"neutron-db-create-lsg8k\" (UID: \"33840e62-c56d-41a7-918d-377ce6e86ffe\") " pod="openstack/neutron-db-create-lsg8k" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.594033 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7msp\" (UniqueName: \"kubernetes.io/projected/30b37b73-be70-445c-9515-c3c467ce9bf9-kube-api-access-z7msp\") pod \"keystone-db-sync-xnxrs\" (UID: \"30b37b73-be70-445c-9515-c3c467ce9bf9\") " pod="openstack/keystone-db-sync-xnxrs" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.594072 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30b37b73-be70-445c-9515-c3c467ce9bf9-config-data\") pod \"keystone-db-sync-xnxrs\" (UID: \"30b37b73-be70-445c-9515-c3c467ce9bf9\") " pod="openstack/keystone-db-sync-xnxrs" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.696601 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-z7msp\" (UniqueName: \"kubernetes.io/projected/30b37b73-be70-445c-9515-c3c467ce9bf9-kube-api-access-z7msp\") pod \"keystone-db-sync-xnxrs\" (UID: \"30b37b73-be70-445c-9515-c3c467ce9bf9\") " pod="openstack/keystone-db-sync-xnxrs" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.696682 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30b37b73-be70-445c-9515-c3c467ce9bf9-config-data\") pod \"keystone-db-sync-xnxrs\" (UID: \"30b37b73-be70-445c-9515-c3c467ce9bf9\") " pod="openstack/keystone-db-sync-xnxrs" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.696762 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30b37b73-be70-445c-9515-c3c467ce9bf9-combined-ca-bundle\") pod \"keystone-db-sync-xnxrs\" (UID: \"30b37b73-be70-445c-9515-c3c467ce9bf9\") " pod="openstack/keystone-db-sync-xnxrs" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.696804 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rphnx\" (UniqueName: \"kubernetes.io/projected/33840e62-c56d-41a7-918d-377ce6e86ffe-kube-api-access-rphnx\") pod \"neutron-db-create-lsg8k\" (UID: \"33840e62-c56d-41a7-918d-377ce6e86ffe\") " pod="openstack/neutron-db-create-lsg8k" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.704494 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30b37b73-be70-445c-9515-c3c467ce9bf9-combined-ca-bundle\") pod \"keystone-db-sync-xnxrs\" (UID: \"30b37b73-be70-445c-9515-c3c467ce9bf9\") " pod="openstack/keystone-db-sync-xnxrs" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.711069 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/30b37b73-be70-445c-9515-c3c467ce9bf9-config-data\") pod \"keystone-db-sync-xnxrs\" (UID: \"30b37b73-be70-445c-9515-c3c467ce9bf9\") " pod="openstack/keystone-db-sync-xnxrs" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.722231 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7msp\" (UniqueName: \"kubernetes.io/projected/30b37b73-be70-445c-9515-c3c467ce9bf9-kube-api-access-z7msp\") pod \"keystone-db-sync-xnxrs\" (UID: \"30b37b73-be70-445c-9515-c3c467ce9bf9\") " pod="openstack/keystone-db-sync-xnxrs" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.732198 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rphnx\" (UniqueName: \"kubernetes.io/projected/33840e62-c56d-41a7-918d-377ce6e86ffe-kube-api-access-rphnx\") pod \"neutron-db-create-lsg8k\" (UID: \"33840e62-c56d-41a7-918d-377ce6e86ffe\") " pod="openstack/neutron-db-create-lsg8k" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.783349 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-xnxrs" Oct 08 22:07:30 crc kubenswrapper[4739]: I1008 22:07:30.792612 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-lsg8k" Oct 08 22:07:31 crc kubenswrapper[4739]: I1008 22:07:31.008600 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-h86jm"] Oct 08 22:07:31 crc kubenswrapper[4739]: I1008 22:07:31.055397 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-xnxrs"] Oct 08 22:07:31 crc kubenswrapper[4739]: W1008 22:07:31.060637 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30b37b73_be70_445c_9515_c3c467ce9bf9.slice/crio-3edd69331199b54f6ae15d3498a557633b1fc61176486640a1b3c9ba8578e6f1 WatchSource:0}: Error finding container 3edd69331199b54f6ae15d3498a557633b1fc61176486640a1b3c9ba8578e6f1: Status 404 returned error can't find the container with id 3edd69331199b54f6ae15d3498a557633b1fc61176486640a1b3c9ba8578e6f1 Oct 08 22:07:31 crc kubenswrapper[4739]: I1008 22:07:31.093752 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-t45jh"] Oct 08 22:07:31 crc kubenswrapper[4739]: I1008 22:07:31.333421 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-lsg8k"] Oct 08 22:07:31 crc kubenswrapper[4739]: I1008 22:07:31.434232 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-lsg8k" event={"ID":"33840e62-c56d-41a7-918d-377ce6e86ffe","Type":"ContainerStarted","Data":"1e0cfab6ce391bc6c680bad96491fd733afd0b6cab904848f81b58f6de591715"} Oct 08 22:07:31 crc kubenswrapper[4739]: I1008 22:07:31.438987 4739 generic.go:334] "Generic (PLEG): container finished" podID="06663718-278d-4ac8-b6a5-9a6141dc0f78" containerID="06fea3b454341d8f5561cabc5f21fff6dacdb61ed4c3f7b9dd60fb90eb29fc0b" exitCode=0 Oct 08 22:07:31 crc kubenswrapper[4739]: I1008 22:07:31.439090 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-t45jh" 
event={"ID":"06663718-278d-4ac8-b6a5-9a6141dc0f78","Type":"ContainerDied","Data":"06fea3b454341d8f5561cabc5f21fff6dacdb61ed4c3f7b9dd60fb90eb29fc0b"} Oct 08 22:07:31 crc kubenswrapper[4739]: I1008 22:07:31.439138 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-t45jh" event={"ID":"06663718-278d-4ac8-b6a5-9a6141dc0f78","Type":"ContainerStarted","Data":"5066173ef4046972575cb7bc953e5fa457df186dc3d321081cdd3d295745c572"} Oct 08 22:07:31 crc kubenswrapper[4739]: I1008 22:07:31.447820 4739 generic.go:334] "Generic (PLEG): container finished" podID="25582fa4-e567-498f-ae5c-b05ebb260645" containerID="61bb7913c178e8bcb7aaefaeb4c1d09cb5216243f9c2fd38a870b3d36c7b4076" exitCode=0 Oct 08 22:07:31 crc kubenswrapper[4739]: I1008 22:07:31.448238 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h86jm" event={"ID":"25582fa4-e567-498f-ae5c-b05ebb260645","Type":"ContainerDied","Data":"61bb7913c178e8bcb7aaefaeb4c1d09cb5216243f9c2fd38a870b3d36c7b4076"} Oct 08 22:07:31 crc kubenswrapper[4739]: I1008 22:07:31.448262 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h86jm" event={"ID":"25582fa4-e567-498f-ae5c-b05ebb260645","Type":"ContainerStarted","Data":"1d27eff0f1b6a54ec438046024dfc9ead8ebc7db7841975a451a1d014279288e"} Oct 08 22:07:31 crc kubenswrapper[4739]: I1008 22:07:31.452307 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xnxrs" event={"ID":"30b37b73-be70-445c-9515-c3c467ce9bf9","Type":"ContainerStarted","Data":"3edd69331199b54f6ae15d3498a557633b1fc61176486640a1b3c9ba8578e6f1"} Oct 08 22:07:31 crc kubenswrapper[4739]: I1008 22:07:31.872355 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" Oct 08 22:07:31 crc kubenswrapper[4739]: I1008 22:07:31.963289 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-mbp55"] Oct 08 
22:07:31 crc kubenswrapper[4739]: I1008 22:07:31.963528 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-mbp55" podUID="ad5158ad-8718-400f-9d89-a28f901a953f" containerName="dnsmasq-dns" containerID="cri-o://e840c7c629d74db15604a004ab22b819d662c8214fe1ff5ed314c53bb20c9867" gracePeriod=10 Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.459069 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-mbp55" Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.463576 4739 generic.go:334] "Generic (PLEG): container finished" podID="b8f94af6-8eda-46ff-be13-dc11b2f52790" containerID="4fab8ceeef55c6fea097a95f4f116c2edba112b3163e825235f450e332f97f26" exitCode=0 Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.463642 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-254hj" event={"ID":"b8f94af6-8eda-46ff-be13-dc11b2f52790","Type":"ContainerDied","Data":"4fab8ceeef55c6fea097a95f4f116c2edba112b3163e825235f450e332f97f26"} Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.472248 4739 generic.go:334] "Generic (PLEG): container finished" podID="33840e62-c56d-41a7-918d-377ce6e86ffe" containerID="66b4a62c9c64a2bc503f2dbb75bbcf99a260bd0f616fd1709f2984c414b52c6c" exitCode=0 Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.472400 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-lsg8k" event={"ID":"33840e62-c56d-41a7-918d-377ce6e86ffe","Type":"ContainerDied","Data":"66b4a62c9c64a2bc503f2dbb75bbcf99a260bd0f616fd1709f2984c414b52c6c"} Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.478104 4739 generic.go:334] "Generic (PLEG): container finished" podID="ad5158ad-8718-400f-9d89-a28f901a953f" containerID="e840c7c629d74db15604a004ab22b819d662c8214fe1ff5ed314c53bb20c9867" exitCode=0 Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.478421 4739 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-mbp55" event={"ID":"ad5158ad-8718-400f-9d89-a28f901a953f","Type":"ContainerDied","Data":"e840c7c629d74db15604a004ab22b819d662c8214fe1ff5ed314c53bb20c9867"} Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.478523 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-mbp55" event={"ID":"ad5158ad-8718-400f-9d89-a28f901a953f","Type":"ContainerDied","Data":"3cdb329208f223f940d2731ad98967539eeef1e85f9f9e613b7da9637ada0369"} Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.478557 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-mbp55" Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.478574 4739 scope.go:117] "RemoveContainer" containerID="e840c7c629d74db15604a004ab22b819d662c8214fe1ff5ed314c53bb20c9867" Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.591157 4739 scope.go:117] "RemoveContainer" containerID="b31876e524354f05a13e08ef5259abfe85908bb2e843d0e15c951f631624fb71" Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.634556 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad5158ad-8718-400f-9d89-a28f901a953f-config\") pod \"ad5158ad-8718-400f-9d89-a28f901a953f\" (UID: \"ad5158ad-8718-400f-9d89-a28f901a953f\") " Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.634616 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad5158ad-8718-400f-9d89-a28f901a953f-dns-svc\") pod \"ad5158ad-8718-400f-9d89-a28f901a953f\" (UID: \"ad5158ad-8718-400f-9d89-a28f901a953f\") " Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.634660 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6snc\" (UniqueName: 
\"kubernetes.io/projected/ad5158ad-8718-400f-9d89-a28f901a953f-kube-api-access-v6snc\") pod \"ad5158ad-8718-400f-9d89-a28f901a953f\" (UID: \"ad5158ad-8718-400f-9d89-a28f901a953f\") " Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.634725 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad5158ad-8718-400f-9d89-a28f901a953f-ovsdbserver-sb\") pod \"ad5158ad-8718-400f-9d89-a28f901a953f\" (UID: \"ad5158ad-8718-400f-9d89-a28f901a953f\") " Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.634836 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad5158ad-8718-400f-9d89-a28f901a953f-ovsdbserver-nb\") pod \"ad5158ad-8718-400f-9d89-a28f901a953f\" (UID: \"ad5158ad-8718-400f-9d89-a28f901a953f\") " Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.638392 4739 scope.go:117] "RemoveContainer" containerID="e840c7c629d74db15604a004ab22b819d662c8214fe1ff5ed314c53bb20c9867" Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.641086 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad5158ad-8718-400f-9d89-a28f901a953f-kube-api-access-v6snc" (OuterVolumeSpecName: "kube-api-access-v6snc") pod "ad5158ad-8718-400f-9d89-a28f901a953f" (UID: "ad5158ad-8718-400f-9d89-a28f901a953f"). InnerVolumeSpecName "kube-api-access-v6snc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:07:33 crc kubenswrapper[4739]: E1008 22:07:32.652442 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e840c7c629d74db15604a004ab22b819d662c8214fe1ff5ed314c53bb20c9867\": container with ID starting with e840c7c629d74db15604a004ab22b819d662c8214fe1ff5ed314c53bb20c9867 not found: ID does not exist" containerID="e840c7c629d74db15604a004ab22b819d662c8214fe1ff5ed314c53bb20c9867" Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.652474 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e840c7c629d74db15604a004ab22b819d662c8214fe1ff5ed314c53bb20c9867"} err="failed to get container status \"e840c7c629d74db15604a004ab22b819d662c8214fe1ff5ed314c53bb20c9867\": rpc error: code = NotFound desc = could not find container \"e840c7c629d74db15604a004ab22b819d662c8214fe1ff5ed314c53bb20c9867\": container with ID starting with e840c7c629d74db15604a004ab22b819d662c8214fe1ff5ed314c53bb20c9867 not found: ID does not exist" Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.652504 4739 scope.go:117] "RemoveContainer" containerID="b31876e524354f05a13e08ef5259abfe85908bb2e843d0e15c951f631624fb71" Oct 08 22:07:33 crc kubenswrapper[4739]: E1008 22:07:32.659714 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b31876e524354f05a13e08ef5259abfe85908bb2e843d0e15c951f631624fb71\": container with ID starting with b31876e524354f05a13e08ef5259abfe85908bb2e843d0e15c951f631624fb71 not found: ID does not exist" containerID="b31876e524354f05a13e08ef5259abfe85908bb2e843d0e15c951f631624fb71" Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.659749 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b31876e524354f05a13e08ef5259abfe85908bb2e843d0e15c951f631624fb71"} 
err="failed to get container status \"b31876e524354f05a13e08ef5259abfe85908bb2e843d0e15c951f631624fb71\": rpc error: code = NotFound desc = could not find container \"b31876e524354f05a13e08ef5259abfe85908bb2e843d0e15c951f631624fb71\": container with ID starting with b31876e524354f05a13e08ef5259abfe85908bb2e843d0e15c951f631624fb71 not found: ID does not exist" Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.694636 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad5158ad-8718-400f-9d89-a28f901a953f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ad5158ad-8718-400f-9d89-a28f901a953f" (UID: "ad5158ad-8718-400f-9d89-a28f901a953f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.696870 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad5158ad-8718-400f-9d89-a28f901a953f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ad5158ad-8718-400f-9d89-a28f901a953f" (UID: "ad5158ad-8718-400f-9d89-a28f901a953f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.697240 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad5158ad-8718-400f-9d89-a28f901a953f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ad5158ad-8718-400f-9d89-a28f901a953f" (UID: "ad5158ad-8718-400f-9d89-a28f901a953f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.718104 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad5158ad-8718-400f-9d89-a28f901a953f-config" (OuterVolumeSpecName: "config") pod "ad5158ad-8718-400f-9d89-a28f901a953f" (UID: "ad5158ad-8718-400f-9d89-a28f901a953f"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.736289 4739 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad5158ad-8718-400f-9d89-a28f901a953f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.736311 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad5158ad-8718-400f-9d89-a28f901a953f-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.736320 4739 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad5158ad-8718-400f-9d89-a28f901a953f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.736330 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6snc\" (UniqueName: \"kubernetes.io/projected/ad5158ad-8718-400f-9d89-a28f901a953f-kube-api-access-v6snc\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.736341 4739 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad5158ad-8718-400f-9d89-a28f901a953f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.831983 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-mbp55"] Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.842628 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-mbp55"] Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:32.909251 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-t45jh" Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:33.042270 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65bhk\" (UniqueName: \"kubernetes.io/projected/06663718-278d-4ac8-b6a5-9a6141dc0f78-kube-api-access-65bhk\") pod \"06663718-278d-4ac8-b6a5-9a6141dc0f78\" (UID: \"06663718-278d-4ac8-b6a5-9a6141dc0f78\") " Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:33.048688 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06663718-278d-4ac8-b6a5-9a6141dc0f78-kube-api-access-65bhk" (OuterVolumeSpecName: "kube-api-access-65bhk") pod "06663718-278d-4ac8-b6a5-9a6141dc0f78" (UID: "06663718-278d-4ac8-b6a5-9a6141dc0f78"). InnerVolumeSpecName "kube-api-access-65bhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:33.144374 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65bhk\" (UniqueName: \"kubernetes.io/projected/06663718-278d-4ac8-b6a5-9a6141dc0f78-kube-api-access-65bhk\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:33.225574 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-h86jm" Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:33.355446 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v97qs\" (UniqueName: \"kubernetes.io/projected/25582fa4-e567-498f-ae5c-b05ebb260645-kube-api-access-v97qs\") pod \"25582fa4-e567-498f-ae5c-b05ebb260645\" (UID: \"25582fa4-e567-498f-ae5c-b05ebb260645\") " Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:33.361183 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25582fa4-e567-498f-ae5c-b05ebb260645-kube-api-access-v97qs" (OuterVolumeSpecName: "kube-api-access-v97qs") pod "25582fa4-e567-498f-ae5c-b05ebb260645" (UID: "25582fa4-e567-498f-ae5c-b05ebb260645"). InnerVolumeSpecName "kube-api-access-v97qs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:33.458034 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v97qs\" (UniqueName: \"kubernetes.io/projected/25582fa4-e567-498f-ae5c-b05ebb260645-kube-api-access-v97qs\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:33.489540 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-t45jh" event={"ID":"06663718-278d-4ac8-b6a5-9a6141dc0f78","Type":"ContainerDied","Data":"5066173ef4046972575cb7bc953e5fa457df186dc3d321081cdd3d295745c572"} Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:33.489586 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5066173ef4046972575cb7bc953e5fa457df186dc3d321081cdd3d295745c572" Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:33.489601 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-t45jh" Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:33.491586 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h86jm" event={"ID":"25582fa4-e567-498f-ae5c-b05ebb260645","Type":"ContainerDied","Data":"1d27eff0f1b6a54ec438046024dfc9ead8ebc7db7841975a451a1d014279288e"} Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:33.491677 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d27eff0f1b6a54ec438046024dfc9ead8ebc7db7841975a451a1d014279288e" Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:33.491750 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h86jm" Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:33.839105 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad5158ad-8718-400f-9d89-a28f901a953f" path="/var/lib/kubelet/pods/ad5158ad-8718-400f-9d89-a28f901a953f/volumes" Oct 08 22:07:33 crc kubenswrapper[4739]: I1008 22:07:33.907786 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-lsg8k" Oct 08 22:07:34 crc kubenswrapper[4739]: I1008 22:07:34.111346 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rphnx\" (UniqueName: \"kubernetes.io/projected/33840e62-c56d-41a7-918d-377ce6e86ffe-kube-api-access-rphnx\") pod \"33840e62-c56d-41a7-918d-377ce6e86ffe\" (UID: \"33840e62-c56d-41a7-918d-377ce6e86ffe\") " Oct 08 22:07:34 crc kubenswrapper[4739]: I1008 22:07:34.114910 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33840e62-c56d-41a7-918d-377ce6e86ffe-kube-api-access-rphnx" (OuterVolumeSpecName: "kube-api-access-rphnx") pod "33840e62-c56d-41a7-918d-377ce6e86ffe" (UID: "33840e62-c56d-41a7-918d-377ce6e86ffe"). InnerVolumeSpecName "kube-api-access-rphnx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:07:34 crc kubenswrapper[4739]: I1008 22:07:34.214727 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rphnx\" (UniqueName: \"kubernetes.io/projected/33840e62-c56d-41a7-918d-377ce6e86ffe-kube-api-access-rphnx\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:34 crc kubenswrapper[4739]: I1008 22:07:34.504081 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-lsg8k" Oct 08 22:07:34 crc kubenswrapper[4739]: I1008 22:07:34.504108 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-lsg8k" event={"ID":"33840e62-c56d-41a7-918d-377ce6e86ffe","Type":"ContainerDied","Data":"1e0cfab6ce391bc6c680bad96491fd733afd0b6cab904848f81b58f6de591715"} Oct 08 22:07:34 crc kubenswrapper[4739]: I1008 22:07:34.504226 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e0cfab6ce391bc6c680bad96491fd733afd0b6cab904848f81b58f6de591715" Oct 08 22:07:37 crc kubenswrapper[4739]: I1008 22:07:37.211670 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-254hj" Oct 08 22:07:37 crc kubenswrapper[4739]: I1008 22:07:37.368525 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b8f94af6-8eda-46ff-be13-dc11b2f52790-db-sync-config-data\") pod \"b8f94af6-8eda-46ff-be13-dc11b2f52790\" (UID: \"b8f94af6-8eda-46ff-be13-dc11b2f52790\") " Oct 08 22:07:37 crc kubenswrapper[4739]: I1008 22:07:37.370863 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j85fz\" (UniqueName: \"kubernetes.io/projected/b8f94af6-8eda-46ff-be13-dc11b2f52790-kube-api-access-j85fz\") pod \"b8f94af6-8eda-46ff-be13-dc11b2f52790\" (UID: \"b8f94af6-8eda-46ff-be13-dc11b2f52790\") " Oct 08 22:07:37 crc kubenswrapper[4739]: I1008 22:07:37.370916 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f94af6-8eda-46ff-be13-dc11b2f52790-combined-ca-bundle\") pod \"b8f94af6-8eda-46ff-be13-dc11b2f52790\" (UID: \"b8f94af6-8eda-46ff-be13-dc11b2f52790\") " Oct 08 22:07:37 crc kubenswrapper[4739]: I1008 22:07:37.370979 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f94af6-8eda-46ff-be13-dc11b2f52790-config-data\") pod \"b8f94af6-8eda-46ff-be13-dc11b2f52790\" (UID: \"b8f94af6-8eda-46ff-be13-dc11b2f52790\") " Oct 08 22:07:37 crc kubenswrapper[4739]: I1008 22:07:37.372822 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8f94af6-8eda-46ff-be13-dc11b2f52790-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b8f94af6-8eda-46ff-be13-dc11b2f52790" (UID: "b8f94af6-8eda-46ff-be13-dc11b2f52790"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:07:37 crc kubenswrapper[4739]: I1008 22:07:37.376535 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8f94af6-8eda-46ff-be13-dc11b2f52790-kube-api-access-j85fz" (OuterVolumeSpecName: "kube-api-access-j85fz") pod "b8f94af6-8eda-46ff-be13-dc11b2f52790" (UID: "b8f94af6-8eda-46ff-be13-dc11b2f52790"). InnerVolumeSpecName "kube-api-access-j85fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:07:37 crc kubenswrapper[4739]: I1008 22:07:37.399294 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8f94af6-8eda-46ff-be13-dc11b2f52790-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8f94af6-8eda-46ff-be13-dc11b2f52790" (UID: "b8f94af6-8eda-46ff-be13-dc11b2f52790"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:07:37 crc kubenswrapper[4739]: I1008 22:07:37.427782 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8f94af6-8eda-46ff-be13-dc11b2f52790-config-data" (OuterVolumeSpecName: "config-data") pod "b8f94af6-8eda-46ff-be13-dc11b2f52790" (UID: "b8f94af6-8eda-46ff-be13-dc11b2f52790"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:07:37 crc kubenswrapper[4739]: I1008 22:07:37.473249 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f94af6-8eda-46ff-be13-dc11b2f52790-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:37 crc kubenswrapper[4739]: I1008 22:07:37.473292 4739 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b8f94af6-8eda-46ff-be13-dc11b2f52790-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:37 crc kubenswrapper[4739]: I1008 22:07:37.473312 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j85fz\" (UniqueName: \"kubernetes.io/projected/b8f94af6-8eda-46ff-be13-dc11b2f52790-kube-api-access-j85fz\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:37 crc kubenswrapper[4739]: I1008 22:07:37.473330 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f94af6-8eda-46ff-be13-dc11b2f52790-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:37 crc kubenswrapper[4739]: I1008 22:07:37.534961 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-254hj" event={"ID":"b8f94af6-8eda-46ff-be13-dc11b2f52790","Type":"ContainerDied","Data":"9888284d789833cb9f67c969b1527ffd5cdedf9801f571f80b5e75cd996c81a1"} Oct 08 22:07:37 crc kubenswrapper[4739]: I1008 22:07:37.535001 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-254hj" Oct 08 22:07:37 crc kubenswrapper[4739]: I1008 22:07:37.535017 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9888284d789833cb9f67c969b1527ffd5cdedf9801f571f80b5e75cd996c81a1" Oct 08 22:07:37 crc kubenswrapper[4739]: I1008 22:07:37.536902 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xnxrs" event={"ID":"30b37b73-be70-445c-9515-c3c467ce9bf9","Type":"ContainerStarted","Data":"87d921c5a5e28362e5f9afe03e6e2b9602216697772fca22e8114178ec2bd7e4"} Oct 08 22:07:38 crc kubenswrapper[4739]: I1008 22:07:38.263891 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-xnxrs" podStartSLOduration=2.301800828 podStartE2EDuration="8.263804929s" podCreationTimestamp="2025-10-08 22:07:30 +0000 UTC" firstStartedPulling="2025-10-08 22:07:31.063839984 +0000 UTC m=+1150.889225734" lastFinishedPulling="2025-10-08 22:07:37.025844065 +0000 UTC m=+1156.851229835" observedRunningTime="2025-10-08 22:07:37.559717112 +0000 UTC m=+1157.385102892" watchObservedRunningTime="2025-10-08 22:07:38.263804929 +0000 UTC m=+1158.089190719" Oct 08 22:07:38 crc kubenswrapper[4739]: I1008 22:07:38.747794 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-vxd4l"] Oct 08 22:07:38 crc kubenswrapper[4739]: E1008 22:07:38.748082 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad5158ad-8718-400f-9d89-a28f901a953f" containerName="dnsmasq-dns" Oct 08 22:07:38 crc kubenswrapper[4739]: I1008 22:07:38.748100 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad5158ad-8718-400f-9d89-a28f901a953f" containerName="dnsmasq-dns" Oct 08 22:07:38 crc kubenswrapper[4739]: E1008 22:07:38.748111 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad5158ad-8718-400f-9d89-a28f901a953f" containerName="init" Oct 08 22:07:38 crc kubenswrapper[4739]: I1008 
22:07:38.748117 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad5158ad-8718-400f-9d89-a28f901a953f" containerName="init" Oct 08 22:07:38 crc kubenswrapper[4739]: E1008 22:07:38.748131 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25582fa4-e567-498f-ae5c-b05ebb260645" containerName="mariadb-database-create" Oct 08 22:07:38 crc kubenswrapper[4739]: I1008 22:07:38.748138 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="25582fa4-e567-498f-ae5c-b05ebb260645" containerName="mariadb-database-create" Oct 08 22:07:38 crc kubenswrapper[4739]: E1008 22:07:38.748164 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06663718-278d-4ac8-b6a5-9a6141dc0f78" containerName="mariadb-database-create" Oct 08 22:07:38 crc kubenswrapper[4739]: I1008 22:07:38.748170 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="06663718-278d-4ac8-b6a5-9a6141dc0f78" containerName="mariadb-database-create" Oct 08 22:07:38 crc kubenswrapper[4739]: E1008 22:07:38.748185 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33840e62-c56d-41a7-918d-377ce6e86ffe" containerName="mariadb-database-create" Oct 08 22:07:38 crc kubenswrapper[4739]: I1008 22:07:38.748191 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="33840e62-c56d-41a7-918d-377ce6e86ffe" containerName="mariadb-database-create" Oct 08 22:07:38 crc kubenswrapper[4739]: E1008 22:07:38.748203 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8f94af6-8eda-46ff-be13-dc11b2f52790" containerName="glance-db-sync" Oct 08 22:07:38 crc kubenswrapper[4739]: I1008 22:07:38.748209 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8f94af6-8eda-46ff-be13-dc11b2f52790" containerName="glance-db-sync" Oct 08 22:07:38 crc kubenswrapper[4739]: I1008 22:07:38.748361 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad5158ad-8718-400f-9d89-a28f901a953f" containerName="dnsmasq-dns" Oct 08 22:07:38 crc 
kubenswrapper[4739]: I1008 22:07:38.748375 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8f94af6-8eda-46ff-be13-dc11b2f52790" containerName="glance-db-sync" Oct 08 22:07:38 crc kubenswrapper[4739]: I1008 22:07:38.748387 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="33840e62-c56d-41a7-918d-377ce6e86ffe" containerName="mariadb-database-create" Oct 08 22:07:38 crc kubenswrapper[4739]: I1008 22:07:38.748396 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="25582fa4-e567-498f-ae5c-b05ebb260645" containerName="mariadb-database-create" Oct 08 22:07:38 crc kubenswrapper[4739]: I1008 22:07:38.748408 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="06663718-278d-4ac8-b6a5-9a6141dc0f78" containerName="mariadb-database-create" Oct 08 22:07:38 crc kubenswrapper[4739]: I1008 22:07:38.749223 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-vxd4l" Oct 08 22:07:38 crc kubenswrapper[4739]: I1008 22:07:38.759203 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-vxd4l"] Oct 08 22:07:38 crc kubenswrapper[4739]: I1008 22:07:38.796460 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61bc83ea-c8e0-40aa-a7b3-143cba264f34-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-vxd4l\" (UID: \"61bc83ea-c8e0-40aa-a7b3-143cba264f34\") " pod="openstack/dnsmasq-dns-7ff5475cc9-vxd4l" Oct 08 22:07:38 crc kubenswrapper[4739]: I1008 22:07:38.796512 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61bc83ea-c8e0-40aa-a7b3-143cba264f34-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-vxd4l\" (UID: \"61bc83ea-c8e0-40aa-a7b3-143cba264f34\") " pod="openstack/dnsmasq-dns-7ff5475cc9-vxd4l" Oct 08 22:07:38 crc 
kubenswrapper[4739]: I1008 22:07:38.796542 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61bc83ea-c8e0-40aa-a7b3-143cba264f34-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-vxd4l\" (UID: \"61bc83ea-c8e0-40aa-a7b3-143cba264f34\") " pod="openstack/dnsmasq-dns-7ff5475cc9-vxd4l" Oct 08 22:07:38 crc kubenswrapper[4739]: I1008 22:07:38.796569 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61bc83ea-c8e0-40aa-a7b3-143cba264f34-config\") pod \"dnsmasq-dns-7ff5475cc9-vxd4l\" (UID: \"61bc83ea-c8e0-40aa-a7b3-143cba264f34\") " pod="openstack/dnsmasq-dns-7ff5475cc9-vxd4l" Oct 08 22:07:38 crc kubenswrapper[4739]: I1008 22:07:38.796598 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg4z7\" (UniqueName: \"kubernetes.io/projected/61bc83ea-c8e0-40aa-a7b3-143cba264f34-kube-api-access-rg4z7\") pod \"dnsmasq-dns-7ff5475cc9-vxd4l\" (UID: \"61bc83ea-c8e0-40aa-a7b3-143cba264f34\") " pod="openstack/dnsmasq-dns-7ff5475cc9-vxd4l" Oct 08 22:07:38 crc kubenswrapper[4739]: I1008 22:07:38.796647 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/61bc83ea-c8e0-40aa-a7b3-143cba264f34-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-vxd4l\" (UID: \"61bc83ea-c8e0-40aa-a7b3-143cba264f34\") " pod="openstack/dnsmasq-dns-7ff5475cc9-vxd4l" Oct 08 22:07:38 crc kubenswrapper[4739]: I1008 22:07:38.898225 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/61bc83ea-c8e0-40aa-a7b3-143cba264f34-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-vxd4l\" (UID: \"61bc83ea-c8e0-40aa-a7b3-143cba264f34\") " 
pod="openstack/dnsmasq-dns-7ff5475cc9-vxd4l" Oct 08 22:07:38 crc kubenswrapper[4739]: I1008 22:07:38.898357 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61bc83ea-c8e0-40aa-a7b3-143cba264f34-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-vxd4l\" (UID: \"61bc83ea-c8e0-40aa-a7b3-143cba264f34\") " pod="openstack/dnsmasq-dns-7ff5475cc9-vxd4l" Oct 08 22:07:38 crc kubenswrapper[4739]: I1008 22:07:38.898414 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61bc83ea-c8e0-40aa-a7b3-143cba264f34-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-vxd4l\" (UID: \"61bc83ea-c8e0-40aa-a7b3-143cba264f34\") " pod="openstack/dnsmasq-dns-7ff5475cc9-vxd4l" Oct 08 22:07:38 crc kubenswrapper[4739]: I1008 22:07:38.898475 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61bc83ea-c8e0-40aa-a7b3-143cba264f34-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-vxd4l\" (UID: \"61bc83ea-c8e0-40aa-a7b3-143cba264f34\") " pod="openstack/dnsmasq-dns-7ff5475cc9-vxd4l" Oct 08 22:07:38 crc kubenswrapper[4739]: I1008 22:07:38.898531 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61bc83ea-c8e0-40aa-a7b3-143cba264f34-config\") pod \"dnsmasq-dns-7ff5475cc9-vxd4l\" (UID: \"61bc83ea-c8e0-40aa-a7b3-143cba264f34\") " pod="openstack/dnsmasq-dns-7ff5475cc9-vxd4l" Oct 08 22:07:38 crc kubenswrapper[4739]: I1008 22:07:38.898585 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg4z7\" (UniqueName: \"kubernetes.io/projected/61bc83ea-c8e0-40aa-a7b3-143cba264f34-kube-api-access-rg4z7\") pod \"dnsmasq-dns-7ff5475cc9-vxd4l\" (UID: \"61bc83ea-c8e0-40aa-a7b3-143cba264f34\") " pod="openstack/dnsmasq-dns-7ff5475cc9-vxd4l" Oct 08 22:07:38 crc 
kubenswrapper[4739]: I1008 22:07:38.899330 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61bc83ea-c8e0-40aa-a7b3-143cba264f34-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-vxd4l\" (UID: \"61bc83ea-c8e0-40aa-a7b3-143cba264f34\") " pod="openstack/dnsmasq-dns-7ff5475cc9-vxd4l" Oct 08 22:07:38 crc kubenswrapper[4739]: I1008 22:07:38.899376 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61bc83ea-c8e0-40aa-a7b3-143cba264f34-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-vxd4l\" (UID: \"61bc83ea-c8e0-40aa-a7b3-143cba264f34\") " pod="openstack/dnsmasq-dns-7ff5475cc9-vxd4l" Oct 08 22:07:38 crc kubenswrapper[4739]: I1008 22:07:38.900353 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61bc83ea-c8e0-40aa-a7b3-143cba264f34-config\") pod \"dnsmasq-dns-7ff5475cc9-vxd4l\" (UID: \"61bc83ea-c8e0-40aa-a7b3-143cba264f34\") " pod="openstack/dnsmasq-dns-7ff5475cc9-vxd4l" Oct 08 22:07:38 crc kubenswrapper[4739]: I1008 22:07:38.900836 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61bc83ea-c8e0-40aa-a7b3-143cba264f34-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-vxd4l\" (UID: \"61bc83ea-c8e0-40aa-a7b3-143cba264f34\") " pod="openstack/dnsmasq-dns-7ff5475cc9-vxd4l" Oct 08 22:07:38 crc kubenswrapper[4739]: I1008 22:07:38.901317 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/61bc83ea-c8e0-40aa-a7b3-143cba264f34-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-vxd4l\" (UID: \"61bc83ea-c8e0-40aa-a7b3-143cba264f34\") " pod="openstack/dnsmasq-dns-7ff5475cc9-vxd4l" Oct 08 22:07:38 crc kubenswrapper[4739]: I1008 22:07:38.934169 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-rg4z7\" (UniqueName: \"kubernetes.io/projected/61bc83ea-c8e0-40aa-a7b3-143cba264f34-kube-api-access-rg4z7\") pod \"dnsmasq-dns-7ff5475cc9-vxd4l\" (UID: \"61bc83ea-c8e0-40aa-a7b3-143cba264f34\") " pod="openstack/dnsmasq-dns-7ff5475cc9-vxd4l" Oct 08 22:07:39 crc kubenswrapper[4739]: I1008 22:07:39.068492 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-vxd4l" Oct 08 22:07:39 crc kubenswrapper[4739]: I1008 22:07:39.551559 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-vxd4l"] Oct 08 22:07:39 crc kubenswrapper[4739]: W1008 22:07:39.568529 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61bc83ea_c8e0_40aa_a7b3_143cba264f34.slice/crio-984207230cd1bbbd71754d2dd4830c7b38869b6643bf711bc21b7d63833a1e49 WatchSource:0}: Error finding container 984207230cd1bbbd71754d2dd4830c7b38869b6643bf711bc21b7d63833a1e49: Status 404 returned error can't find the container with id 984207230cd1bbbd71754d2dd4830c7b38869b6643bf711bc21b7d63833a1e49 Oct 08 22:07:40 crc kubenswrapper[4739]: I1008 22:07:40.189763 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-ad94-account-create-2wctp"] Oct 08 22:07:40 crc kubenswrapper[4739]: I1008 22:07:40.191185 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ad94-account-create-2wctp" Oct 08 22:07:40 crc kubenswrapper[4739]: I1008 22:07:40.196385 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 08 22:07:40 crc kubenswrapper[4739]: I1008 22:07:40.211481 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ad94-account-create-2wctp"] Oct 08 22:07:40 crc kubenswrapper[4739]: I1008 22:07:40.222845 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwvmv\" (UniqueName: \"kubernetes.io/projected/a5562904-ea97-4bdb-89e6-46d2c316f29c-kube-api-access-kwvmv\") pod \"cinder-ad94-account-create-2wctp\" (UID: \"a5562904-ea97-4bdb-89e6-46d2c316f29c\") " pod="openstack/cinder-ad94-account-create-2wctp" Oct 08 22:07:40 crc kubenswrapper[4739]: I1008 22:07:40.285195 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-1624-account-create-qr8gs"] Oct 08 22:07:40 crc kubenswrapper[4739]: I1008 22:07:40.286268 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-1624-account-create-qr8gs" Oct 08 22:07:40 crc kubenswrapper[4739]: I1008 22:07:40.293384 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-1624-account-create-qr8gs"] Oct 08 22:07:40 crc kubenswrapper[4739]: I1008 22:07:40.295826 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 08 22:07:40 crc kubenswrapper[4739]: I1008 22:07:40.335772 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk67b\" (UniqueName: \"kubernetes.io/projected/55547779-fb79-423f-8af3-09c41da8b357-kube-api-access-dk67b\") pod \"barbican-1624-account-create-qr8gs\" (UID: \"55547779-fb79-423f-8af3-09c41da8b357\") " pod="openstack/barbican-1624-account-create-qr8gs" Oct 08 22:07:40 crc kubenswrapper[4739]: I1008 22:07:40.335839 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwvmv\" (UniqueName: \"kubernetes.io/projected/a5562904-ea97-4bdb-89e6-46d2c316f29c-kube-api-access-kwvmv\") pod \"cinder-ad94-account-create-2wctp\" (UID: \"a5562904-ea97-4bdb-89e6-46d2c316f29c\") " pod="openstack/cinder-ad94-account-create-2wctp" Oct 08 22:07:40 crc kubenswrapper[4739]: I1008 22:07:40.367996 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwvmv\" (UniqueName: \"kubernetes.io/projected/a5562904-ea97-4bdb-89e6-46d2c316f29c-kube-api-access-kwvmv\") pod \"cinder-ad94-account-create-2wctp\" (UID: \"a5562904-ea97-4bdb-89e6-46d2c316f29c\") " pod="openstack/cinder-ad94-account-create-2wctp" Oct 08 22:07:40 crc kubenswrapper[4739]: I1008 22:07:40.437529 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk67b\" (UniqueName: \"kubernetes.io/projected/55547779-fb79-423f-8af3-09c41da8b357-kube-api-access-dk67b\") pod \"barbican-1624-account-create-qr8gs\" (UID: 
\"55547779-fb79-423f-8af3-09c41da8b357\") " pod="openstack/barbican-1624-account-create-qr8gs" Oct 08 22:07:40 crc kubenswrapper[4739]: I1008 22:07:40.454019 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk67b\" (UniqueName: \"kubernetes.io/projected/55547779-fb79-423f-8af3-09c41da8b357-kube-api-access-dk67b\") pod \"barbican-1624-account-create-qr8gs\" (UID: \"55547779-fb79-423f-8af3-09c41da8b357\") " pod="openstack/barbican-1624-account-create-qr8gs" Oct 08 22:07:40 crc kubenswrapper[4739]: I1008 22:07:40.526491 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ad94-account-create-2wctp" Oct 08 22:07:40 crc kubenswrapper[4739]: I1008 22:07:40.549636 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5994-account-create-cf2zj"] Oct 08 22:07:40 crc kubenswrapper[4739]: I1008 22:07:40.550764 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5994-account-create-cf2zj" Oct 08 22:07:40 crc kubenswrapper[4739]: I1008 22:07:40.552781 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 08 22:07:40 crc kubenswrapper[4739]: I1008 22:07:40.557014 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5994-account-create-cf2zj"] Oct 08 22:07:40 crc kubenswrapper[4739]: I1008 22:07:40.572831 4739 generic.go:334] "Generic (PLEG): container finished" podID="61bc83ea-c8e0-40aa-a7b3-143cba264f34" containerID="e7d4bed79f4ed5ef8de264d05f1614289192f408309ce885afa415b23938b9e0" exitCode=0 Oct 08 22:07:40 crc kubenswrapper[4739]: I1008 22:07:40.572874 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-vxd4l" event={"ID":"61bc83ea-c8e0-40aa-a7b3-143cba264f34","Type":"ContainerDied","Data":"e7d4bed79f4ed5ef8de264d05f1614289192f408309ce885afa415b23938b9e0"} Oct 08 22:07:40 crc kubenswrapper[4739]: I1008 
22:07:40.572898 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-vxd4l" event={"ID":"61bc83ea-c8e0-40aa-a7b3-143cba264f34","Type":"ContainerStarted","Data":"984207230cd1bbbd71754d2dd4830c7b38869b6643bf711bc21b7d63833a1e49"} Oct 08 22:07:40 crc kubenswrapper[4739]: I1008 22:07:40.627176 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-1624-account-create-qr8gs" Oct 08 22:07:40 crc kubenswrapper[4739]: I1008 22:07:40.647611 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snsgm\" (UniqueName: \"kubernetes.io/projected/d74a95fb-685b-46c3-8727-2b87d78607a5-kube-api-access-snsgm\") pod \"neutron-5994-account-create-cf2zj\" (UID: \"d74a95fb-685b-46c3-8727-2b87d78607a5\") " pod="openstack/neutron-5994-account-create-cf2zj" Oct 08 22:07:40 crc kubenswrapper[4739]: I1008 22:07:40.749376 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snsgm\" (UniqueName: \"kubernetes.io/projected/d74a95fb-685b-46c3-8727-2b87d78607a5-kube-api-access-snsgm\") pod \"neutron-5994-account-create-cf2zj\" (UID: \"d74a95fb-685b-46c3-8727-2b87d78607a5\") " pod="openstack/neutron-5994-account-create-cf2zj" Oct 08 22:07:40 crc kubenswrapper[4739]: I1008 22:07:40.765930 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snsgm\" (UniqueName: \"kubernetes.io/projected/d74a95fb-685b-46c3-8727-2b87d78607a5-kube-api-access-snsgm\") pod \"neutron-5994-account-create-cf2zj\" (UID: \"d74a95fb-685b-46c3-8727-2b87d78607a5\") " pod="openstack/neutron-5994-account-create-cf2zj" Oct 08 22:07:40 crc kubenswrapper[4739]: I1008 22:07:40.996910 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5994-account-create-cf2zj" Oct 08 22:07:41 crc kubenswrapper[4739]: I1008 22:07:41.034472 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ad94-account-create-2wctp"] Oct 08 22:07:41 crc kubenswrapper[4739]: W1008 22:07:41.048715 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5562904_ea97_4bdb_89e6_46d2c316f29c.slice/crio-c0ba0cbe9eb6af7589868a0fd7b4d2cd239facd79decc49ce13240189856fe80 WatchSource:0}: Error finding container c0ba0cbe9eb6af7589868a0fd7b4d2cd239facd79decc49ce13240189856fe80: Status 404 returned error can't find the container with id c0ba0cbe9eb6af7589868a0fd7b4d2cd239facd79decc49ce13240189856fe80 Oct 08 22:07:41 crc kubenswrapper[4739]: I1008 22:07:41.174922 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-1624-account-create-qr8gs"] Oct 08 22:07:41 crc kubenswrapper[4739]: I1008 22:07:41.429090 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5994-account-create-cf2zj"] Oct 08 22:07:41 crc kubenswrapper[4739]: I1008 22:07:41.585952 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5994-account-create-cf2zj" event={"ID":"d74a95fb-685b-46c3-8727-2b87d78607a5","Type":"ContainerStarted","Data":"9d787f5f3ce0e3029397af6a35696c37a92474ac1a193f6fae32c806963b94d9"} Oct 08 22:07:41 crc kubenswrapper[4739]: I1008 22:07:41.587989 4739 generic.go:334] "Generic (PLEG): container finished" podID="30b37b73-be70-445c-9515-c3c467ce9bf9" containerID="87d921c5a5e28362e5f9afe03e6e2b9602216697772fca22e8114178ec2bd7e4" exitCode=0 Oct 08 22:07:41 crc kubenswrapper[4739]: I1008 22:07:41.588079 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xnxrs" 
event={"ID":"30b37b73-be70-445c-9515-c3c467ce9bf9","Type":"ContainerDied","Data":"87d921c5a5e28362e5f9afe03e6e2b9602216697772fca22e8114178ec2bd7e4"} Oct 08 22:07:41 crc kubenswrapper[4739]: I1008 22:07:41.590821 4739 generic.go:334] "Generic (PLEG): container finished" podID="a5562904-ea97-4bdb-89e6-46d2c316f29c" containerID="4b4879a81be993503d200383d3ec9c930dfac787a06552dcd7bb7c0c2df16dba" exitCode=0 Oct 08 22:07:41 crc kubenswrapper[4739]: I1008 22:07:41.590872 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ad94-account-create-2wctp" event={"ID":"a5562904-ea97-4bdb-89e6-46d2c316f29c","Type":"ContainerDied","Data":"4b4879a81be993503d200383d3ec9c930dfac787a06552dcd7bb7c0c2df16dba"} Oct 08 22:07:41 crc kubenswrapper[4739]: I1008 22:07:41.590918 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ad94-account-create-2wctp" event={"ID":"a5562904-ea97-4bdb-89e6-46d2c316f29c","Type":"ContainerStarted","Data":"c0ba0cbe9eb6af7589868a0fd7b4d2cd239facd79decc49ce13240189856fe80"} Oct 08 22:07:41 crc kubenswrapper[4739]: I1008 22:07:41.593704 4739 generic.go:334] "Generic (PLEG): container finished" podID="55547779-fb79-423f-8af3-09c41da8b357" containerID="71923097a6d8ba915352bd825bb242038db20087c1bde2ca7d54b389e0be526d" exitCode=0 Oct 08 22:07:41 crc kubenswrapper[4739]: I1008 22:07:41.593756 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1624-account-create-qr8gs" event={"ID":"55547779-fb79-423f-8af3-09c41da8b357","Type":"ContainerDied","Data":"71923097a6d8ba915352bd825bb242038db20087c1bde2ca7d54b389e0be526d"} Oct 08 22:07:41 crc kubenswrapper[4739]: I1008 22:07:41.593775 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1624-account-create-qr8gs" event={"ID":"55547779-fb79-423f-8af3-09c41da8b357","Type":"ContainerStarted","Data":"606b75a9991bc4af4d798934b455ca576b9835fc7d5011160719fabd0363ef38"} Oct 08 22:07:41 crc kubenswrapper[4739]: I1008 22:07:41.596213 
4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-vxd4l" event={"ID":"61bc83ea-c8e0-40aa-a7b3-143cba264f34","Type":"ContainerStarted","Data":"40818b66204f6b9b04868a5325e7c877583ff6d9eb66364ac7f9ed9b7318e7fa"} Oct 08 22:07:41 crc kubenswrapper[4739]: I1008 22:07:41.596831 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7ff5475cc9-vxd4l" Oct 08 22:07:41 crc kubenswrapper[4739]: I1008 22:07:41.638670 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7ff5475cc9-vxd4l" podStartSLOduration=3.638593901 podStartE2EDuration="3.638593901s" podCreationTimestamp="2025-10-08 22:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:07:41.636002637 +0000 UTC m=+1161.461388387" watchObservedRunningTime="2025-10-08 22:07:41.638593901 +0000 UTC m=+1161.463979651" Oct 08 22:07:42 crc kubenswrapper[4739]: I1008 22:07:42.605087 4739 generic.go:334] "Generic (PLEG): container finished" podID="d74a95fb-685b-46c3-8727-2b87d78607a5" containerID="6a1da5aac8382f24cd67defbce27b21b4bb7f3a8a050ab044b7776203829f4a4" exitCode=0 Oct 08 22:07:42 crc kubenswrapper[4739]: I1008 22:07:42.605186 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5994-account-create-cf2zj" event={"ID":"d74a95fb-685b-46c3-8727-2b87d78607a5","Type":"ContainerDied","Data":"6a1da5aac8382f24cd67defbce27b21b4bb7f3a8a050ab044b7776203829f4a4"} Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.068474 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-1624-account-create-qr8gs" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.072715 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ad94-account-create-2wctp" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.081572 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-xnxrs" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.101890 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dk67b\" (UniqueName: \"kubernetes.io/projected/55547779-fb79-423f-8af3-09c41da8b357-kube-api-access-dk67b\") pod \"55547779-fb79-423f-8af3-09c41da8b357\" (UID: \"55547779-fb79-423f-8af3-09c41da8b357\") " Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.101973 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30b37b73-be70-445c-9515-c3c467ce9bf9-combined-ca-bundle\") pod \"30b37b73-be70-445c-9515-c3c467ce9bf9\" (UID: \"30b37b73-be70-445c-9515-c3c467ce9bf9\") " Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.102105 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30b37b73-be70-445c-9515-c3c467ce9bf9-config-data\") pod \"30b37b73-be70-445c-9515-c3c467ce9bf9\" (UID: \"30b37b73-be70-445c-9515-c3c467ce9bf9\") " Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.102197 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7msp\" (UniqueName: \"kubernetes.io/projected/30b37b73-be70-445c-9515-c3c467ce9bf9-kube-api-access-z7msp\") pod \"30b37b73-be70-445c-9515-c3c467ce9bf9\" (UID: \"30b37b73-be70-445c-9515-c3c467ce9bf9\") " Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.102300 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwvmv\" (UniqueName: \"kubernetes.io/projected/a5562904-ea97-4bdb-89e6-46d2c316f29c-kube-api-access-kwvmv\") pod 
\"a5562904-ea97-4bdb-89e6-46d2c316f29c\" (UID: \"a5562904-ea97-4bdb-89e6-46d2c316f29c\") " Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.131911 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55547779-fb79-423f-8af3-09c41da8b357-kube-api-access-dk67b" (OuterVolumeSpecName: "kube-api-access-dk67b") pod "55547779-fb79-423f-8af3-09c41da8b357" (UID: "55547779-fb79-423f-8af3-09c41da8b357"). InnerVolumeSpecName "kube-api-access-dk67b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.132565 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5562904-ea97-4bdb-89e6-46d2c316f29c-kube-api-access-kwvmv" (OuterVolumeSpecName: "kube-api-access-kwvmv") pod "a5562904-ea97-4bdb-89e6-46d2c316f29c" (UID: "a5562904-ea97-4bdb-89e6-46d2c316f29c"). InnerVolumeSpecName "kube-api-access-kwvmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.135363 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30b37b73-be70-445c-9515-c3c467ce9bf9-kube-api-access-z7msp" (OuterVolumeSpecName: "kube-api-access-z7msp") pod "30b37b73-be70-445c-9515-c3c467ce9bf9" (UID: "30b37b73-be70-445c-9515-c3c467ce9bf9"). InnerVolumeSpecName "kube-api-access-z7msp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.137441 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30b37b73-be70-445c-9515-c3c467ce9bf9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30b37b73-be70-445c-9515-c3c467ce9bf9" (UID: "30b37b73-be70-445c-9515-c3c467ce9bf9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.180697 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30b37b73-be70-445c-9515-c3c467ce9bf9-config-data" (OuterVolumeSpecName: "config-data") pod "30b37b73-be70-445c-9515-c3c467ce9bf9" (UID: "30b37b73-be70-445c-9515-c3c467ce9bf9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.203474 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30b37b73-be70-445c-9515-c3c467ce9bf9-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.203505 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7msp\" (UniqueName: \"kubernetes.io/projected/30b37b73-be70-445c-9515-c3c467ce9bf9-kube-api-access-z7msp\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.203516 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwvmv\" (UniqueName: \"kubernetes.io/projected/a5562904-ea97-4bdb-89e6-46d2c316f29c-kube-api-access-kwvmv\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.203529 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dk67b\" (UniqueName: \"kubernetes.io/projected/55547779-fb79-423f-8af3-09c41da8b357-kube-api-access-dk67b\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.203540 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30b37b73-be70-445c-9515-c3c467ce9bf9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.625091 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-1624-account-create-qr8gs" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.626915 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1624-account-create-qr8gs" event={"ID":"55547779-fb79-423f-8af3-09c41da8b357","Type":"ContainerDied","Data":"606b75a9991bc4af4d798934b455ca576b9835fc7d5011160719fabd0363ef38"} Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.627044 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="606b75a9991bc4af4d798934b455ca576b9835fc7d5011160719fabd0363ef38" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.630501 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xnxrs" event={"ID":"30b37b73-be70-445c-9515-c3c467ce9bf9","Type":"ContainerDied","Data":"3edd69331199b54f6ae15d3498a557633b1fc61176486640a1b3c9ba8578e6f1"} Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.630544 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3edd69331199b54f6ae15d3498a557633b1fc61176486640a1b3c9ba8578e6f1" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.630516 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-xnxrs" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.643644 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ad94-account-create-2wctp" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.644557 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ad94-account-create-2wctp" event={"ID":"a5562904-ea97-4bdb-89e6-46d2c316f29c","Type":"ContainerDied","Data":"c0ba0cbe9eb6af7589868a0fd7b4d2cd239facd79decc49ce13240189856fe80"} Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.644594 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0ba0cbe9eb6af7589868a0fd7b4d2cd239facd79decc49ce13240189856fe80" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.895272 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-vxd4l"] Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.895773 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7ff5475cc9-vxd4l" podUID="61bc83ea-c8e0-40aa-a7b3-143cba264f34" containerName="dnsmasq-dns" containerID="cri-o://40818b66204f6b9b04868a5325e7c877583ff6d9eb66364ac7f9ed9b7318e7fa" gracePeriod=10 Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.921435 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-prlqk"] Oct 08 22:07:43 crc kubenswrapper[4739]: E1008 22:07:43.921949 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5562904-ea97-4bdb-89e6-46d2c316f29c" containerName="mariadb-account-create" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.921971 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5562904-ea97-4bdb-89e6-46d2c316f29c" containerName="mariadb-account-create" Oct 08 22:07:43 crc kubenswrapper[4739]: E1008 22:07:43.921988 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30b37b73-be70-445c-9515-c3c467ce9bf9" containerName="keystone-db-sync" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.921996 4739 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="30b37b73-be70-445c-9515-c3c467ce9bf9" containerName="keystone-db-sync" Oct 08 22:07:43 crc kubenswrapper[4739]: E1008 22:07:43.922016 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55547779-fb79-423f-8af3-09c41da8b357" containerName="mariadb-account-create" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.922023 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="55547779-fb79-423f-8af3-09c41da8b357" containerName="mariadb-account-create" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.922245 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="55547779-fb79-423f-8af3-09c41da8b357" containerName="mariadb-account-create" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.922278 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5562904-ea97-4bdb-89e6-46d2c316f29c" containerName="mariadb-account-create" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.922293 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="30b37b73-be70-445c-9515-c3c467ce9bf9" containerName="keystone-db-sync" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.926120 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-prlqk" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.942722 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.954559 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.954747 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-q58bw" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.954793 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.962546 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-prlqk"] Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.970870 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-8vvpb"] Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.972721 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-8vvpb" Oct 08 22:07:43 crc kubenswrapper[4739]: I1008 22:07:43.993064 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-8vvpb"] Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.119040 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf5ec5fa-4ce0-4e85-994e-6c826520482b-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-8vvpb\" (UID: \"bf5ec5fa-4ce0-4e85-994e-6c826520482b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-8vvpb" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.119098 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf5ec5fa-4ce0-4e85-994e-6c826520482b-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-8vvpb\" (UID: \"bf5ec5fa-4ce0-4e85-994e-6c826520482b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-8vvpb" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.119137 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bf5ec5fa-4ce0-4e85-994e-6c826520482b-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-8vvpb\" (UID: \"bf5ec5fa-4ce0-4e85-994e-6c826520482b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-8vvpb" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.119198 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf5ec5fa-4ce0-4e85-994e-6c826520482b-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-8vvpb\" (UID: \"bf5ec5fa-4ce0-4e85-994e-6c826520482b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-8vvpb" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.119244 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-scripts\") pod \"keystone-bootstrap-prlqk\" (UID: \"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7\") " pod="openstack/keystone-bootstrap-prlqk" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.119265 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-config-data\") pod \"keystone-bootstrap-prlqk\" (UID: \"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7\") " pod="openstack/keystone-bootstrap-prlqk" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.119292 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-fernet-keys\") pod \"keystone-bootstrap-prlqk\" (UID: \"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7\") " pod="openstack/keystone-bootstrap-prlqk" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.119317 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rm2d\" (UniqueName: \"kubernetes.io/projected/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-kube-api-access-7rm2d\") pod \"keystone-bootstrap-prlqk\" (UID: \"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7\") " pod="openstack/keystone-bootstrap-prlqk" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.119344 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-combined-ca-bundle\") pod \"keystone-bootstrap-prlqk\" (UID: \"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7\") " pod="openstack/keystone-bootstrap-prlqk" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.119371 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf5ec5fa-4ce0-4e85-994e-6c826520482b-config\") pod \"dnsmasq-dns-5c5cc7c5ff-8vvpb\" (UID: \"bf5ec5fa-4ce0-4e85-994e-6c826520482b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-8vvpb" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.119385 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzdw7\" (UniqueName: \"kubernetes.io/projected/bf5ec5fa-4ce0-4e85-994e-6c826520482b-kube-api-access-fzdw7\") pod \"dnsmasq-dns-5c5cc7c5ff-8vvpb\" (UID: \"bf5ec5fa-4ce0-4e85-994e-6c826520482b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-8vvpb" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.119427 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-credential-keys\") pod \"keystone-bootstrap-prlqk\" (UID: \"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7\") " pod="openstack/keystone-bootstrap-prlqk" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.198127 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5994-account-create-cf2zj" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.210655 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-8vvpb"] Oct 08 22:07:44 crc kubenswrapper[4739]: E1008 22:07:44.211883 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-fzdw7 ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5c5cc7c5ff-8vvpb" podUID="bf5ec5fa-4ce0-4e85-994e-6c826520482b" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.222481 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf5ec5fa-4ce0-4e85-994e-6c826520482b-config\") pod \"dnsmasq-dns-5c5cc7c5ff-8vvpb\" (UID: \"bf5ec5fa-4ce0-4e85-994e-6c826520482b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-8vvpb" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.222524 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzdw7\" (UniqueName: \"kubernetes.io/projected/bf5ec5fa-4ce0-4e85-994e-6c826520482b-kube-api-access-fzdw7\") pod \"dnsmasq-dns-5c5cc7c5ff-8vvpb\" (UID: \"bf5ec5fa-4ce0-4e85-994e-6c826520482b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-8vvpb" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.222582 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-credential-keys\") pod \"keystone-bootstrap-prlqk\" (UID: \"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7\") " pod="openstack/keystone-bootstrap-prlqk" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.222616 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/bf5ec5fa-4ce0-4e85-994e-6c826520482b-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-8vvpb\" (UID: \"bf5ec5fa-4ce0-4e85-994e-6c826520482b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-8vvpb" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.222634 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf5ec5fa-4ce0-4e85-994e-6c826520482b-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-8vvpb\" (UID: \"bf5ec5fa-4ce0-4e85-994e-6c826520482b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-8vvpb" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.222664 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bf5ec5fa-4ce0-4e85-994e-6c826520482b-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-8vvpb\" (UID: \"bf5ec5fa-4ce0-4e85-994e-6c826520482b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-8vvpb" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.222688 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf5ec5fa-4ce0-4e85-994e-6c826520482b-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-8vvpb\" (UID: \"bf5ec5fa-4ce0-4e85-994e-6c826520482b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-8vvpb" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.222726 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-scripts\") pod \"keystone-bootstrap-prlqk\" (UID: \"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7\") " pod="openstack/keystone-bootstrap-prlqk" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.222743 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-config-data\") pod 
\"keystone-bootstrap-prlqk\" (UID: \"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7\") " pod="openstack/keystone-bootstrap-prlqk" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.222767 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-fernet-keys\") pod \"keystone-bootstrap-prlqk\" (UID: \"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7\") " pod="openstack/keystone-bootstrap-prlqk" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.222794 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rm2d\" (UniqueName: \"kubernetes.io/projected/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-kube-api-access-7rm2d\") pod \"keystone-bootstrap-prlqk\" (UID: \"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7\") " pod="openstack/keystone-bootstrap-prlqk" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.222821 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-combined-ca-bundle\") pod \"keystone-bootstrap-prlqk\" (UID: \"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7\") " pod="openstack/keystone-bootstrap-prlqk" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.225561 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bf5ec5fa-4ce0-4e85-994e-6c826520482b-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-8vvpb\" (UID: \"bf5ec5fa-4ce0-4e85-994e-6c826520482b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-8vvpb" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.226068 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf5ec5fa-4ce0-4e85-994e-6c826520482b-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-8vvpb\" (UID: \"bf5ec5fa-4ce0-4e85-994e-6c826520482b\") " 
pod="openstack/dnsmasq-dns-5c5cc7c5ff-8vvpb" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.226971 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf5ec5fa-4ce0-4e85-994e-6c826520482b-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-8vvpb\" (UID: \"bf5ec5fa-4ce0-4e85-994e-6c826520482b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-8vvpb" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.230300 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf5ec5fa-4ce0-4e85-994e-6c826520482b-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-8vvpb\" (UID: \"bf5ec5fa-4ce0-4e85-994e-6c826520482b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-8vvpb" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.235767 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:07:44 crc kubenswrapper[4739]: E1008 22:07:44.236299 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d74a95fb-685b-46c3-8727-2b87d78607a5" containerName="mariadb-account-create" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.236314 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="d74a95fb-685b-46c3-8727-2b87d78607a5" containerName="mariadb-account-create" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.236527 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="d74a95fb-685b-46c3-8727-2b87d78607a5" containerName="mariadb-account-create" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.239558 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.241292 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf5ec5fa-4ce0-4e85-994e-6c826520482b-config\") pod \"dnsmasq-dns-5c5cc7c5ff-8vvpb\" (UID: \"bf5ec5fa-4ce0-4e85-994e-6c826520482b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-8vvpb" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.247832 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-674m2"] Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.249172 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-674m2" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.250361 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-credential-keys\") pod \"keystone-bootstrap-prlqk\" (UID: \"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7\") " pod="openstack/keystone-bootstrap-prlqk" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.250871 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-combined-ca-bundle\") pod \"keystone-bootstrap-prlqk\" (UID: \"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7\") " pod="openstack/keystone-bootstrap-prlqk" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.251727 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-scripts\") pod \"keystone-bootstrap-prlqk\" (UID: \"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7\") " pod="openstack/keystone-bootstrap-prlqk" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.251904 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-config-data\") pod \"keystone-bootstrap-prlqk\" (UID: \"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7\") " pod="openstack/keystone-bootstrap-prlqk" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.256401 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-fernet-keys\") pod \"keystone-bootstrap-prlqk\" (UID: \"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7\") " pod="openstack/keystone-bootstrap-prlqk" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.256707 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.257005 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.257401 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.257685 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-bskv6" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.257976 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.281136 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rm2d\" (UniqueName: \"kubernetes.io/projected/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-kube-api-access-7rm2d\") pod \"keystone-bootstrap-prlqk\" (UID: \"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7\") " pod="openstack/keystone-bootstrap-prlqk" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.281233 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-f5djz"] Oct 08 22:07:44 crc 
kubenswrapper[4739]: I1008 22:07:44.282947 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-f5djz" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.290297 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.296344 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzdw7\" (UniqueName: \"kubernetes.io/projected/bf5ec5fa-4ce0-4e85-994e-6c826520482b-kube-api-access-fzdw7\") pod \"dnsmasq-dns-5c5cc7c5ff-8vvpb\" (UID: \"bf5ec5fa-4ce0-4e85-994e-6c826520482b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-8vvpb" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.319574 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-674m2"] Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.324376 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snsgm\" (UniqueName: \"kubernetes.io/projected/d74a95fb-685b-46c3-8727-2b87d78607a5-kube-api-access-snsgm\") pod \"d74a95fb-685b-46c3-8727-2b87d78607a5\" (UID: \"d74a95fb-685b-46c3-8727-2b87d78607a5\") " Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.339341 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d74a95fb-685b-46c3-8727-2b87d78607a5-kube-api-access-snsgm" (OuterVolumeSpecName: "kube-api-access-snsgm") pod "d74a95fb-685b-46c3-8727-2b87d78607a5" (UID: "d74a95fb-685b-46c3-8727-2b87d78607a5"). InnerVolumeSpecName "kube-api-access-snsgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.339992 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-f5djz"] Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.370837 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-prlqk" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.427855 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8fe907f-2579-491e-95b8-a71e264e9ed4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d8fe907f-2579-491e-95b8-a71e264e9ed4\") " pod="openstack/ceilometer-0" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.431541 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxw9f\" (UniqueName: \"kubernetes.io/projected/d8fe907f-2579-491e-95b8-a71e264e9ed4-kube-api-access-bxw9f\") pod \"ceilometer-0\" (UID: \"d8fe907f-2579-491e-95b8-a71e264e9ed4\") " pod="openstack/ceilometer-0" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.431739 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cf86\" (UniqueName: \"kubernetes.io/projected/b6fd1196-cd2f-4951-ad50-5dc17dac4aac-kube-api-access-9cf86\") pod \"placement-db-sync-674m2\" (UID: \"b6fd1196-cd2f-4951-ad50-5dc17dac4aac\") " pod="openstack/placement-db-sync-674m2" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.431853 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqcr4\" (UniqueName: \"kubernetes.io/projected/3acc7290-6140-4870-a0d7-3bed7ac2b601-kube-api-access-rqcr4\") pod \"dnsmasq-dns-8b5c85b87-f5djz\" (UID: \"3acc7290-6140-4870-a0d7-3bed7ac2b601\") " pod="openstack/dnsmasq-dns-8b5c85b87-f5djz" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.432420 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6fd1196-cd2f-4951-ad50-5dc17dac4aac-config-data\") pod \"placement-db-sync-674m2\" (UID: 
\"b6fd1196-cd2f-4951-ad50-5dc17dac4aac\") " pod="openstack/placement-db-sync-674m2" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.432501 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6fd1196-cd2f-4951-ad50-5dc17dac4aac-logs\") pod \"placement-db-sync-674m2\" (UID: \"b6fd1196-cd2f-4951-ad50-5dc17dac4aac\") " pod="openstack/placement-db-sync-674m2" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.432526 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8fe907f-2579-491e-95b8-a71e264e9ed4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d8fe907f-2579-491e-95b8-a71e264e9ed4\") " pod="openstack/ceilometer-0" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.432545 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8fe907f-2579-491e-95b8-a71e264e9ed4-scripts\") pod \"ceilometer-0\" (UID: \"d8fe907f-2579-491e-95b8-a71e264e9ed4\") " pod="openstack/ceilometer-0" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.432567 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6fd1196-cd2f-4951-ad50-5dc17dac4aac-scripts\") pod \"placement-db-sync-674m2\" (UID: \"b6fd1196-cd2f-4951-ad50-5dc17dac4aac\") " pod="openstack/placement-db-sync-674m2" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.432591 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3acc7290-6140-4870-a0d7-3bed7ac2b601-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-f5djz\" (UID: \"3acc7290-6140-4870-a0d7-3bed7ac2b601\") " pod="openstack/dnsmasq-dns-8b5c85b87-f5djz" Oct 08 
22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.432621 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3acc7290-6140-4870-a0d7-3bed7ac2b601-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-f5djz\" (UID: \"3acc7290-6140-4870-a0d7-3bed7ac2b601\") " pod="openstack/dnsmasq-dns-8b5c85b87-f5djz" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.432672 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6fd1196-cd2f-4951-ad50-5dc17dac4aac-combined-ca-bundle\") pod \"placement-db-sync-674m2\" (UID: \"b6fd1196-cd2f-4951-ad50-5dc17dac4aac\") " pod="openstack/placement-db-sync-674m2" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.432824 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3acc7290-6140-4870-a0d7-3bed7ac2b601-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-f5djz\" (UID: \"3acc7290-6140-4870-a0d7-3bed7ac2b601\") " pod="openstack/dnsmasq-dns-8b5c85b87-f5djz" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.432857 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3acc7290-6140-4870-a0d7-3bed7ac2b601-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-f5djz\" (UID: \"3acc7290-6140-4870-a0d7-3bed7ac2b601\") " pod="openstack/dnsmasq-dns-8b5c85b87-f5djz" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.432916 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3acc7290-6140-4870-a0d7-3bed7ac2b601-config\") pod \"dnsmasq-dns-8b5c85b87-f5djz\" (UID: \"3acc7290-6140-4870-a0d7-3bed7ac2b601\") " pod="openstack/dnsmasq-dns-8b5c85b87-f5djz" 
Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.432953 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8fe907f-2579-491e-95b8-a71e264e9ed4-config-data\") pod \"ceilometer-0\" (UID: \"d8fe907f-2579-491e-95b8-a71e264e9ed4\") " pod="openstack/ceilometer-0" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.432971 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8fe907f-2579-491e-95b8-a71e264e9ed4-run-httpd\") pod \"ceilometer-0\" (UID: \"d8fe907f-2579-491e-95b8-a71e264e9ed4\") " pod="openstack/ceilometer-0" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.432994 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8fe907f-2579-491e-95b8-a71e264e9ed4-log-httpd\") pod \"ceilometer-0\" (UID: \"d8fe907f-2579-491e-95b8-a71e264e9ed4\") " pod="openstack/ceilometer-0" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.433092 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snsgm\" (UniqueName: \"kubernetes.io/projected/d74a95fb-685b-46c3-8727-2b87d78607a5-kube-api-access-snsgm\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.536857 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8fe907f-2579-491e-95b8-a71e264e9ed4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d8fe907f-2579-491e-95b8-a71e264e9ed4\") " pod="openstack/ceilometer-0" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.536985 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxw9f\" (UniqueName: 
\"kubernetes.io/projected/d8fe907f-2579-491e-95b8-a71e264e9ed4-kube-api-access-bxw9f\") pod \"ceilometer-0\" (UID: \"d8fe907f-2579-491e-95b8-a71e264e9ed4\") " pod="openstack/ceilometer-0" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.537031 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cf86\" (UniqueName: \"kubernetes.io/projected/b6fd1196-cd2f-4951-ad50-5dc17dac4aac-kube-api-access-9cf86\") pod \"placement-db-sync-674m2\" (UID: \"b6fd1196-cd2f-4951-ad50-5dc17dac4aac\") " pod="openstack/placement-db-sync-674m2" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.537094 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqcr4\" (UniqueName: \"kubernetes.io/projected/3acc7290-6140-4870-a0d7-3bed7ac2b601-kube-api-access-rqcr4\") pod \"dnsmasq-dns-8b5c85b87-f5djz\" (UID: \"3acc7290-6140-4870-a0d7-3bed7ac2b601\") " pod="openstack/dnsmasq-dns-8b5c85b87-f5djz" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.537122 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6fd1196-cd2f-4951-ad50-5dc17dac4aac-config-data\") pod \"placement-db-sync-674m2\" (UID: \"b6fd1196-cd2f-4951-ad50-5dc17dac4aac\") " pod="openstack/placement-db-sync-674m2" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.537183 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6fd1196-cd2f-4951-ad50-5dc17dac4aac-logs\") pod \"placement-db-sync-674m2\" (UID: \"b6fd1196-cd2f-4951-ad50-5dc17dac4aac\") " pod="openstack/placement-db-sync-674m2" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.537211 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8fe907f-2579-491e-95b8-a71e264e9ed4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"d8fe907f-2579-491e-95b8-a71e264e9ed4\") " pod="openstack/ceilometer-0" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.537242 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8fe907f-2579-491e-95b8-a71e264e9ed4-scripts\") pod \"ceilometer-0\" (UID: \"d8fe907f-2579-491e-95b8-a71e264e9ed4\") " pod="openstack/ceilometer-0" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.537274 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6fd1196-cd2f-4951-ad50-5dc17dac4aac-scripts\") pod \"placement-db-sync-674m2\" (UID: \"b6fd1196-cd2f-4951-ad50-5dc17dac4aac\") " pod="openstack/placement-db-sync-674m2" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.537304 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3acc7290-6140-4870-a0d7-3bed7ac2b601-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-f5djz\" (UID: \"3acc7290-6140-4870-a0d7-3bed7ac2b601\") " pod="openstack/dnsmasq-dns-8b5c85b87-f5djz" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.537341 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3acc7290-6140-4870-a0d7-3bed7ac2b601-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-f5djz\" (UID: \"3acc7290-6140-4870-a0d7-3bed7ac2b601\") " pod="openstack/dnsmasq-dns-8b5c85b87-f5djz" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.537399 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6fd1196-cd2f-4951-ad50-5dc17dac4aac-combined-ca-bundle\") pod \"placement-db-sync-674m2\" (UID: \"b6fd1196-cd2f-4951-ad50-5dc17dac4aac\") " pod="openstack/placement-db-sync-674m2" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 
22:07:44.537818 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3acc7290-6140-4870-a0d7-3bed7ac2b601-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-f5djz\" (UID: \"3acc7290-6140-4870-a0d7-3bed7ac2b601\") " pod="openstack/dnsmasq-dns-8b5c85b87-f5djz" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.537863 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3acc7290-6140-4870-a0d7-3bed7ac2b601-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-f5djz\" (UID: \"3acc7290-6140-4870-a0d7-3bed7ac2b601\") " pod="openstack/dnsmasq-dns-8b5c85b87-f5djz" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.537904 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3acc7290-6140-4870-a0d7-3bed7ac2b601-config\") pod \"dnsmasq-dns-8b5c85b87-f5djz\" (UID: \"3acc7290-6140-4870-a0d7-3bed7ac2b601\") " pod="openstack/dnsmasq-dns-8b5c85b87-f5djz" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.537933 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8fe907f-2579-491e-95b8-a71e264e9ed4-config-data\") pod \"ceilometer-0\" (UID: \"d8fe907f-2579-491e-95b8-a71e264e9ed4\") " pod="openstack/ceilometer-0" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.537948 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8fe907f-2579-491e-95b8-a71e264e9ed4-run-httpd\") pod \"ceilometer-0\" (UID: \"d8fe907f-2579-491e-95b8-a71e264e9ed4\") " pod="openstack/ceilometer-0" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.537969 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d8fe907f-2579-491e-95b8-a71e264e9ed4-log-httpd\") pod \"ceilometer-0\" (UID: \"d8fe907f-2579-491e-95b8-a71e264e9ed4\") " pod="openstack/ceilometer-0" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.538085 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6fd1196-cd2f-4951-ad50-5dc17dac4aac-logs\") pod \"placement-db-sync-674m2\" (UID: \"b6fd1196-cd2f-4951-ad50-5dc17dac4aac\") " pod="openstack/placement-db-sync-674m2" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.538369 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8fe907f-2579-491e-95b8-a71e264e9ed4-log-httpd\") pod \"ceilometer-0\" (UID: \"d8fe907f-2579-491e-95b8-a71e264e9ed4\") " pod="openstack/ceilometer-0" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.539265 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3acc7290-6140-4870-a0d7-3bed7ac2b601-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-f5djz\" (UID: \"3acc7290-6140-4870-a0d7-3bed7ac2b601\") " pod="openstack/dnsmasq-dns-8b5c85b87-f5djz" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.539820 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3acc7290-6140-4870-a0d7-3bed7ac2b601-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-f5djz\" (UID: \"3acc7290-6140-4870-a0d7-3bed7ac2b601\") " pod="openstack/dnsmasq-dns-8b5c85b87-f5djz" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.540442 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3acc7290-6140-4870-a0d7-3bed7ac2b601-config\") pod \"dnsmasq-dns-8b5c85b87-f5djz\" (UID: \"3acc7290-6140-4870-a0d7-3bed7ac2b601\") " pod="openstack/dnsmasq-dns-8b5c85b87-f5djz" Oct 08 22:07:44 crc 
kubenswrapper[4739]: I1008 22:07:44.541484 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8fe907f-2579-491e-95b8-a71e264e9ed4-run-httpd\") pod \"ceilometer-0\" (UID: \"d8fe907f-2579-491e-95b8-a71e264e9ed4\") " pod="openstack/ceilometer-0" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.542834 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3acc7290-6140-4870-a0d7-3bed7ac2b601-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-f5djz\" (UID: \"3acc7290-6140-4870-a0d7-3bed7ac2b601\") " pod="openstack/dnsmasq-dns-8b5c85b87-f5djz" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.542896 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8fe907f-2579-491e-95b8-a71e264e9ed4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d8fe907f-2579-491e-95b8-a71e264e9ed4\") " pod="openstack/ceilometer-0" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.543269 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8fe907f-2579-491e-95b8-a71e264e9ed4-scripts\") pod \"ceilometer-0\" (UID: \"d8fe907f-2579-491e-95b8-a71e264e9ed4\") " pod="openstack/ceilometer-0" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.543762 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3acc7290-6140-4870-a0d7-3bed7ac2b601-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-f5djz\" (UID: \"3acc7290-6140-4870-a0d7-3bed7ac2b601\") " pod="openstack/dnsmasq-dns-8b5c85b87-f5djz" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.544012 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d8fe907f-2579-491e-95b8-a71e264e9ed4-config-data\") pod \"ceilometer-0\" (UID: \"d8fe907f-2579-491e-95b8-a71e264e9ed4\") " pod="openstack/ceilometer-0" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.547696 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6fd1196-cd2f-4951-ad50-5dc17dac4aac-scripts\") pod \"placement-db-sync-674m2\" (UID: \"b6fd1196-cd2f-4951-ad50-5dc17dac4aac\") " pod="openstack/placement-db-sync-674m2" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.547810 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6fd1196-cd2f-4951-ad50-5dc17dac4aac-combined-ca-bundle\") pod \"placement-db-sync-674m2\" (UID: \"b6fd1196-cd2f-4951-ad50-5dc17dac4aac\") " pod="openstack/placement-db-sync-674m2" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.550172 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6fd1196-cd2f-4951-ad50-5dc17dac4aac-config-data\") pod \"placement-db-sync-674m2\" (UID: \"b6fd1196-cd2f-4951-ad50-5dc17dac4aac\") " pod="openstack/placement-db-sync-674m2" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.559908 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-vxd4l" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.560036 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8fe907f-2579-491e-95b8-a71e264e9ed4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d8fe907f-2579-491e-95b8-a71e264e9ed4\") " pod="openstack/ceilometer-0" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.560759 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cf86\" (UniqueName: \"kubernetes.io/projected/b6fd1196-cd2f-4951-ad50-5dc17dac4aac-kube-api-access-9cf86\") pod \"placement-db-sync-674m2\" (UID: \"b6fd1196-cd2f-4951-ad50-5dc17dac4aac\") " pod="openstack/placement-db-sync-674m2" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.570249 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxw9f\" (UniqueName: \"kubernetes.io/projected/d8fe907f-2579-491e-95b8-a71e264e9ed4-kube-api-access-bxw9f\") pod \"ceilometer-0\" (UID: \"d8fe907f-2579-491e-95b8-a71e264e9ed4\") " pod="openstack/ceilometer-0" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.571281 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqcr4\" (UniqueName: \"kubernetes.io/projected/3acc7290-6140-4870-a0d7-3bed7ac2b601-kube-api-access-rqcr4\") pod \"dnsmasq-dns-8b5c85b87-f5djz\" (UID: \"3acc7290-6140-4870-a0d7-3bed7ac2b601\") " pod="openstack/dnsmasq-dns-8b5c85b87-f5djz" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.654741 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5994-account-create-cf2zj" event={"ID":"d74a95fb-685b-46c3-8727-2b87d78607a5","Type":"ContainerDied","Data":"9d787f5f3ce0e3029397af6a35696c37a92474ac1a193f6fae32c806963b94d9"} Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.654785 4739 pod_container_deletor.go:80] "Container not found 
in pod's containers" containerID="9d787f5f3ce0e3029397af6a35696c37a92474ac1a193f6fae32c806963b94d9" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.654873 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5994-account-create-cf2zj" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.659675 4739 generic.go:334] "Generic (PLEG): container finished" podID="61bc83ea-c8e0-40aa-a7b3-143cba264f34" containerID="40818b66204f6b9b04868a5325e7c877583ff6d9eb66364ac7f9ed9b7318e7fa" exitCode=0 Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.659894 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-8vvpb" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.660864 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-vxd4l" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.661468 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-vxd4l" event={"ID":"61bc83ea-c8e0-40aa-a7b3-143cba264f34","Type":"ContainerDied","Data":"40818b66204f6b9b04868a5325e7c877583ff6d9eb66364ac7f9ed9b7318e7fa"} Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.661557 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-vxd4l" event={"ID":"61bc83ea-c8e0-40aa-a7b3-143cba264f34","Type":"ContainerDied","Data":"984207230cd1bbbd71754d2dd4830c7b38869b6643bf711bc21b7d63833a1e49"} Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.661623 4739 scope.go:117] "RemoveContainer" containerID="40818b66204f6b9b04868a5325e7c877583ff6d9eb66364ac7f9ed9b7318e7fa" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.673593 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-8vvpb" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.706424 4739 scope.go:117] "RemoveContainer" containerID="e7d4bed79f4ed5ef8de264d05f1614289192f408309ce885afa415b23938b9e0" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.715992 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.723080 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-674m2" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.732518 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-f5djz" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.734809 4739 scope.go:117] "RemoveContainer" containerID="40818b66204f6b9b04868a5325e7c877583ff6d9eb66364ac7f9ed9b7318e7fa" Oct 08 22:07:44 crc kubenswrapper[4739]: E1008 22:07:44.735316 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40818b66204f6b9b04868a5325e7c877583ff6d9eb66364ac7f9ed9b7318e7fa\": container with ID starting with 40818b66204f6b9b04868a5325e7c877583ff6d9eb66364ac7f9ed9b7318e7fa not found: ID does not exist" containerID="40818b66204f6b9b04868a5325e7c877583ff6d9eb66364ac7f9ed9b7318e7fa" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.735362 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40818b66204f6b9b04868a5325e7c877583ff6d9eb66364ac7f9ed9b7318e7fa"} err="failed to get container status \"40818b66204f6b9b04868a5325e7c877583ff6d9eb66364ac7f9ed9b7318e7fa\": rpc error: code = NotFound desc = could not find container \"40818b66204f6b9b04868a5325e7c877583ff6d9eb66364ac7f9ed9b7318e7fa\": container with ID starting with 40818b66204f6b9b04868a5325e7c877583ff6d9eb66364ac7f9ed9b7318e7fa not found: ID 
does not exist" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.735392 4739 scope.go:117] "RemoveContainer" containerID="e7d4bed79f4ed5ef8de264d05f1614289192f408309ce885afa415b23938b9e0" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.741498 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61bc83ea-c8e0-40aa-a7b3-143cba264f34-dns-svc\") pod \"61bc83ea-c8e0-40aa-a7b3-143cba264f34\" (UID: \"61bc83ea-c8e0-40aa-a7b3-143cba264f34\") " Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.741565 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61bc83ea-c8e0-40aa-a7b3-143cba264f34-ovsdbserver-sb\") pod \"61bc83ea-c8e0-40aa-a7b3-143cba264f34\" (UID: \"61bc83ea-c8e0-40aa-a7b3-143cba264f34\") " Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.741687 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg4z7\" (UniqueName: \"kubernetes.io/projected/61bc83ea-c8e0-40aa-a7b3-143cba264f34-kube-api-access-rg4z7\") pod \"61bc83ea-c8e0-40aa-a7b3-143cba264f34\" (UID: \"61bc83ea-c8e0-40aa-a7b3-143cba264f34\") " Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.741730 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61bc83ea-c8e0-40aa-a7b3-143cba264f34-ovsdbserver-nb\") pod \"61bc83ea-c8e0-40aa-a7b3-143cba264f34\" (UID: \"61bc83ea-c8e0-40aa-a7b3-143cba264f34\") " Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.741763 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/61bc83ea-c8e0-40aa-a7b3-143cba264f34-dns-swift-storage-0\") pod \"61bc83ea-c8e0-40aa-a7b3-143cba264f34\" (UID: \"61bc83ea-c8e0-40aa-a7b3-143cba264f34\") " Oct 08 22:07:44 crc 
kubenswrapper[4739]: I1008 22:07:44.741893 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61bc83ea-c8e0-40aa-a7b3-143cba264f34-config\") pod \"61bc83ea-c8e0-40aa-a7b3-143cba264f34\" (UID: \"61bc83ea-c8e0-40aa-a7b3-143cba264f34\") " Oct 08 22:07:44 crc kubenswrapper[4739]: E1008 22:07:44.743625 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7d4bed79f4ed5ef8de264d05f1614289192f408309ce885afa415b23938b9e0\": container with ID starting with e7d4bed79f4ed5ef8de264d05f1614289192f408309ce885afa415b23938b9e0 not found: ID does not exist" containerID="e7d4bed79f4ed5ef8de264d05f1614289192f408309ce885afa415b23938b9e0" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.743716 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7d4bed79f4ed5ef8de264d05f1614289192f408309ce885afa415b23938b9e0"} err="failed to get container status \"e7d4bed79f4ed5ef8de264d05f1614289192f408309ce885afa415b23938b9e0\": rpc error: code = NotFound desc = could not find container \"e7d4bed79f4ed5ef8de264d05f1614289192f408309ce885afa415b23938b9e0\": container with ID starting with e7d4bed79f4ed5ef8de264d05f1614289192f408309ce885afa415b23938b9e0 not found: ID does not exist" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.749308 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61bc83ea-c8e0-40aa-a7b3-143cba264f34-kube-api-access-rg4z7" (OuterVolumeSpecName: "kube-api-access-rg4z7") pod "61bc83ea-c8e0-40aa-a7b3-143cba264f34" (UID: "61bc83ea-c8e0-40aa-a7b3-143cba264f34"). InnerVolumeSpecName "kube-api-access-rg4z7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.809359 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61bc83ea-c8e0-40aa-a7b3-143cba264f34-config" (OuterVolumeSpecName: "config") pod "61bc83ea-c8e0-40aa-a7b3-143cba264f34" (UID: "61bc83ea-c8e0-40aa-a7b3-143cba264f34"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.810118 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61bc83ea-c8e0-40aa-a7b3-143cba264f34-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "61bc83ea-c8e0-40aa-a7b3-143cba264f34" (UID: "61bc83ea-c8e0-40aa-a7b3-143cba264f34"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.816338 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61bc83ea-c8e0-40aa-a7b3-143cba264f34-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "61bc83ea-c8e0-40aa-a7b3-143cba264f34" (UID: "61bc83ea-c8e0-40aa-a7b3-143cba264f34"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.823251 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-prlqk"] Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.825387 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61bc83ea-c8e0-40aa-a7b3-143cba264f34-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "61bc83ea-c8e0-40aa-a7b3-143cba264f34" (UID: "61bc83ea-c8e0-40aa-a7b3-143cba264f34"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.825411 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61bc83ea-c8e0-40aa-a7b3-143cba264f34-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "61bc83ea-c8e0-40aa-a7b3-143cba264f34" (UID: "61bc83ea-c8e0-40aa-a7b3-143cba264f34"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:07:44 crc kubenswrapper[4739]: W1008 22:07:44.830183 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7aa75c4b_8e80_4dc3_a46a_63419b1e46c7.slice/crio-0e443a6f79f09a2821083536dcac0715b159dfd059ad8e7c879548dba1f76db3 WatchSource:0}: Error finding container 0e443a6f79f09a2821083536dcac0715b159dfd059ad8e7c879548dba1f76db3: Status 404 returned error can't find the container with id 0e443a6f79f09a2821083536dcac0715b159dfd059ad8e7c879548dba1f76db3 Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.843286 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf5ec5fa-4ce0-4e85-994e-6c826520482b-ovsdbserver-sb\") pod \"bf5ec5fa-4ce0-4e85-994e-6c826520482b\" (UID: \"bf5ec5fa-4ce0-4e85-994e-6c826520482b\") " Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.843626 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf5ec5fa-4ce0-4e85-994e-6c826520482b-config\") pod \"bf5ec5fa-4ce0-4e85-994e-6c826520482b\" (UID: \"bf5ec5fa-4ce0-4e85-994e-6c826520482b\") " Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.843881 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzdw7\" (UniqueName: 
\"kubernetes.io/projected/bf5ec5fa-4ce0-4e85-994e-6c826520482b-kube-api-access-fzdw7\") pod \"bf5ec5fa-4ce0-4e85-994e-6c826520482b\" (UID: \"bf5ec5fa-4ce0-4e85-994e-6c826520482b\") " Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.844019 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bf5ec5fa-4ce0-4e85-994e-6c826520482b-dns-swift-storage-0\") pod \"bf5ec5fa-4ce0-4e85-994e-6c826520482b\" (UID: \"bf5ec5fa-4ce0-4e85-994e-6c826520482b\") " Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.844198 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf5ec5fa-4ce0-4e85-994e-6c826520482b-dns-svc\") pod \"bf5ec5fa-4ce0-4e85-994e-6c826520482b\" (UID: \"bf5ec5fa-4ce0-4e85-994e-6c826520482b\") " Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.845541 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf5ec5fa-4ce0-4e85-994e-6c826520482b-ovsdbserver-nb\") pod \"bf5ec5fa-4ce0-4e85-994e-6c826520482b\" (UID: \"bf5ec5fa-4ce0-4e85-994e-6c826520482b\") " Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.844576 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf5ec5fa-4ce0-4e85-994e-6c826520482b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bf5ec5fa-4ce0-4e85-994e-6c826520482b" (UID: "bf5ec5fa-4ce0-4e85-994e-6c826520482b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.844733 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf5ec5fa-4ce0-4e85-994e-6c826520482b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bf5ec5fa-4ce0-4e85-994e-6c826520482b" (UID: "bf5ec5fa-4ce0-4e85-994e-6c826520482b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.845375 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf5ec5fa-4ce0-4e85-994e-6c826520482b-config" (OuterVolumeSpecName: "config") pod "bf5ec5fa-4ce0-4e85-994e-6c826520482b" (UID: "bf5ec5fa-4ce0-4e85-994e-6c826520482b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.845402 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf5ec5fa-4ce0-4e85-994e-6c826520482b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bf5ec5fa-4ce0-4e85-994e-6c826520482b" (UID: "bf5ec5fa-4ce0-4e85-994e-6c826520482b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.846444 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf5ec5fa-4ce0-4e85-994e-6c826520482b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bf5ec5fa-4ce0-4e85-994e-6c826520482b" (UID: "bf5ec5fa-4ce0-4e85-994e-6c826520482b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.849225 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf5ec5fa-4ce0-4e85-994e-6c826520482b-kube-api-access-fzdw7" (OuterVolumeSpecName: "kube-api-access-fzdw7") pod "bf5ec5fa-4ce0-4e85-994e-6c826520482b" (UID: "bf5ec5fa-4ce0-4e85-994e-6c826520482b"). InnerVolumeSpecName "kube-api-access-fzdw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.849618 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rg4z7\" (UniqueName: \"kubernetes.io/projected/61bc83ea-c8e0-40aa-a7b3-143cba264f34-kube-api-access-rg4z7\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.849635 4739 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf5ec5fa-4ce0-4e85-994e-6c826520482b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.849644 4739 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61bc83ea-c8e0-40aa-a7b3-143cba264f34-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.849654 4739 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/61bc83ea-c8e0-40aa-a7b3-143cba264f34-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.849665 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf5ec5fa-4ce0-4e85-994e-6c826520482b-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.849675 4739 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/61bc83ea-c8e0-40aa-a7b3-143cba264f34-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.849685 4739 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bf5ec5fa-4ce0-4e85-994e-6c826520482b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.849694 4739 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf5ec5fa-4ce0-4e85-994e-6c826520482b-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.849703 4739 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf5ec5fa-4ce0-4e85-994e-6c826520482b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.849712 4739 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61bc83ea-c8e0-40aa-a7b3-143cba264f34-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.849719 4739 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61bc83ea-c8e0-40aa-a7b3-143cba264f34-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:44 crc kubenswrapper[4739]: I1008 22:07:44.951591 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzdw7\" (UniqueName: \"kubernetes.io/projected/bf5ec5fa-4ce0-4e85-994e-6c826520482b-kube-api-access-fzdw7\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.000582 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-vxd4l"] Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.019219 4739 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-vxd4l"] Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.050334 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 22:07:45 crc kubenswrapper[4739]: E1008 22:07:45.051265 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61bc83ea-c8e0-40aa-a7b3-143cba264f34" containerName="dnsmasq-dns" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.051283 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="61bc83ea-c8e0-40aa-a7b3-143cba264f34" containerName="dnsmasq-dns" Oct 08 22:07:45 crc kubenswrapper[4739]: E1008 22:07:45.051309 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61bc83ea-c8e0-40aa-a7b3-143cba264f34" containerName="init" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.051315 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="61bc83ea-c8e0-40aa-a7b3-143cba264f34" containerName="init" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.051535 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="61bc83ea-c8e0-40aa-a7b3-143cba264f34" containerName="dnsmasq-dns" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.054477 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.062575 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.071014 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.071508 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zhb2c" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.071654 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.077606 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.104832 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.106357 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.109278 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.110115 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.161283 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.260944 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8260021-8bdd-4550-8136-e28f19e51159-config-data\") pod \"glance-default-external-api-0\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.260984 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d8b032-6093-4a17-8ff4-1032b20847ca-config-data\") pod \"glance-default-internal-api-0\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.261005 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqm5f\" (UniqueName: \"kubernetes.io/projected/25d8b032-6093-4a17-8ff4-1032b20847ca-kube-api-access-tqm5f\") pod \"glance-default-internal-api-0\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.261024 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/25d8b032-6093-4a17-8ff4-1032b20847ca-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.261046 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvjnd\" (UniqueName: \"kubernetes.io/projected/a8260021-8bdd-4550-8136-e28f19e51159-kube-api-access-dvjnd\") pod \"glance-default-external-api-0\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.261074 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.261110 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8260021-8bdd-4550-8136-e28f19e51159-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.261158 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/25d8b032-6093-4a17-8ff4-1032b20847ca-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.261175 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25d8b032-6093-4a17-8ff4-1032b20847ca-scripts\") pod \"glance-default-internal-api-0\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.261199 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d8b032-6093-4a17-8ff4-1032b20847ca-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.261215 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25d8b032-6093-4a17-8ff4-1032b20847ca-logs\") pod \"glance-default-internal-api-0\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.261233 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8260021-8bdd-4550-8136-e28f19e51159-scripts\") pod \"glance-default-external-api-0\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.261252 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.261280 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a8260021-8bdd-4550-8136-e28f19e51159-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.261296 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8260021-8bdd-4550-8136-e28f19e51159-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.261334 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8260021-8bdd-4550-8136-e28f19e51159-logs\") pod \"glance-default-external-api-0\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.295187 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.365583 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvjnd\" (UniqueName: \"kubernetes.io/projected/a8260021-8bdd-4550-8136-e28f19e51159-kube-api-access-dvjnd\") pod \"glance-default-external-api-0\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.365644 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " 
pod="openstack/glance-default-external-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.365698 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8260021-8bdd-4550-8136-e28f19e51159-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.365734 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/25d8b032-6093-4a17-8ff4-1032b20847ca-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.365752 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25d8b032-6093-4a17-8ff4-1032b20847ca-scripts\") pod \"glance-default-internal-api-0\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.365766 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d8b032-6093-4a17-8ff4-1032b20847ca-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.365781 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8260021-8bdd-4550-8136-e28f19e51159-scripts\") pod \"glance-default-external-api-0\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:45 crc 
kubenswrapper[4739]: I1008 22:07:45.365799 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25d8b032-6093-4a17-8ff4-1032b20847ca-logs\") pod \"glance-default-internal-api-0\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.365818 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.365848 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8260021-8bdd-4550-8136-e28f19e51159-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.365862 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a8260021-8bdd-4550-8136-e28f19e51159-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.365899 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8260021-8bdd-4550-8136-e28f19e51159-logs\") pod \"glance-default-external-api-0\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.365918 4739 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8260021-8bdd-4550-8136-e28f19e51159-config-data\") pod \"glance-default-external-api-0\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.365934 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d8b032-6093-4a17-8ff4-1032b20847ca-config-data\") pod \"glance-default-internal-api-0\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.365952 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqm5f\" (UniqueName: \"kubernetes.io/projected/25d8b032-6093-4a17-8ff4-1032b20847ca-kube-api-access-tqm5f\") pod \"glance-default-internal-api-0\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.365967 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/25d8b032-6093-4a17-8ff4-1032b20847ca-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.366447 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/25d8b032-6093-4a17-8ff4-1032b20847ca-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.367238 4739 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.367950 4739 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.379855 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8260021-8bdd-4550-8136-e28f19e51159-logs\") pod \"glance-default-external-api-0\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.382794 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a8260021-8bdd-4550-8136-e28f19e51159-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.390578 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8260021-8bdd-4550-8136-e28f19e51159-scripts\") pod \"glance-default-external-api-0\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.393243 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8260021-8bdd-4550-8136-e28f19e51159-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.402497 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8260021-8bdd-4550-8136-e28f19e51159-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.407799 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25d8b032-6093-4a17-8ff4-1032b20847ca-logs\") pod \"glance-default-internal-api-0\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.416754 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8260021-8bdd-4550-8136-e28f19e51159-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.419191 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/25d8b032-6093-4a17-8ff4-1032b20847ca-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.419477 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d8b032-6093-4a17-8ff4-1032b20847ca-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " 
pod="openstack/glance-default-internal-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.420514 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d8b032-6093-4a17-8ff4-1032b20847ca-config-data\") pod \"glance-default-internal-api-0\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.423514 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25d8b032-6093-4a17-8ff4-1032b20847ca-scripts\") pod \"glance-default-internal-api-0\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.429988 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvjnd\" (UniqueName: \"kubernetes.io/projected/a8260021-8bdd-4550-8136-e28f19e51159-kube-api-access-dvjnd\") pod \"glance-default-external-api-0\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.435772 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqm5f\" (UniqueName: \"kubernetes.io/projected/25d8b032-6093-4a17-8ff4-1032b20847ca-kube-api-access-tqm5f\") pod \"glance-default-internal-api-0\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.450505 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 
22:07:45.455319 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.502868 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-vvmhx"] Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.506853 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-vvmhx" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.509792 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-q6fzs" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.509973 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.510118 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.537965 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-f5djz"] Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.562530 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.564352 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-674m2"] Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.571358 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g29mk\" (UniqueName: \"kubernetes.io/projected/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-kube-api-access-g29mk\") pod \"cinder-db-sync-vvmhx\" (UID: \"4da3e49a-b4ae-4375-893f-47d64b4eb0b5\") " pod="openstack/cinder-db-sync-vvmhx" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.571408 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-etc-machine-id\") pod \"cinder-db-sync-vvmhx\" (UID: \"4da3e49a-b4ae-4375-893f-47d64b4eb0b5\") " pod="openstack/cinder-db-sync-vvmhx" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.571437 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-combined-ca-bundle\") pod \"cinder-db-sync-vvmhx\" (UID: \"4da3e49a-b4ae-4375-893f-47d64b4eb0b5\") " pod="openstack/cinder-db-sync-vvmhx" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.571482 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-config-data\") pod \"cinder-db-sync-vvmhx\" (UID: \"4da3e49a-b4ae-4375-893f-47d64b4eb0b5\") " pod="openstack/cinder-db-sync-vvmhx" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.571511 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-scripts\") pod \"cinder-db-sync-vvmhx\" (UID: \"4da3e49a-b4ae-4375-893f-47d64b4eb0b5\") " pod="openstack/cinder-db-sync-vvmhx" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.571540 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-db-sync-config-data\") pod \"cinder-db-sync-vvmhx\" (UID: \"4da3e49a-b4ae-4375-893f-47d64b4eb0b5\") " pod="openstack/cinder-db-sync-vvmhx" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.593830 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-vvmhx"] Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.673442 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-config-data\") pod \"cinder-db-sync-vvmhx\" (UID: \"4da3e49a-b4ae-4375-893f-47d64b4eb0b5\") " pod="openstack/cinder-db-sync-vvmhx" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.673502 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-scripts\") pod \"cinder-db-sync-vvmhx\" (UID: \"4da3e49a-b4ae-4375-893f-47d64b4eb0b5\") " pod="openstack/cinder-db-sync-vvmhx" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.673536 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-db-sync-config-data\") pod \"cinder-db-sync-vvmhx\" (UID: \"4da3e49a-b4ae-4375-893f-47d64b4eb0b5\") " pod="openstack/cinder-db-sync-vvmhx" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.673635 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-g29mk\" (UniqueName: \"kubernetes.io/projected/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-kube-api-access-g29mk\") pod \"cinder-db-sync-vvmhx\" (UID: \"4da3e49a-b4ae-4375-893f-47d64b4eb0b5\") " pod="openstack/cinder-db-sync-vvmhx" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.673659 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-etc-machine-id\") pod \"cinder-db-sync-vvmhx\" (UID: \"4da3e49a-b4ae-4375-893f-47d64b4eb0b5\") " pod="openstack/cinder-db-sync-vvmhx" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.673685 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-combined-ca-bundle\") pod \"cinder-db-sync-vvmhx\" (UID: \"4da3e49a-b4ae-4375-893f-47d64b4eb0b5\") " pod="openstack/cinder-db-sync-vvmhx" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.676624 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-etc-machine-id\") pod \"cinder-db-sync-vvmhx\" (UID: \"4da3e49a-b4ae-4375-893f-47d64b4eb0b5\") " pod="openstack/cinder-db-sync-vvmhx" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.679323 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8fe907f-2579-491e-95b8-a71e264e9ed4","Type":"ContainerStarted","Data":"a74a3f8f1ed2465766d134c4260414baf3f2db3973f061d25e25714b0f2c6e14"} Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.680125 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-combined-ca-bundle\") pod \"cinder-db-sync-vvmhx\" (UID: \"4da3e49a-b4ae-4375-893f-47d64b4eb0b5\") " 
pod="openstack/cinder-db-sync-vvmhx" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.681103 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-config-data\") pod \"cinder-db-sync-vvmhx\" (UID: \"4da3e49a-b4ae-4375-893f-47d64b4eb0b5\") " pod="openstack/cinder-db-sync-vvmhx" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.681515 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-db-sync-config-data\") pod \"cinder-db-sync-vvmhx\" (UID: \"4da3e49a-b4ae-4375-893f-47d64b4eb0b5\") " pod="openstack/cinder-db-sync-vvmhx" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.681612 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-q5vhf"] Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.682331 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-scripts\") pod \"cinder-db-sync-vvmhx\" (UID: \"4da3e49a-b4ae-4375-893f-47d64b4eb0b5\") " pod="openstack/cinder-db-sync-vvmhx" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.682843 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-q5vhf" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.687919 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.688230 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-jgmrn" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.689459 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-f5djz" event={"ID":"3acc7290-6140-4870-a0d7-3bed7ac2b601","Type":"ContainerStarted","Data":"996b62a3f42ea9a67bd41116670c0c2219eb143a77e7dfc52d2a18c3f7037a55"} Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.699497 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-674m2" event={"ID":"b6fd1196-cd2f-4951-ad50-5dc17dac4aac","Type":"ContainerStarted","Data":"97b158247e0eb50db4dc5be226887f0036a63053d5388096d0e6368bdb227be3"} Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.700425 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-q5vhf"] Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.714615 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-8vvpb" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.716024 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-prlqk" event={"ID":"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7","Type":"ContainerStarted","Data":"33ba149770811cf8f6d0b06f8ac779b8ec2327cefbd40a418b21fc7450b41ede"} Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.716076 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-prlqk" event={"ID":"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7","Type":"ContainerStarted","Data":"0e443a6f79f09a2821083536dcac0715b159dfd059ad8e7c879548dba1f76db3"} Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.718998 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.721893 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g29mk\" (UniqueName: \"kubernetes.io/projected/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-kube-api-access-g29mk\") pod \"cinder-db-sync-vvmhx\" (UID: \"4da3e49a-b4ae-4375-893f-47d64b4eb0b5\") " pod="openstack/cinder-db-sync-vvmhx" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.770097 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-prlqk" podStartSLOduration=2.7700686169999997 podStartE2EDuration="2.770068617s" podCreationTimestamp="2025-10-08 22:07:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:07:45.744505256 +0000 UTC m=+1165.569891006" watchObservedRunningTime="2025-10-08 22:07:45.770068617 +0000 UTC m=+1165.595454367" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.846270 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-kwc5r\" (UniqueName: \"kubernetes.io/projected/836f20c4-8401-4a21-a541-0dbc92430484-kube-api-access-kwc5r\") pod \"barbican-db-sync-q5vhf\" (UID: \"836f20c4-8401-4a21-a541-0dbc92430484\") " pod="openstack/barbican-db-sync-q5vhf" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.846451 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/836f20c4-8401-4a21-a541-0dbc92430484-combined-ca-bundle\") pod \"barbican-db-sync-q5vhf\" (UID: \"836f20c4-8401-4a21-a541-0dbc92430484\") " pod="openstack/barbican-db-sync-q5vhf" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.846843 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/836f20c4-8401-4a21-a541-0dbc92430484-db-sync-config-data\") pod \"barbican-db-sync-q5vhf\" (UID: \"836f20c4-8401-4a21-a541-0dbc92430484\") " pod="openstack/barbican-db-sync-q5vhf" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.847727 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-vvmhx" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.886123 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61bc83ea-c8e0-40aa-a7b3-143cba264f34" path="/var/lib/kubelet/pods/61bc83ea-c8e0-40aa-a7b3-143cba264f34/volumes" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.887714 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-8vvpb"] Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.891287 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-8vvpb"] Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.898572 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-b64hw"] Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.903072 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-b64hw" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.904345 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-b64hw"] Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.905588 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.906347 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.906385 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-4z5dg" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.960056 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/836f20c4-8401-4a21-a541-0dbc92430484-db-sync-config-data\") pod \"barbican-db-sync-q5vhf\" (UID: \"836f20c4-8401-4a21-a541-0dbc92430484\") " 
pod="openstack/barbican-db-sync-q5vhf" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.960204 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwc5r\" (UniqueName: \"kubernetes.io/projected/836f20c4-8401-4a21-a541-0dbc92430484-kube-api-access-kwc5r\") pod \"barbican-db-sync-q5vhf\" (UID: \"836f20c4-8401-4a21-a541-0dbc92430484\") " pod="openstack/barbican-db-sync-q5vhf" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.960258 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/836f20c4-8401-4a21-a541-0dbc92430484-combined-ca-bundle\") pod \"barbican-db-sync-q5vhf\" (UID: \"836f20c4-8401-4a21-a541-0dbc92430484\") " pod="openstack/barbican-db-sync-q5vhf" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.966790 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/836f20c4-8401-4a21-a541-0dbc92430484-db-sync-config-data\") pod \"barbican-db-sync-q5vhf\" (UID: \"836f20c4-8401-4a21-a541-0dbc92430484\") " pod="openstack/barbican-db-sync-q5vhf" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.967206 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/836f20c4-8401-4a21-a541-0dbc92430484-combined-ca-bundle\") pod \"barbican-db-sync-q5vhf\" (UID: \"836f20c4-8401-4a21-a541-0dbc92430484\") " pod="openstack/barbican-db-sync-q5vhf" Oct 08 22:07:45 crc kubenswrapper[4739]: I1008 22:07:45.977592 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwc5r\" (UniqueName: \"kubernetes.io/projected/836f20c4-8401-4a21-a541-0dbc92430484-kube-api-access-kwc5r\") pod \"barbican-db-sync-q5vhf\" (UID: \"836f20c4-8401-4a21-a541-0dbc92430484\") " pod="openstack/barbican-db-sync-q5vhf" Oct 08 22:07:46 crc kubenswrapper[4739]: I1008 
22:07:46.011889 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-q5vhf" Oct 08 22:07:46 crc kubenswrapper[4739]: I1008 22:07:46.061297 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25lff\" (UniqueName: \"kubernetes.io/projected/b1381344-e404-4d04-bd00-667cfc882bcc-kube-api-access-25lff\") pod \"neutron-db-sync-b64hw\" (UID: \"b1381344-e404-4d04-bd00-667cfc882bcc\") " pod="openstack/neutron-db-sync-b64hw" Oct 08 22:07:46 crc kubenswrapper[4739]: I1008 22:07:46.061636 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1381344-e404-4d04-bd00-667cfc882bcc-combined-ca-bundle\") pod \"neutron-db-sync-b64hw\" (UID: \"b1381344-e404-4d04-bd00-667cfc882bcc\") " pod="openstack/neutron-db-sync-b64hw" Oct 08 22:07:46 crc kubenswrapper[4739]: I1008 22:07:46.061702 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b1381344-e404-4d04-bd00-667cfc882bcc-config\") pod \"neutron-db-sync-b64hw\" (UID: \"b1381344-e404-4d04-bd00-667cfc882bcc\") " pod="openstack/neutron-db-sync-b64hw" Oct 08 22:07:46 crc kubenswrapper[4739]: I1008 22:07:46.163023 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25lff\" (UniqueName: \"kubernetes.io/projected/b1381344-e404-4d04-bd00-667cfc882bcc-kube-api-access-25lff\") pod \"neutron-db-sync-b64hw\" (UID: \"b1381344-e404-4d04-bd00-667cfc882bcc\") " pod="openstack/neutron-db-sync-b64hw" Oct 08 22:07:46 crc kubenswrapper[4739]: I1008 22:07:46.163091 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1381344-e404-4d04-bd00-667cfc882bcc-combined-ca-bundle\") pod \"neutron-db-sync-b64hw\" (UID: 
\"b1381344-e404-4d04-bd00-667cfc882bcc\") " pod="openstack/neutron-db-sync-b64hw" Oct 08 22:07:46 crc kubenswrapper[4739]: I1008 22:07:46.163204 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b1381344-e404-4d04-bd00-667cfc882bcc-config\") pod \"neutron-db-sync-b64hw\" (UID: \"b1381344-e404-4d04-bd00-667cfc882bcc\") " pod="openstack/neutron-db-sync-b64hw" Oct 08 22:07:46 crc kubenswrapper[4739]: I1008 22:07:46.174924 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b1381344-e404-4d04-bd00-667cfc882bcc-config\") pod \"neutron-db-sync-b64hw\" (UID: \"b1381344-e404-4d04-bd00-667cfc882bcc\") " pod="openstack/neutron-db-sync-b64hw" Oct 08 22:07:46 crc kubenswrapper[4739]: I1008 22:07:46.206045 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1381344-e404-4d04-bd00-667cfc882bcc-combined-ca-bundle\") pod \"neutron-db-sync-b64hw\" (UID: \"b1381344-e404-4d04-bd00-667cfc882bcc\") " pod="openstack/neutron-db-sync-b64hw" Oct 08 22:07:46 crc kubenswrapper[4739]: I1008 22:07:46.209595 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25lff\" (UniqueName: \"kubernetes.io/projected/b1381344-e404-4d04-bd00-667cfc882bcc-kube-api-access-25lff\") pod \"neutron-db-sync-b64hw\" (UID: \"b1381344-e404-4d04-bd00-667cfc882bcc\") " pod="openstack/neutron-db-sync-b64hw" Oct 08 22:07:46 crc kubenswrapper[4739]: I1008 22:07:46.234405 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-b64hw" Oct 08 22:07:46 crc kubenswrapper[4739]: I1008 22:07:46.328307 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 22:07:46 crc kubenswrapper[4739]: I1008 22:07:46.437298 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 22:07:46 crc kubenswrapper[4739]: I1008 22:07:46.479777 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-vvmhx"] Oct 08 22:07:46 crc kubenswrapper[4739]: I1008 22:07:46.496113 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:07:46 crc kubenswrapper[4739]: I1008 22:07:46.615982 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 22:07:46 crc kubenswrapper[4739]: W1008 22:07:46.626283 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8260021_8bdd_4550_8136_e28f19e51159.slice/crio-45185e485a6ded31ee5afced35430a9035533d66c6e45c7abc8fa2a945fb3953 WatchSource:0}: Error finding container 45185e485a6ded31ee5afced35430a9035533d66c6e45c7abc8fa2a945fb3953: Status 404 returned error can't find the container with id 45185e485a6ded31ee5afced35430a9035533d66c6e45c7abc8fa2a945fb3953 Oct 08 22:07:46 crc kubenswrapper[4739]: I1008 22:07:46.710074 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-q5vhf"] Oct 08 22:07:46 crc kubenswrapper[4739]: W1008 22:07:46.721384 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod836f20c4_8401_4a21_a541_0dbc92430484.slice/crio-ef48ecefa21e34fceb55d5b9cf7a0d387811442e573e73163da5ce72bbe2a4e6 WatchSource:0}: Error finding container ef48ecefa21e34fceb55d5b9cf7a0d387811442e573e73163da5ce72bbe2a4e6: Status 404 returned 
error can't find the container with id ef48ecefa21e34fceb55d5b9cf7a0d387811442e573e73163da5ce72bbe2a4e6 Oct 08 22:07:46 crc kubenswrapper[4739]: I1008 22:07:46.727817 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a8260021-8bdd-4550-8136-e28f19e51159","Type":"ContainerStarted","Data":"45185e485a6ded31ee5afced35430a9035533d66c6e45c7abc8fa2a945fb3953"} Oct 08 22:07:46 crc kubenswrapper[4739]: I1008 22:07:46.728852 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vvmhx" event={"ID":"4da3e49a-b4ae-4375-893f-47d64b4eb0b5","Type":"ContainerStarted","Data":"185ce260bf266eccdbfd85ba074228ae9fe027e42f58c642b5fb887d97480db4"} Oct 08 22:07:46 crc kubenswrapper[4739]: I1008 22:07:46.730232 4739 generic.go:334] "Generic (PLEG): container finished" podID="3acc7290-6140-4870-a0d7-3bed7ac2b601" containerID="b3bfa9a9e3a072bb275aeab0302013654f9915b28fb334f5fc6ad4758ea004c5" exitCode=0 Oct 08 22:07:46 crc kubenswrapper[4739]: I1008 22:07:46.731746 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-f5djz" event={"ID":"3acc7290-6140-4870-a0d7-3bed7ac2b601","Type":"ContainerDied","Data":"b3bfa9a9e3a072bb275aeab0302013654f9915b28fb334f5fc6ad4758ea004c5"} Oct 08 22:07:46 crc kubenswrapper[4739]: I1008 22:07:46.835230 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-b64hw"] Oct 08 22:07:47 crc kubenswrapper[4739]: I1008 22:07:47.594626 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 22:07:47 crc kubenswrapper[4739]: W1008 22:07:47.613927 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25d8b032_6093_4a17_8ff4_1032b20847ca.slice/crio-8d92e5d9ceac1a5a1b2686ea0cb5e29c5568559e7e1ca4ab8e06b951560733fc WatchSource:0}: Error finding container 
8d92e5d9ceac1a5a1b2686ea0cb5e29c5568559e7e1ca4ab8e06b951560733fc: Status 404 returned error can't find the container with id 8d92e5d9ceac1a5a1b2686ea0cb5e29c5568559e7e1ca4ab8e06b951560733fc Oct 08 22:07:47 crc kubenswrapper[4739]: I1008 22:07:47.741447 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-b64hw" event={"ID":"b1381344-e404-4d04-bd00-667cfc882bcc","Type":"ContainerStarted","Data":"e0caa62d8003ed5f6c166b455a89ac239b00aee43ad29b2ef54ce2ab075d3b0d"} Oct 08 22:07:47 crc kubenswrapper[4739]: I1008 22:07:47.741489 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-b64hw" event={"ID":"b1381344-e404-4d04-bd00-667cfc882bcc","Type":"ContainerStarted","Data":"1fcdc2f65ff30cc230a2449d1f05bf02d0943dc703fa23bcb25ec314c8aa8c30"} Oct 08 22:07:47 crc kubenswrapper[4739]: I1008 22:07:47.744250 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-q5vhf" event={"ID":"836f20c4-8401-4a21-a541-0dbc92430484","Type":"ContainerStarted","Data":"ef48ecefa21e34fceb55d5b9cf7a0d387811442e573e73163da5ce72bbe2a4e6"} Oct 08 22:07:47 crc kubenswrapper[4739]: I1008 22:07:47.749759 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"25d8b032-6093-4a17-8ff4-1032b20847ca","Type":"ContainerStarted","Data":"8d92e5d9ceac1a5a1b2686ea0cb5e29c5568559e7e1ca4ab8e06b951560733fc"} Oct 08 22:07:47 crc kubenswrapper[4739]: I1008 22:07:47.763784 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a8260021-8bdd-4550-8136-e28f19e51159","Type":"ContainerStarted","Data":"ff6192f35a9d0797a650513d137cb8f8962c7971f8a08212ac753598e7f5d929"} Oct 08 22:07:47 crc kubenswrapper[4739]: I1008 22:07:47.767565 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-f5djz" 
event={"ID":"3acc7290-6140-4870-a0d7-3bed7ac2b601","Type":"ContainerStarted","Data":"2648bf9f73403d265860e283a99f9bd21b7511fcc32aa436c08be2965917ee54"} Oct 08 22:07:47 crc kubenswrapper[4739]: I1008 22:07:47.768497 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8b5c85b87-f5djz" Oct 08 22:07:47 crc kubenswrapper[4739]: I1008 22:07:47.789816 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-b64hw" podStartSLOduration=2.78979704 podStartE2EDuration="2.78979704s" podCreationTimestamp="2025-10-08 22:07:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:07:47.754266786 +0000 UTC m=+1167.579652546" watchObservedRunningTime="2025-10-08 22:07:47.78979704 +0000 UTC m=+1167.615182780" Oct 08 22:07:47 crc kubenswrapper[4739]: I1008 22:07:47.795716 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b5c85b87-f5djz" podStartSLOduration=3.795698315 podStartE2EDuration="3.795698315s" podCreationTimestamp="2025-10-08 22:07:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:07:47.783236728 +0000 UTC m=+1167.608622488" watchObservedRunningTime="2025-10-08 22:07:47.795698315 +0000 UTC m=+1167.621084065" Oct 08 22:07:47 crc kubenswrapper[4739]: I1008 22:07:47.839562 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf5ec5fa-4ce0-4e85-994e-6c826520482b" path="/var/lib/kubelet/pods/bf5ec5fa-4ce0-4e85-994e-6c826520482b/volumes" Oct 08 22:07:48 crc kubenswrapper[4739]: I1008 22:07:48.795604 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"25d8b032-6093-4a17-8ff4-1032b20847ca","Type":"ContainerStarted","Data":"ac3db39cf105a9823dd5cfb06859ffbdbfe818fdaa84553dc8e924857b2fecee"} Oct 08 22:07:48 crc kubenswrapper[4739]: I1008 22:07:48.803248 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a8260021-8bdd-4550-8136-e28f19e51159","Type":"ContainerStarted","Data":"9412097d0762b9b176533b40373e0f1de0234ed006c93620516aaf9eb2fb49bb"} Oct 08 22:07:48 crc kubenswrapper[4739]: I1008 22:07:48.803412 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a8260021-8bdd-4550-8136-e28f19e51159" containerName="glance-log" containerID="cri-o://ff6192f35a9d0797a650513d137cb8f8962c7971f8a08212ac753598e7f5d929" gracePeriod=30 Oct 08 22:07:48 crc kubenswrapper[4739]: I1008 22:07:48.803739 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a8260021-8bdd-4550-8136-e28f19e51159" containerName="glance-httpd" containerID="cri-o://9412097d0762b9b176533b40373e0f1de0234ed006c93620516aaf9eb2fb49bb" gracePeriod=30 Oct 08 22:07:48 crc kubenswrapper[4739]: I1008 22:07:48.833944 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.833930712 podStartE2EDuration="5.833930712s" podCreationTimestamp="2025-10-08 22:07:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:07:48.832717212 +0000 UTC m=+1168.658102972" watchObservedRunningTime="2025-10-08 22:07:48.833930712 +0000 UTC m=+1168.659316462" Oct 08 22:07:49 crc kubenswrapper[4739]: I1008 22:07:49.824271 4739 generic.go:334] "Generic (PLEG): container finished" podID="7aa75c4b-8e80-4dc3-a46a-63419b1e46c7" 
containerID="33ba149770811cf8f6d0b06f8ac779b8ec2327cefbd40a418b21fc7450b41ede" exitCode=0 Oct 08 22:07:49 crc kubenswrapper[4739]: I1008 22:07:49.828327 4739 generic.go:334] "Generic (PLEG): container finished" podID="a8260021-8bdd-4550-8136-e28f19e51159" containerID="9412097d0762b9b176533b40373e0f1de0234ed006c93620516aaf9eb2fb49bb" exitCode=0 Oct 08 22:07:49 crc kubenswrapper[4739]: I1008 22:07:49.828353 4739 generic.go:334] "Generic (PLEG): container finished" podID="a8260021-8bdd-4550-8136-e28f19e51159" containerID="ff6192f35a9d0797a650513d137cb8f8962c7971f8a08212ac753598e7f5d929" exitCode=143 Oct 08 22:07:49 crc kubenswrapper[4739]: I1008 22:07:49.836536 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-prlqk" event={"ID":"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7","Type":"ContainerDied","Data":"33ba149770811cf8f6d0b06f8ac779b8ec2327cefbd40a418b21fc7450b41ede"} Oct 08 22:07:49 crc kubenswrapper[4739]: I1008 22:07:49.836584 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a8260021-8bdd-4550-8136-e28f19e51159","Type":"ContainerDied","Data":"9412097d0762b9b176533b40373e0f1de0234ed006c93620516aaf9eb2fb49bb"} Oct 08 22:07:49 crc kubenswrapper[4739]: I1008 22:07:49.836598 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a8260021-8bdd-4550-8136-e28f19e51159","Type":"ContainerDied","Data":"ff6192f35a9d0797a650513d137cb8f8962c7971f8a08212ac753598e7f5d929"} Oct 08 22:07:54 crc kubenswrapper[4739]: I1008 22:07:54.734289 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b5c85b87-f5djz" Oct 08 22:07:54 crc kubenswrapper[4739]: I1008 22:07:54.823872 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-vvmw5"] Oct 08 22:07:54 crc kubenswrapper[4739]: I1008 22:07:54.824017 4739 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" podUID="841ddc2d-90ad-446e-a869-fe02ec8ce059" containerName="dnsmasq-dns" containerID="cri-o://325aa857c00b6f969fa4f65fed517b578e88c8f1204019a345845ffb1c7fd252" gracePeriod=10 Oct 08 22:07:55 crc kubenswrapper[4739]: I1008 22:07:55.927185 4739 generic.go:334] "Generic (PLEG): container finished" podID="841ddc2d-90ad-446e-a869-fe02ec8ce059" containerID="325aa857c00b6f969fa4f65fed517b578e88c8f1204019a345845ffb1c7fd252" exitCode=0 Oct 08 22:07:55 crc kubenswrapper[4739]: I1008 22:07:55.927275 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" event={"ID":"841ddc2d-90ad-446e-a869-fe02ec8ce059","Type":"ContainerDied","Data":"325aa857c00b6f969fa4f65fed517b578e88c8f1204019a345845ffb1c7fd252"} Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.785863 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-prlqk" Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.793042 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.919326 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8260021-8bdd-4550-8136-e28f19e51159-config-data\") pod \"a8260021-8bdd-4550-8136-e28f19e51159\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.919378 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-credential-keys\") pod \"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7\" (UID: \"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7\") " Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.919440 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8260021-8bdd-4550-8136-e28f19e51159-combined-ca-bundle\") pod \"a8260021-8bdd-4550-8136-e28f19e51159\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.919476 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8260021-8bdd-4550-8136-e28f19e51159-public-tls-certs\") pod \"a8260021-8bdd-4550-8136-e28f19e51159\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.919496 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-combined-ca-bundle\") pod \"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7\" (UID: \"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7\") " Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.919516 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"a8260021-8bdd-4550-8136-e28f19e51159\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.919537 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rm2d\" (UniqueName: \"kubernetes.io/projected/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-kube-api-access-7rm2d\") pod \"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7\" (UID: \"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7\") " Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.919561 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-config-data\") pod \"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7\" (UID: \"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7\") " Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.919593 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-scripts\") pod \"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7\" (UID: \"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7\") " Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.919610 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvjnd\" (UniqueName: \"kubernetes.io/projected/a8260021-8bdd-4550-8136-e28f19e51159-kube-api-access-dvjnd\") pod \"a8260021-8bdd-4550-8136-e28f19e51159\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.919631 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-fernet-keys\") pod \"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7\" (UID: \"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7\") " Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.919746 4739 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8260021-8bdd-4550-8136-e28f19e51159-logs\") pod \"a8260021-8bdd-4550-8136-e28f19e51159\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.919777 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8260021-8bdd-4550-8136-e28f19e51159-scripts\") pod \"a8260021-8bdd-4550-8136-e28f19e51159\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.919798 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a8260021-8bdd-4550-8136-e28f19e51159-httpd-run\") pod \"a8260021-8bdd-4550-8136-e28f19e51159\" (UID: \"a8260021-8bdd-4550-8136-e28f19e51159\") " Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.920590 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8260021-8bdd-4550-8136-e28f19e51159-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a8260021-8bdd-4550-8136-e28f19e51159" (UID: "a8260021-8bdd-4550-8136-e28f19e51159"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.945403 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8260021-8bdd-4550-8136-e28f19e51159-logs" (OuterVolumeSpecName: "logs") pod "a8260021-8bdd-4550-8136-e28f19e51159" (UID: "a8260021-8bdd-4550-8136-e28f19e51159"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.955462 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8260021-8bdd-4550-8136-e28f19e51159-scripts" (OuterVolumeSpecName: "scripts") pod "a8260021-8bdd-4550-8136-e28f19e51159" (UID: "a8260021-8bdd-4550-8136-e28f19e51159"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.955807 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-scripts" (OuterVolumeSpecName: "scripts") pod "7aa75c4b-8e80-4dc3-a46a-63419b1e46c7" (UID: "7aa75c4b-8e80-4dc3-a46a-63419b1e46c7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.955862 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-kube-api-access-7rm2d" (OuterVolumeSpecName: "kube-api-access-7rm2d") pod "7aa75c4b-8e80-4dc3-a46a-63419b1e46c7" (UID: "7aa75c4b-8e80-4dc3-a46a-63419b1e46c7"). InnerVolumeSpecName "kube-api-access-7rm2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.955967 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7aa75c4b-8e80-4dc3-a46a-63419b1e46c7" (UID: "7aa75c4b-8e80-4dc3-a46a-63419b1e46c7"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.956228 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7aa75c4b-8e80-4dc3-a46a-63419b1e46c7" (UID: "7aa75c4b-8e80-4dc3-a46a-63419b1e46c7"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.967602 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8260021-8bdd-4550-8136-e28f19e51159-kube-api-access-dvjnd" (OuterVolumeSpecName: "kube-api-access-dvjnd") pod "a8260021-8bdd-4550-8136-e28f19e51159" (UID: "a8260021-8bdd-4550-8136-e28f19e51159"). InnerVolumeSpecName "kube-api-access-dvjnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.976217 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "a8260021-8bdd-4550-8136-e28f19e51159" (UID: "a8260021-8bdd-4550-8136-e28f19e51159"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.976581 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-prlqk" Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.976668 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-prlqk" event={"ID":"7aa75c4b-8e80-4dc3-a46a-63419b1e46c7","Type":"ContainerDied","Data":"0e443a6f79f09a2821083536dcac0715b159dfd059ad8e7c879548dba1f76db3"} Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.976713 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e443a6f79f09a2821083536dcac0715b159dfd059ad8e7c879548dba1f76db3" Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.979471 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a8260021-8bdd-4550-8136-e28f19e51159","Type":"ContainerDied","Data":"45185e485a6ded31ee5afced35430a9035533d66c6e45c7abc8fa2a945fb3953"} Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.979531 4739 scope.go:117] "RemoveContainer" containerID="9412097d0762b9b176533b40373e0f1de0234ed006c93620516aaf9eb2fb49bb" Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.979698 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.980434 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7aa75c4b-8e80-4dc3-a46a-63419b1e46c7" (UID: "7aa75c4b-8e80-4dc3-a46a-63419b1e46c7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:07:56 crc kubenswrapper[4739]: I1008 22:07:56.997642 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8260021-8bdd-4550-8136-e28f19e51159-config-data" (OuterVolumeSpecName: "config-data") pod "a8260021-8bdd-4550-8136-e28f19e51159" (UID: "a8260021-8bdd-4550-8136-e28f19e51159"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.000760 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-config-data" (OuterVolumeSpecName: "config-data") pod "7aa75c4b-8e80-4dc3-a46a-63419b1e46c7" (UID: "7aa75c4b-8e80-4dc3-a46a-63419b1e46c7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.006879 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8260021-8bdd-4550-8136-e28f19e51159-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a8260021-8bdd-4550-8136-e28f19e51159" (UID: "a8260021-8bdd-4550-8136-e28f19e51159"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.011836 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8260021-8bdd-4550-8136-e28f19e51159-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8260021-8bdd-4550-8136-e28f19e51159" (UID: "a8260021-8bdd-4550-8136-e28f19e51159"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.021351 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8260021-8bdd-4550-8136-e28f19e51159-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.021379 4739 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.021391 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8260021-8bdd-4550-8136-e28f19e51159-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.021401 4739 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8260021-8bdd-4550-8136-e28f19e51159-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.021410 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.021438 4739 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.021448 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rm2d\" (UniqueName: \"kubernetes.io/projected/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-kube-api-access-7rm2d\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.021459 4739 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.021467 4739 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.021476 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvjnd\" (UniqueName: \"kubernetes.io/projected/a8260021-8bdd-4550-8136-e28f19e51159-kube-api-access-dvjnd\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.021485 4739 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.021493 4739 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8260021-8bdd-4550-8136-e28f19e51159-logs\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.021500 4739 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8260021-8bdd-4550-8136-e28f19e51159-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.021508 4739 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a8260021-8bdd-4550-8136-e28f19e51159-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.041423 4739 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 08 22:07:57 crc 
kubenswrapper[4739]: I1008 22:07:57.122924 4739 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.340064 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.353203 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.362505 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 22:07:57 crc kubenswrapper[4739]: E1008 22:07:57.363678 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8260021-8bdd-4550-8136-e28f19e51159" containerName="glance-httpd" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.363702 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8260021-8bdd-4550-8136-e28f19e51159" containerName="glance-httpd" Oct 08 22:07:57 crc kubenswrapper[4739]: E1008 22:07:57.363732 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8260021-8bdd-4550-8136-e28f19e51159" containerName="glance-log" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.363740 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8260021-8bdd-4550-8136-e28f19e51159" containerName="glance-log" Oct 08 22:07:57 crc kubenswrapper[4739]: E1008 22:07:57.363771 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aa75c4b-8e80-4dc3-a46a-63419b1e46c7" containerName="keystone-bootstrap" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.363781 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa75c4b-8e80-4dc3-a46a-63419b1e46c7" containerName="keystone-bootstrap" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.364120 4739 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="a8260021-8bdd-4550-8136-e28f19e51159" containerName="glance-log" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.364171 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8260021-8bdd-4550-8136-e28f19e51159" containerName="glance-httpd" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.364188 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aa75c4b-8e80-4dc3-a46a-63419b1e46c7" containerName="keystone-bootstrap" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.371013 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.373780 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.381620 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.406472 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.531035 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc637b11-4e12-4a6a-a496-1a700d3756c1-logs\") pod \"glance-default-external-api-0\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.531410 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc637b11-4e12-4a6a-a496-1a700d3756c1-scripts\") pod \"glance-default-external-api-0\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " 
pod="openstack/glance-default-external-api-0" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.531433 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk5jl\" (UniqueName: \"kubernetes.io/projected/fc637b11-4e12-4a6a-a496-1a700d3756c1-kube-api-access-kk5jl\") pod \"glance-default-external-api-0\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.531451 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc637b11-4e12-4a6a-a496-1a700d3756c1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.531519 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc637b11-4e12-4a6a-a496-1a700d3756c1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.531550 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc637b11-4e12-4a6a-a496-1a700d3756c1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.531607 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: 
\"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.531836 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc637b11-4e12-4a6a-a496-1a700d3756c1-config-data\") pod \"glance-default-external-api-0\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.634227 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc637b11-4e12-4a6a-a496-1a700d3756c1-config-data\") pod \"glance-default-external-api-0\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.634279 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc637b11-4e12-4a6a-a496-1a700d3756c1-logs\") pod \"glance-default-external-api-0\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.634326 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc637b11-4e12-4a6a-a496-1a700d3756c1-scripts\") pod \"glance-default-external-api-0\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.634348 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk5jl\" (UniqueName: \"kubernetes.io/projected/fc637b11-4e12-4a6a-a496-1a700d3756c1-kube-api-access-kk5jl\") pod \"glance-default-external-api-0\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " 
pod="openstack/glance-default-external-api-0" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.634365 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc637b11-4e12-4a6a-a496-1a700d3756c1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.634399 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc637b11-4e12-4a6a-a496-1a700d3756c1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.634441 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc637b11-4e12-4a6a-a496-1a700d3756c1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.634459 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.634882 4739 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Oct 08 
22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.635660 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc637b11-4e12-4a6a-a496-1a700d3756c1-logs\") pod \"glance-default-external-api-0\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.636846 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc637b11-4e12-4a6a-a496-1a700d3756c1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.645223 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc637b11-4e12-4a6a-a496-1a700d3756c1-config-data\") pod \"glance-default-external-api-0\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.645814 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc637b11-4e12-4a6a-a496-1a700d3756c1-scripts\") pod \"glance-default-external-api-0\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.646179 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc637b11-4e12-4a6a-a496-1a700d3756c1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.654765 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc637b11-4e12-4a6a-a496-1a700d3756c1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.671868 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk5jl\" (UniqueName: \"kubernetes.io/projected/fc637b11-4e12-4a6a-a496-1a700d3756c1-kube-api-access-kk5jl\") pod \"glance-default-external-api-0\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.730441 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " pod="openstack/glance-default-external-api-0" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.846308 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8260021-8bdd-4550-8136-e28f19e51159" path="/var/lib/kubelet/pods/a8260021-8bdd-4550-8136-e28f19e51159/volumes" Oct 08 22:07:57 crc kubenswrapper[4739]: I1008 22:07:57.996296 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-prlqk"] Oct 08 22:07:58 crc kubenswrapper[4739]: I1008 22:07:58.002890 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-prlqk"] Oct 08 22:07:58 crc kubenswrapper[4739]: I1008 22:07:58.005311 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 22:07:58 crc kubenswrapper[4739]: I1008 22:07:58.107173 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-dtlhn"] Oct 08 22:07:58 crc kubenswrapper[4739]: I1008 22:07:58.108495 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dtlhn" Oct 08 22:07:58 crc kubenswrapper[4739]: I1008 22:07:58.112527 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 08 22:07:58 crc kubenswrapper[4739]: I1008 22:07:58.112656 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-q58bw" Oct 08 22:07:58 crc kubenswrapper[4739]: I1008 22:07:58.112530 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 08 22:07:58 crc kubenswrapper[4739]: I1008 22:07:58.112554 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 08 22:07:58 crc kubenswrapper[4739]: I1008 22:07:58.121521 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dtlhn"] Oct 08 22:07:58 crc kubenswrapper[4739]: I1008 22:07:58.253678 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe3031d0-ec15-4a2e-b635-a067472da71d-combined-ca-bundle\") pod \"keystone-bootstrap-dtlhn\" (UID: \"fe3031d0-ec15-4a2e-b635-a067472da71d\") " pod="openstack/keystone-bootstrap-dtlhn" Oct 08 22:07:58 crc kubenswrapper[4739]: I1008 22:07:58.254263 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe3031d0-ec15-4a2e-b635-a067472da71d-fernet-keys\") pod \"keystone-bootstrap-dtlhn\" (UID: \"fe3031d0-ec15-4a2e-b635-a067472da71d\") " 
pod="openstack/keystone-bootstrap-dtlhn" Oct 08 22:07:58 crc kubenswrapper[4739]: I1008 22:07:58.254400 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgb7n\" (UniqueName: \"kubernetes.io/projected/fe3031d0-ec15-4a2e-b635-a067472da71d-kube-api-access-jgb7n\") pod \"keystone-bootstrap-dtlhn\" (UID: \"fe3031d0-ec15-4a2e-b635-a067472da71d\") " pod="openstack/keystone-bootstrap-dtlhn" Oct 08 22:07:58 crc kubenswrapper[4739]: I1008 22:07:58.254461 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe3031d0-ec15-4a2e-b635-a067472da71d-scripts\") pod \"keystone-bootstrap-dtlhn\" (UID: \"fe3031d0-ec15-4a2e-b635-a067472da71d\") " pod="openstack/keystone-bootstrap-dtlhn" Oct 08 22:07:58 crc kubenswrapper[4739]: I1008 22:07:58.254501 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe3031d0-ec15-4a2e-b635-a067472da71d-credential-keys\") pod \"keystone-bootstrap-dtlhn\" (UID: \"fe3031d0-ec15-4a2e-b635-a067472da71d\") " pod="openstack/keystone-bootstrap-dtlhn" Oct 08 22:07:58 crc kubenswrapper[4739]: I1008 22:07:58.254551 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe3031d0-ec15-4a2e-b635-a067472da71d-config-data\") pod \"keystone-bootstrap-dtlhn\" (UID: \"fe3031d0-ec15-4a2e-b635-a067472da71d\") " pod="openstack/keystone-bootstrap-dtlhn" Oct 08 22:07:58 crc kubenswrapper[4739]: I1008 22:07:58.356399 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe3031d0-ec15-4a2e-b635-a067472da71d-config-data\") pod \"keystone-bootstrap-dtlhn\" (UID: \"fe3031d0-ec15-4a2e-b635-a067472da71d\") " pod="openstack/keystone-bootstrap-dtlhn" 
Oct 08 22:07:58 crc kubenswrapper[4739]: I1008 22:07:58.356886 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe3031d0-ec15-4a2e-b635-a067472da71d-combined-ca-bundle\") pod \"keystone-bootstrap-dtlhn\" (UID: \"fe3031d0-ec15-4a2e-b635-a067472da71d\") " pod="openstack/keystone-bootstrap-dtlhn" Oct 08 22:07:58 crc kubenswrapper[4739]: I1008 22:07:58.357040 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe3031d0-ec15-4a2e-b635-a067472da71d-fernet-keys\") pod \"keystone-bootstrap-dtlhn\" (UID: \"fe3031d0-ec15-4a2e-b635-a067472da71d\") " pod="openstack/keystone-bootstrap-dtlhn" Oct 08 22:07:58 crc kubenswrapper[4739]: I1008 22:07:58.357093 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgb7n\" (UniqueName: \"kubernetes.io/projected/fe3031d0-ec15-4a2e-b635-a067472da71d-kube-api-access-jgb7n\") pod \"keystone-bootstrap-dtlhn\" (UID: \"fe3031d0-ec15-4a2e-b635-a067472da71d\") " pod="openstack/keystone-bootstrap-dtlhn" Oct 08 22:07:58 crc kubenswrapper[4739]: I1008 22:07:58.357134 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe3031d0-ec15-4a2e-b635-a067472da71d-scripts\") pod \"keystone-bootstrap-dtlhn\" (UID: \"fe3031d0-ec15-4a2e-b635-a067472da71d\") " pod="openstack/keystone-bootstrap-dtlhn" Oct 08 22:07:58 crc kubenswrapper[4739]: I1008 22:07:58.357199 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe3031d0-ec15-4a2e-b635-a067472da71d-credential-keys\") pod \"keystone-bootstrap-dtlhn\" (UID: \"fe3031d0-ec15-4a2e-b635-a067472da71d\") " pod="openstack/keystone-bootstrap-dtlhn" Oct 08 22:07:58 crc kubenswrapper[4739]: I1008 22:07:58.364563 4739 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe3031d0-ec15-4a2e-b635-a067472da71d-credential-keys\") pod \"keystone-bootstrap-dtlhn\" (UID: \"fe3031d0-ec15-4a2e-b635-a067472da71d\") " pod="openstack/keystone-bootstrap-dtlhn" Oct 08 22:07:58 crc kubenswrapper[4739]: I1008 22:07:58.365081 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe3031d0-ec15-4a2e-b635-a067472da71d-fernet-keys\") pod \"keystone-bootstrap-dtlhn\" (UID: \"fe3031d0-ec15-4a2e-b635-a067472da71d\") " pod="openstack/keystone-bootstrap-dtlhn" Oct 08 22:07:58 crc kubenswrapper[4739]: I1008 22:07:58.366114 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe3031d0-ec15-4a2e-b635-a067472da71d-scripts\") pod \"keystone-bootstrap-dtlhn\" (UID: \"fe3031d0-ec15-4a2e-b635-a067472da71d\") " pod="openstack/keystone-bootstrap-dtlhn" Oct 08 22:07:58 crc kubenswrapper[4739]: I1008 22:07:58.369845 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe3031d0-ec15-4a2e-b635-a067472da71d-combined-ca-bundle\") pod \"keystone-bootstrap-dtlhn\" (UID: \"fe3031d0-ec15-4a2e-b635-a067472da71d\") " pod="openstack/keystone-bootstrap-dtlhn" Oct 08 22:07:58 crc kubenswrapper[4739]: I1008 22:07:58.370614 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe3031d0-ec15-4a2e-b635-a067472da71d-config-data\") pod \"keystone-bootstrap-dtlhn\" (UID: \"fe3031d0-ec15-4a2e-b635-a067472da71d\") " pod="openstack/keystone-bootstrap-dtlhn" Oct 08 22:07:58 crc kubenswrapper[4739]: I1008 22:07:58.385694 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgb7n\" (UniqueName: 
\"kubernetes.io/projected/fe3031d0-ec15-4a2e-b635-a067472da71d-kube-api-access-jgb7n\") pod \"keystone-bootstrap-dtlhn\" (UID: \"fe3031d0-ec15-4a2e-b635-a067472da71d\") " pod="openstack/keystone-bootstrap-dtlhn" Oct 08 22:07:58 crc kubenswrapper[4739]: I1008 22:07:58.441139 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dtlhn" Oct 08 22:07:59 crc kubenswrapper[4739]: I1008 22:07:59.832193 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7aa75c4b-8e80-4dc3-a46a-63419b1e46c7" path="/var/lib/kubelet/pods/7aa75c4b-8e80-4dc3-a46a-63419b1e46c7/volumes" Oct 08 22:08:01 crc kubenswrapper[4739]: I1008 22:08:01.873589 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" podUID="841ddc2d-90ad-446e-a869-fe02ec8ce059" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.135:5353: i/o timeout" Oct 08 22:08:06 crc kubenswrapper[4739]: I1008 22:08:06.875302 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" podUID="841ddc2d-90ad-446e-a869-fe02ec8ce059" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.135:5353: i/o timeout" Oct 08 22:08:07 crc kubenswrapper[4739]: E1008 22:08:07.857226 4739 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Oct 08 22:08:07 crc kubenswrapper[4739]: E1008 22:08:07.857403 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kwc5r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-q5vhf_openstack(836f20c4-8401-4a21-a541-0dbc92430484): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 22:08:07 crc kubenswrapper[4739]: E1008 22:08:07.858625 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-q5vhf" 
podUID="836f20c4-8401-4a21-a541-0dbc92430484" Oct 08 22:08:07 crc kubenswrapper[4739]: I1008 22:08:07.958220 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" Oct 08 22:08:07 crc kubenswrapper[4739]: I1008 22:08:07.971899 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/841ddc2d-90ad-446e-a869-fe02ec8ce059-dns-svc\") pod \"841ddc2d-90ad-446e-a869-fe02ec8ce059\" (UID: \"841ddc2d-90ad-446e-a869-fe02ec8ce059\") " Oct 08 22:08:07 crc kubenswrapper[4739]: I1008 22:08:07.972109 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/841ddc2d-90ad-446e-a869-fe02ec8ce059-ovsdbserver-sb\") pod \"841ddc2d-90ad-446e-a869-fe02ec8ce059\" (UID: \"841ddc2d-90ad-446e-a869-fe02ec8ce059\") " Oct 08 22:08:07 crc kubenswrapper[4739]: I1008 22:08:07.972200 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/841ddc2d-90ad-446e-a869-fe02ec8ce059-dns-swift-storage-0\") pod \"841ddc2d-90ad-446e-a869-fe02ec8ce059\" (UID: \"841ddc2d-90ad-446e-a869-fe02ec8ce059\") " Oct 08 22:08:07 crc kubenswrapper[4739]: I1008 22:08:07.972259 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrzph\" (UniqueName: \"kubernetes.io/projected/841ddc2d-90ad-446e-a869-fe02ec8ce059-kube-api-access-jrzph\") pod \"841ddc2d-90ad-446e-a869-fe02ec8ce059\" (UID: \"841ddc2d-90ad-446e-a869-fe02ec8ce059\") " Oct 08 22:08:07 crc kubenswrapper[4739]: I1008 22:08:07.972296 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/841ddc2d-90ad-446e-a869-fe02ec8ce059-ovsdbserver-nb\") pod \"841ddc2d-90ad-446e-a869-fe02ec8ce059\" (UID: 
\"841ddc2d-90ad-446e-a869-fe02ec8ce059\") " Oct 08 22:08:07 crc kubenswrapper[4739]: I1008 22:08:07.972330 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/841ddc2d-90ad-446e-a869-fe02ec8ce059-config\") pod \"841ddc2d-90ad-446e-a869-fe02ec8ce059\" (UID: \"841ddc2d-90ad-446e-a869-fe02ec8ce059\") " Oct 08 22:08:07 crc kubenswrapper[4739]: I1008 22:08:07.978892 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/841ddc2d-90ad-446e-a869-fe02ec8ce059-kube-api-access-jrzph" (OuterVolumeSpecName: "kube-api-access-jrzph") pod "841ddc2d-90ad-446e-a869-fe02ec8ce059" (UID: "841ddc2d-90ad-446e-a869-fe02ec8ce059"). InnerVolumeSpecName "kube-api-access-jrzph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:08:08 crc kubenswrapper[4739]: I1008 22:08:08.035962 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/841ddc2d-90ad-446e-a869-fe02ec8ce059-config" (OuterVolumeSpecName: "config") pod "841ddc2d-90ad-446e-a869-fe02ec8ce059" (UID: "841ddc2d-90ad-446e-a869-fe02ec8ce059"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:08:08 crc kubenswrapper[4739]: I1008 22:08:08.042900 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/841ddc2d-90ad-446e-a869-fe02ec8ce059-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "841ddc2d-90ad-446e-a869-fe02ec8ce059" (UID: "841ddc2d-90ad-446e-a869-fe02ec8ce059"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:08:08 crc kubenswrapper[4739]: I1008 22:08:08.047618 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/841ddc2d-90ad-446e-a869-fe02ec8ce059-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "841ddc2d-90ad-446e-a869-fe02ec8ce059" (UID: "841ddc2d-90ad-446e-a869-fe02ec8ce059"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:08:08 crc kubenswrapper[4739]: I1008 22:08:08.048951 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/841ddc2d-90ad-446e-a869-fe02ec8ce059-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "841ddc2d-90ad-446e-a869-fe02ec8ce059" (UID: "841ddc2d-90ad-446e-a869-fe02ec8ce059"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:08:08 crc kubenswrapper[4739]: I1008 22:08:08.052661 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/841ddc2d-90ad-446e-a869-fe02ec8ce059-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "841ddc2d-90ad-446e-a869-fe02ec8ce059" (UID: "841ddc2d-90ad-446e-a869-fe02ec8ce059"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:08:08 crc kubenswrapper[4739]: I1008 22:08:08.074031 4739 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/841ddc2d-90ad-446e-a869-fe02ec8ce059-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:08 crc kubenswrapper[4739]: I1008 22:08:08.074065 4739 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/841ddc2d-90ad-446e-a869-fe02ec8ce059-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:08 crc kubenswrapper[4739]: I1008 22:08:08.074079 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrzph\" (UniqueName: \"kubernetes.io/projected/841ddc2d-90ad-446e-a869-fe02ec8ce059-kube-api-access-jrzph\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:08 crc kubenswrapper[4739]: I1008 22:08:08.074092 4739 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/841ddc2d-90ad-446e-a869-fe02ec8ce059-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:08 crc kubenswrapper[4739]: I1008 22:08:08.074105 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/841ddc2d-90ad-446e-a869-fe02ec8ce059-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:08 crc kubenswrapper[4739]: I1008 22:08:08.074116 4739 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/841ddc2d-90ad-446e-a869-fe02ec8ce059-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:08 crc kubenswrapper[4739]: I1008 22:08:08.125988 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" event={"ID":"841ddc2d-90ad-446e-a869-fe02ec8ce059","Type":"ContainerDied","Data":"002dee23d61a98ad749345d4d1997757afb7a079de688bd7529ca63ad40581cc"} Oct 08 22:08:08 crc 
kubenswrapper[4739]: I1008 22:08:08.126005 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" Oct 08 22:08:08 crc kubenswrapper[4739]: E1008 22:08:08.128286 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-q5vhf" podUID="836f20c4-8401-4a21-a541-0dbc92430484" Oct 08 22:08:08 crc kubenswrapper[4739]: I1008 22:08:08.162096 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-vvmw5"] Oct 08 22:08:08 crc kubenswrapper[4739]: I1008 22:08:08.168760 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-vvmw5"] Oct 08 22:08:09 crc kubenswrapper[4739]: I1008 22:08:09.839292 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="841ddc2d-90ad-446e-a869-fe02ec8ce059" path="/var/lib/kubelet/pods/841ddc2d-90ad-446e-a869-fe02ec8ce059/volumes" Oct 08 22:08:10 crc kubenswrapper[4739]: I1008 22:08:10.842786 4739 scope.go:117] "RemoveContainer" containerID="ff6192f35a9d0797a650513d137cb8f8962c7971f8a08212ac753598e7f5d929" Oct 08 22:08:10 crc kubenswrapper[4739]: E1008 22:08:10.859395 4739 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 08 22:08:10 crc kubenswrapper[4739]: E1008 22:08:10.859593 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g29mk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-vvmhx_openstack(4da3e49a-b4ae-4375-893f-47d64b4eb0b5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 22:08:10 crc kubenswrapper[4739]: E1008 22:08:10.861090 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-vvmhx" podUID="4da3e49a-b4ae-4375-893f-47d64b4eb0b5" Oct 08 22:08:10 crc kubenswrapper[4739]: I1008 22:08:10.998207 4739 scope.go:117] "RemoveContainer" containerID="325aa857c00b6f969fa4f65fed517b578e88c8f1204019a345845ffb1c7fd252" Oct 08 22:08:11 crc kubenswrapper[4739]: I1008 22:08:11.076014 4739 scope.go:117] "RemoveContainer" containerID="8905234b68165c745cb7a6cab6039f7ab0e8e784b6cd5887243944a9c991a92b" Oct 08 22:08:11 crc kubenswrapper[4739]: I1008 22:08:11.167855 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-674m2" event={"ID":"b6fd1196-cd2f-4951-ad50-5dc17dac4aac","Type":"ContainerStarted","Data":"31d526f78f7e9f5f32b87e4175245eff5b19970c45fdb4ba3de34243dab95054"} Oct 08 22:08:11 crc kubenswrapper[4739]: E1008 22:08:11.172408 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-vvmhx" podUID="4da3e49a-b4ae-4375-893f-47d64b4eb0b5" Oct 08 22:08:11 crc kubenswrapper[4739]: I1008 22:08:11.193107 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-674m2" 
podStartSLOduration=4.928379857 podStartE2EDuration="27.193090193s" podCreationTimestamp="2025-10-08 22:07:44 +0000 UTC" firstStartedPulling="2025-10-08 22:07:45.603211307 +0000 UTC m=+1165.428597057" lastFinishedPulling="2025-10-08 22:08:07.867921623 +0000 UTC m=+1187.693307393" observedRunningTime="2025-10-08 22:08:11.188872439 +0000 UTC m=+1191.014258189" watchObservedRunningTime="2025-10-08 22:08:11.193090193 +0000 UTC m=+1191.018475943" Oct 08 22:08:11 crc kubenswrapper[4739]: I1008 22:08:11.322763 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dtlhn"] Oct 08 22:08:11 crc kubenswrapper[4739]: I1008 22:08:11.516516 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 22:08:11 crc kubenswrapper[4739]: I1008 22:08:11.875986 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-vvmw5" podUID="841ddc2d-90ad-446e-a869-fe02ec8ce059" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.135:5353: i/o timeout" Oct 08 22:08:12 crc kubenswrapper[4739]: I1008 22:08:12.210956 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc637b11-4e12-4a6a-a496-1a700d3756c1","Type":"ContainerStarted","Data":"092cf9954ec00578e37231da0fc0753387c81d6929ec054a8cc1bd606d0ea2f8"} Oct 08 22:08:12 crc kubenswrapper[4739]: I1008 22:08:12.211014 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc637b11-4e12-4a6a-a496-1a700d3756c1","Type":"ContainerStarted","Data":"3c6b45b07b850409ed0af4d10714732c555012559f60dfb074cfe4edfbf8018e"} Oct 08 22:08:12 crc kubenswrapper[4739]: I1008 22:08:12.215818 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dtlhn" 
event={"ID":"fe3031d0-ec15-4a2e-b635-a067472da71d","Type":"ContainerStarted","Data":"ecaea620f32062a4add85f8e9e2142342a4fa7694e7647baabab0ac192b03702"} Oct 08 22:08:12 crc kubenswrapper[4739]: I1008 22:08:12.215979 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dtlhn" event={"ID":"fe3031d0-ec15-4a2e-b635-a067472da71d","Type":"ContainerStarted","Data":"55de198fa3b1ebe75a2cab628a77f3661a711db4e7d4b8cc113b3f4d24c562ba"} Oct 08 22:08:12 crc kubenswrapper[4739]: I1008 22:08:12.217981 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8fe907f-2579-491e-95b8-a71e264e9ed4","Type":"ContainerStarted","Data":"2105c5cf04f4b7bf905a864df4b0cafe0cc6629d89dc230faa9c2b64ba6009b5"} Oct 08 22:08:12 crc kubenswrapper[4739]: I1008 22:08:12.220887 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"25d8b032-6093-4a17-8ff4-1032b20847ca","Type":"ContainerStarted","Data":"df1107a79234a68259a298e3675f36915a1e8a90bc9ce4cc6d3f202c75729aae"} Oct 08 22:08:12 crc kubenswrapper[4739]: I1008 22:08:12.220921 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="25d8b032-6093-4a17-8ff4-1032b20847ca" containerName="glance-log" containerID="cri-o://ac3db39cf105a9823dd5cfb06859ffbdbfe818fdaa84553dc8e924857b2fecee" gracePeriod=30 Oct 08 22:08:12 crc kubenswrapper[4739]: I1008 22:08:12.221018 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="25d8b032-6093-4a17-8ff4-1032b20847ca" containerName="glance-httpd" containerID="cri-o://df1107a79234a68259a298e3675f36915a1e8a90bc9ce4cc6d3f202c75729aae" gracePeriod=30 Oct 08 22:08:12 crc kubenswrapper[4739]: I1008 22:08:12.241786 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-dtlhn" 
podStartSLOduration=14.241768716 podStartE2EDuration="14.241768716s" podCreationTimestamp="2025-10-08 22:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:08:12.234806964 +0000 UTC m=+1192.060192714" watchObservedRunningTime="2025-10-08 22:08:12.241768716 +0000 UTC m=+1192.067154466" Oct 08 22:08:12 crc kubenswrapper[4739]: I1008 22:08:12.257419 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=28.257404951 podStartE2EDuration="28.257404951s" podCreationTimestamp="2025-10-08 22:07:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:08:12.253739781 +0000 UTC m=+1192.079125521" watchObservedRunningTime="2025-10-08 22:08:12.257404951 +0000 UTC m=+1192.082790701" Oct 08 22:08:12 crc kubenswrapper[4739]: I1008 22:08:12.788991 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 22:08:12 crc kubenswrapper[4739]: I1008 22:08:12.970770 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d8b032-6093-4a17-8ff4-1032b20847ca-combined-ca-bundle\") pod \"25d8b032-6093-4a17-8ff4-1032b20847ca\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " Oct 08 22:08:12 crc kubenswrapper[4739]: I1008 22:08:12.970876 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d8b032-6093-4a17-8ff4-1032b20847ca-config-data\") pod \"25d8b032-6093-4a17-8ff4-1032b20847ca\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " Oct 08 22:08:12 crc kubenswrapper[4739]: I1008 22:08:12.970907 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25d8b032-6093-4a17-8ff4-1032b20847ca-scripts\") pod \"25d8b032-6093-4a17-8ff4-1032b20847ca\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " Oct 08 22:08:12 crc kubenswrapper[4739]: I1008 22:08:12.970942 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25d8b032-6093-4a17-8ff4-1032b20847ca-logs\") pod \"25d8b032-6093-4a17-8ff4-1032b20847ca\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " Oct 08 22:08:12 crc kubenswrapper[4739]: I1008 22:08:12.971024 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/25d8b032-6093-4a17-8ff4-1032b20847ca-httpd-run\") pod \"25d8b032-6093-4a17-8ff4-1032b20847ca\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " Oct 08 22:08:12 crc kubenswrapper[4739]: I1008 22:08:12.971064 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqm5f\" (UniqueName: 
\"kubernetes.io/projected/25d8b032-6093-4a17-8ff4-1032b20847ca-kube-api-access-tqm5f\") pod \"25d8b032-6093-4a17-8ff4-1032b20847ca\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " Oct 08 22:08:12 crc kubenswrapper[4739]: I1008 22:08:12.971094 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/25d8b032-6093-4a17-8ff4-1032b20847ca-internal-tls-certs\") pod \"25d8b032-6093-4a17-8ff4-1032b20847ca\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " Oct 08 22:08:12 crc kubenswrapper[4739]: I1008 22:08:12.971123 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"25d8b032-6093-4a17-8ff4-1032b20847ca\" (UID: \"25d8b032-6093-4a17-8ff4-1032b20847ca\") " Oct 08 22:08:12 crc kubenswrapper[4739]: I1008 22:08:12.973316 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25d8b032-6093-4a17-8ff4-1032b20847ca-logs" (OuterVolumeSpecName: "logs") pod "25d8b032-6093-4a17-8ff4-1032b20847ca" (UID: "25d8b032-6093-4a17-8ff4-1032b20847ca"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:08:12 crc kubenswrapper[4739]: I1008 22:08:12.973472 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25d8b032-6093-4a17-8ff4-1032b20847ca-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "25d8b032-6093-4a17-8ff4-1032b20847ca" (UID: "25d8b032-6093-4a17-8ff4-1032b20847ca"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:08:12 crc kubenswrapper[4739]: I1008 22:08:12.977992 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25d8b032-6093-4a17-8ff4-1032b20847ca-scripts" (OuterVolumeSpecName: "scripts") pod "25d8b032-6093-4a17-8ff4-1032b20847ca" (UID: "25d8b032-6093-4a17-8ff4-1032b20847ca"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:08:12 crc kubenswrapper[4739]: I1008 22:08:12.978070 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "25d8b032-6093-4a17-8ff4-1032b20847ca" (UID: "25d8b032-6093-4a17-8ff4-1032b20847ca"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 22:08:12 crc kubenswrapper[4739]: I1008 22:08:12.979870 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25d8b032-6093-4a17-8ff4-1032b20847ca-kube-api-access-tqm5f" (OuterVolumeSpecName: "kube-api-access-tqm5f") pod "25d8b032-6093-4a17-8ff4-1032b20847ca" (UID: "25d8b032-6093-4a17-8ff4-1032b20847ca"). InnerVolumeSpecName "kube-api-access-tqm5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.015423 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25d8b032-6093-4a17-8ff4-1032b20847ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25d8b032-6093-4a17-8ff4-1032b20847ca" (UID: "25d8b032-6093-4a17-8ff4-1032b20847ca"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.028586 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25d8b032-6093-4a17-8ff4-1032b20847ca-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "25d8b032-6093-4a17-8ff4-1032b20847ca" (UID: "25d8b032-6093-4a17-8ff4-1032b20847ca"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.037024 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25d8b032-6093-4a17-8ff4-1032b20847ca-config-data" (OuterVolumeSpecName: "config-data") pod "25d8b032-6093-4a17-8ff4-1032b20847ca" (UID: "25d8b032-6093-4a17-8ff4-1032b20847ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.073349 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d8b032-6093-4a17-8ff4-1032b20847ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.073386 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d8b032-6093-4a17-8ff4-1032b20847ca-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.073398 4739 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25d8b032-6093-4a17-8ff4-1032b20847ca-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.073408 4739 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25d8b032-6093-4a17-8ff4-1032b20847ca-logs\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:13 crc kubenswrapper[4739]: 
I1008 22:08:13.073419 4739 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/25d8b032-6093-4a17-8ff4-1032b20847ca-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.073430 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqm5f\" (UniqueName: \"kubernetes.io/projected/25d8b032-6093-4a17-8ff4-1032b20847ca-kube-api-access-tqm5f\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.073441 4739 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/25d8b032-6093-4a17-8ff4-1032b20847ca-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.073481 4739 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.092076 4739 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.175448 4739 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.230697 4739 generic.go:334] "Generic (PLEG): container finished" podID="25d8b032-6093-4a17-8ff4-1032b20847ca" containerID="df1107a79234a68259a298e3675f36915a1e8a90bc9ce4cc6d3f202c75729aae" exitCode=0 Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.230727 4739 generic.go:334] "Generic (PLEG): container finished" podID="25d8b032-6093-4a17-8ff4-1032b20847ca" containerID="ac3db39cf105a9823dd5cfb06859ffbdbfe818fdaa84553dc8e924857b2fecee" 
exitCode=143 Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.230760 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"25d8b032-6093-4a17-8ff4-1032b20847ca","Type":"ContainerDied","Data":"df1107a79234a68259a298e3675f36915a1e8a90bc9ce4cc6d3f202c75729aae"} Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.230784 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"25d8b032-6093-4a17-8ff4-1032b20847ca","Type":"ContainerDied","Data":"ac3db39cf105a9823dd5cfb06859ffbdbfe818fdaa84553dc8e924857b2fecee"} Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.230793 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"25d8b032-6093-4a17-8ff4-1032b20847ca","Type":"ContainerDied","Data":"8d92e5d9ceac1a5a1b2686ea0cb5e29c5568559e7e1ca4ab8e06b951560733fc"} Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.230807 4739 scope.go:117] "RemoveContainer" containerID="df1107a79234a68259a298e3675f36915a1e8a90bc9ce4cc6d3f202c75729aae" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.230885 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.264247 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc637b11-4e12-4a6a-a496-1a700d3756c1","Type":"ContainerStarted","Data":"8d5ad7cd87074ce62fba14a4bb0a9a5fcb5831d2c3e87e69819f05c55ff8e2bc"} Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.301488 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=16.30146378 podStartE2EDuration="16.30146378s" podCreationTimestamp="2025-10-08 22:07:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:08:13.291022913 +0000 UTC m=+1193.116408663" watchObservedRunningTime="2025-10-08 22:08:13.30146378 +0000 UTC m=+1193.126849530" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.339562 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.346179 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.383365 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 22:08:13 crc kubenswrapper[4739]: E1008 22:08:13.383875 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="841ddc2d-90ad-446e-a869-fe02ec8ce059" containerName="init" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.383900 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="841ddc2d-90ad-446e-a869-fe02ec8ce059" containerName="init" Oct 08 22:08:13 crc kubenswrapper[4739]: E1008 22:08:13.383935 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="841ddc2d-90ad-446e-a869-fe02ec8ce059" 
containerName="dnsmasq-dns" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.383947 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="841ddc2d-90ad-446e-a869-fe02ec8ce059" containerName="dnsmasq-dns" Oct 08 22:08:13 crc kubenswrapper[4739]: E1008 22:08:13.383962 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d8b032-6093-4a17-8ff4-1032b20847ca" containerName="glance-log" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.383971 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d8b032-6093-4a17-8ff4-1032b20847ca" containerName="glance-log" Oct 08 22:08:13 crc kubenswrapper[4739]: E1008 22:08:13.383990 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d8b032-6093-4a17-8ff4-1032b20847ca" containerName="glance-httpd" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.384000 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d8b032-6093-4a17-8ff4-1032b20847ca" containerName="glance-httpd" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.384225 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="841ddc2d-90ad-446e-a869-fe02ec8ce059" containerName="dnsmasq-dns" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.384254 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d8b032-6093-4a17-8ff4-1032b20847ca" containerName="glance-log" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.384286 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d8b032-6093-4a17-8ff4-1032b20847ca" containerName="glance-httpd" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.386097 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.389004 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.389452 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.394847 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.481289 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1240189-197f-4fc9-98a7-538ffdd522da-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.481373 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1240189-197f-4fc9-98a7-538ffdd522da-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.481402 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1240189-197f-4fc9-98a7-538ffdd522da-logs\") pod \"glance-default-internal-api-0\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.481437 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d1240189-197f-4fc9-98a7-538ffdd522da-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.481459 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1240189-197f-4fc9-98a7-538ffdd522da-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.481485 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.481500 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztxtr\" (UniqueName: \"kubernetes.io/projected/d1240189-197f-4fc9-98a7-538ffdd522da-kube-api-access-ztxtr\") pod \"glance-default-internal-api-0\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.481525 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1240189-197f-4fc9-98a7-538ffdd522da-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.582600 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/d1240189-197f-4fc9-98a7-538ffdd522da-logs\") pod \"glance-default-internal-api-0\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.582664 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1240189-197f-4fc9-98a7-538ffdd522da-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.582689 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1240189-197f-4fc9-98a7-538ffdd522da-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.582716 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.582731 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztxtr\" (UniqueName: \"kubernetes.io/projected/d1240189-197f-4fc9-98a7-538ffdd522da-kube-api-access-ztxtr\") pod \"glance-default-internal-api-0\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.582757 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d1240189-197f-4fc9-98a7-538ffdd522da-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.582809 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1240189-197f-4fc9-98a7-538ffdd522da-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.582846 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1240189-197f-4fc9-98a7-538ffdd522da-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.583227 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1240189-197f-4fc9-98a7-538ffdd522da-logs\") pod \"glance-default-internal-api-0\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.583294 4739 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.583996 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1240189-197f-4fc9-98a7-538ffdd522da-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.595628 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1240189-197f-4fc9-98a7-538ffdd522da-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.596097 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1240189-197f-4fc9-98a7-538ffdd522da-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.600024 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1240189-197f-4fc9-98a7-538ffdd522da-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.600800 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1240189-197f-4fc9-98a7-538ffdd522da-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.629337 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztxtr\" (UniqueName: \"kubernetes.io/projected/d1240189-197f-4fc9-98a7-538ffdd522da-kube-api-access-ztxtr\") pod \"glance-default-internal-api-0\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " 
pod="openstack/glance-default-internal-api-0" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.644537 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.709356 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.737289 4739 scope.go:117] "RemoveContainer" containerID="ac3db39cf105a9823dd5cfb06859ffbdbfe818fdaa84553dc8e924857b2fecee" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.809380 4739 scope.go:117] "RemoveContainer" containerID="df1107a79234a68259a298e3675f36915a1e8a90bc9ce4cc6d3f202c75729aae" Oct 08 22:08:13 crc kubenswrapper[4739]: E1008 22:08:13.809943 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df1107a79234a68259a298e3675f36915a1e8a90bc9ce4cc6d3f202c75729aae\": container with ID starting with df1107a79234a68259a298e3675f36915a1e8a90bc9ce4cc6d3f202c75729aae not found: ID does not exist" containerID="df1107a79234a68259a298e3675f36915a1e8a90bc9ce4cc6d3f202c75729aae" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.810175 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df1107a79234a68259a298e3675f36915a1e8a90bc9ce4cc6d3f202c75729aae"} err="failed to get container status \"df1107a79234a68259a298e3675f36915a1e8a90bc9ce4cc6d3f202c75729aae\": rpc error: code = NotFound desc = could not find container \"df1107a79234a68259a298e3675f36915a1e8a90bc9ce4cc6d3f202c75729aae\": container with ID starting with df1107a79234a68259a298e3675f36915a1e8a90bc9ce4cc6d3f202c75729aae not found: ID does not 
exist" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.810216 4739 scope.go:117] "RemoveContainer" containerID="ac3db39cf105a9823dd5cfb06859ffbdbfe818fdaa84553dc8e924857b2fecee" Oct 08 22:08:13 crc kubenswrapper[4739]: E1008 22:08:13.810532 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac3db39cf105a9823dd5cfb06859ffbdbfe818fdaa84553dc8e924857b2fecee\": container with ID starting with ac3db39cf105a9823dd5cfb06859ffbdbfe818fdaa84553dc8e924857b2fecee not found: ID does not exist" containerID="ac3db39cf105a9823dd5cfb06859ffbdbfe818fdaa84553dc8e924857b2fecee" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.810586 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac3db39cf105a9823dd5cfb06859ffbdbfe818fdaa84553dc8e924857b2fecee"} err="failed to get container status \"ac3db39cf105a9823dd5cfb06859ffbdbfe818fdaa84553dc8e924857b2fecee\": rpc error: code = NotFound desc = could not find container \"ac3db39cf105a9823dd5cfb06859ffbdbfe818fdaa84553dc8e924857b2fecee\": container with ID starting with ac3db39cf105a9823dd5cfb06859ffbdbfe818fdaa84553dc8e924857b2fecee not found: ID does not exist" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.810620 4739 scope.go:117] "RemoveContainer" containerID="df1107a79234a68259a298e3675f36915a1e8a90bc9ce4cc6d3f202c75729aae" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.811311 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df1107a79234a68259a298e3675f36915a1e8a90bc9ce4cc6d3f202c75729aae"} err="failed to get container status \"df1107a79234a68259a298e3675f36915a1e8a90bc9ce4cc6d3f202c75729aae\": rpc error: code = NotFound desc = could not find container \"df1107a79234a68259a298e3675f36915a1e8a90bc9ce4cc6d3f202c75729aae\": container with ID starting with df1107a79234a68259a298e3675f36915a1e8a90bc9ce4cc6d3f202c75729aae not found: ID 
does not exist" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.811337 4739 scope.go:117] "RemoveContainer" containerID="ac3db39cf105a9823dd5cfb06859ffbdbfe818fdaa84553dc8e924857b2fecee" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.811750 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac3db39cf105a9823dd5cfb06859ffbdbfe818fdaa84553dc8e924857b2fecee"} err="failed to get container status \"ac3db39cf105a9823dd5cfb06859ffbdbfe818fdaa84553dc8e924857b2fecee\": rpc error: code = NotFound desc = could not find container \"ac3db39cf105a9823dd5cfb06859ffbdbfe818fdaa84553dc8e924857b2fecee\": container with ID starting with ac3db39cf105a9823dd5cfb06859ffbdbfe818fdaa84553dc8e924857b2fecee not found: ID does not exist" Oct 08 22:08:13 crc kubenswrapper[4739]: I1008 22:08:13.832846 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25d8b032-6093-4a17-8ff4-1032b20847ca" path="/var/lib/kubelet/pods/25d8b032-6093-4a17-8ff4-1032b20847ca/volumes" Oct 08 22:08:14 crc kubenswrapper[4739]: I1008 22:08:14.279346 4739 generic.go:334] "Generic (PLEG): container finished" podID="b6fd1196-cd2f-4951-ad50-5dc17dac4aac" containerID="31d526f78f7e9f5f32b87e4175245eff5b19970c45fdb4ba3de34243dab95054" exitCode=0 Oct 08 22:08:14 crc kubenswrapper[4739]: I1008 22:08:14.279652 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-674m2" event={"ID":"b6fd1196-cd2f-4951-ad50-5dc17dac4aac","Type":"ContainerDied","Data":"31d526f78f7e9f5f32b87e4175245eff5b19970c45fdb4ba3de34243dab95054"} Oct 08 22:08:14 crc kubenswrapper[4739]: I1008 22:08:14.282965 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8fe907f-2579-491e-95b8-a71e264e9ed4","Type":"ContainerStarted","Data":"93a0d9d2a91e7922ca3f82a71f6212f214805f0ca1d86712b249a96acfd61d74"} Oct 08 22:08:14 crc kubenswrapper[4739]: I1008 22:08:14.394115 4739 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 22:08:14 crc kubenswrapper[4739]: W1008 22:08:14.401506 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1240189_197f_4fc9_98a7_538ffdd522da.slice/crio-b8833c8a1529d44580ea3d2cf6a1bc3672160a5169797c894a80ee7234cf3c7e WatchSource:0}: Error finding container b8833c8a1529d44580ea3d2cf6a1bc3672160a5169797c894a80ee7234cf3c7e: Status 404 returned error can't find the container with id b8833c8a1529d44580ea3d2cf6a1bc3672160a5169797c894a80ee7234cf3c7e Oct 08 22:08:15 crc kubenswrapper[4739]: I1008 22:08:15.293932 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d1240189-197f-4fc9-98a7-538ffdd522da","Type":"ContainerStarted","Data":"efac728265e15cbfbd6a69f1dea53010d867fff26f1210e7d9fe3880bd92d2da"} Oct 08 22:08:15 crc kubenswrapper[4739]: I1008 22:08:15.294320 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d1240189-197f-4fc9-98a7-538ffdd522da","Type":"ContainerStarted","Data":"b8833c8a1529d44580ea3d2cf6a1bc3672160a5169797c894a80ee7234cf3c7e"} Oct 08 22:08:15 crc kubenswrapper[4739]: I1008 22:08:15.297534 4739 generic.go:334] "Generic (PLEG): container finished" podID="b1381344-e404-4d04-bd00-667cfc882bcc" containerID="e0caa62d8003ed5f6c166b455a89ac239b00aee43ad29b2ef54ce2ab075d3b0d" exitCode=0 Oct 08 22:08:15 crc kubenswrapper[4739]: I1008 22:08:15.297593 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-b64hw" event={"ID":"b1381344-e404-4d04-bd00-667cfc882bcc","Type":"ContainerDied","Data":"e0caa62d8003ed5f6c166b455a89ac239b00aee43ad29b2ef54ce2ab075d3b0d"} Oct 08 22:08:15 crc kubenswrapper[4739]: I1008 22:08:15.301319 4739 generic.go:334] "Generic (PLEG): container finished" podID="fe3031d0-ec15-4a2e-b635-a067472da71d" 
containerID="ecaea620f32062a4add85f8e9e2142342a4fa7694e7647baabab0ac192b03702" exitCode=0 Oct 08 22:08:15 crc kubenswrapper[4739]: I1008 22:08:15.301355 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dtlhn" event={"ID":"fe3031d0-ec15-4a2e-b635-a067472da71d","Type":"ContainerDied","Data":"ecaea620f32062a4add85f8e9e2142342a4fa7694e7647baabab0ac192b03702"} Oct 08 22:08:15 crc kubenswrapper[4739]: I1008 22:08:15.652572 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-674m2" Oct 08 22:08:15 crc kubenswrapper[4739]: I1008 22:08:15.727384 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6fd1196-cd2f-4951-ad50-5dc17dac4aac-combined-ca-bundle\") pod \"b6fd1196-cd2f-4951-ad50-5dc17dac4aac\" (UID: \"b6fd1196-cd2f-4951-ad50-5dc17dac4aac\") " Oct 08 22:08:15 crc kubenswrapper[4739]: I1008 22:08:15.727607 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6fd1196-cd2f-4951-ad50-5dc17dac4aac-logs\") pod \"b6fd1196-cd2f-4951-ad50-5dc17dac4aac\" (UID: \"b6fd1196-cd2f-4951-ad50-5dc17dac4aac\") " Oct 08 22:08:15 crc kubenswrapper[4739]: I1008 22:08:15.727634 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6fd1196-cd2f-4951-ad50-5dc17dac4aac-scripts\") pod \"b6fd1196-cd2f-4951-ad50-5dc17dac4aac\" (UID: \"b6fd1196-cd2f-4951-ad50-5dc17dac4aac\") " Oct 08 22:08:15 crc kubenswrapper[4739]: I1008 22:08:15.727664 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6fd1196-cd2f-4951-ad50-5dc17dac4aac-config-data\") pod \"b6fd1196-cd2f-4951-ad50-5dc17dac4aac\" (UID: \"b6fd1196-cd2f-4951-ad50-5dc17dac4aac\") " Oct 08 22:08:15 crc kubenswrapper[4739]: 
I1008 22:08:15.727686 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cf86\" (UniqueName: \"kubernetes.io/projected/b6fd1196-cd2f-4951-ad50-5dc17dac4aac-kube-api-access-9cf86\") pod \"b6fd1196-cd2f-4951-ad50-5dc17dac4aac\" (UID: \"b6fd1196-cd2f-4951-ad50-5dc17dac4aac\") " Oct 08 22:08:15 crc kubenswrapper[4739]: I1008 22:08:15.728039 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6fd1196-cd2f-4951-ad50-5dc17dac4aac-logs" (OuterVolumeSpecName: "logs") pod "b6fd1196-cd2f-4951-ad50-5dc17dac4aac" (UID: "b6fd1196-cd2f-4951-ad50-5dc17dac4aac"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:08:15 crc kubenswrapper[4739]: I1008 22:08:15.733318 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6fd1196-cd2f-4951-ad50-5dc17dac4aac-kube-api-access-9cf86" (OuterVolumeSpecName: "kube-api-access-9cf86") pod "b6fd1196-cd2f-4951-ad50-5dc17dac4aac" (UID: "b6fd1196-cd2f-4951-ad50-5dc17dac4aac"). InnerVolumeSpecName "kube-api-access-9cf86". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:08:15 crc kubenswrapper[4739]: I1008 22:08:15.733336 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6fd1196-cd2f-4951-ad50-5dc17dac4aac-scripts" (OuterVolumeSpecName: "scripts") pod "b6fd1196-cd2f-4951-ad50-5dc17dac4aac" (UID: "b6fd1196-cd2f-4951-ad50-5dc17dac4aac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:08:15 crc kubenswrapper[4739]: I1008 22:08:15.753293 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6fd1196-cd2f-4951-ad50-5dc17dac4aac-config-data" (OuterVolumeSpecName: "config-data") pod "b6fd1196-cd2f-4951-ad50-5dc17dac4aac" (UID: "b6fd1196-cd2f-4951-ad50-5dc17dac4aac"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:08:15 crc kubenswrapper[4739]: I1008 22:08:15.753752 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6fd1196-cd2f-4951-ad50-5dc17dac4aac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6fd1196-cd2f-4951-ad50-5dc17dac4aac" (UID: "b6fd1196-cd2f-4951-ad50-5dc17dac4aac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:08:15 crc kubenswrapper[4739]: I1008 22:08:15.829260 4739 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6fd1196-cd2f-4951-ad50-5dc17dac4aac-logs\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:15 crc kubenswrapper[4739]: I1008 22:08:15.829288 4739 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6fd1196-cd2f-4951-ad50-5dc17dac4aac-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:15 crc kubenswrapper[4739]: I1008 22:08:15.829297 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6fd1196-cd2f-4951-ad50-5dc17dac4aac-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:15 crc kubenswrapper[4739]: I1008 22:08:15.829306 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cf86\" (UniqueName: \"kubernetes.io/projected/b6fd1196-cd2f-4951-ad50-5dc17dac4aac-kube-api-access-9cf86\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:15 crc kubenswrapper[4739]: I1008 22:08:15.829317 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6fd1196-cd2f-4951-ad50-5dc17dac4aac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.335401 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-674m2" 
event={"ID":"b6fd1196-cd2f-4951-ad50-5dc17dac4aac","Type":"ContainerDied","Data":"97b158247e0eb50db4dc5be226887f0036a63053d5388096d0e6368bdb227be3"} Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.335763 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97b158247e0eb50db4dc5be226887f0036a63053d5388096d0e6368bdb227be3" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.335434 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-674m2" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.345658 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d1240189-197f-4fc9-98a7-538ffdd522da","Type":"ContainerStarted","Data":"8752ddc99cd48cd577f2eb560e4f1a5194fef80d51dab2eeeb20d1ab2e4bc013"} Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.368137 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.368101554 podStartE2EDuration="3.368101554s" podCreationTimestamp="2025-10-08 22:08:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:08:16.363226345 +0000 UTC m=+1196.188612095" watchObservedRunningTime="2025-10-08 22:08:16.368101554 +0000 UTC m=+1196.193487304" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.421456 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6c88568bb8-rh6ln"] Oct 08 22:08:16 crc kubenswrapper[4739]: E1008 22:08:16.421801 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6fd1196-cd2f-4951-ad50-5dc17dac4aac" containerName="placement-db-sync" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.421819 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6fd1196-cd2f-4951-ad50-5dc17dac4aac" containerName="placement-db-sync" 
Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.421994 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6fd1196-cd2f-4951-ad50-5dc17dac4aac" containerName="placement-db-sync" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.422922 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6c88568bb8-rh6ln" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.433379 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.433482 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.433764 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.434556 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-bskv6" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.437010 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.445079 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6c88568bb8-rh6ln"] Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.543507 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4de51af0-00c3-4a08-a13b-819a118cb604-logs\") pod \"placement-6c88568bb8-rh6ln\" (UID: \"4de51af0-00c3-4a08-a13b-819a118cb604\") " pod="openstack/placement-6c88568bb8-rh6ln" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.543589 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4de51af0-00c3-4a08-a13b-819a118cb604-public-tls-certs\") pod \"placement-6c88568bb8-rh6ln\" (UID: \"4de51af0-00c3-4a08-a13b-819a118cb604\") " pod="openstack/placement-6c88568bb8-rh6ln" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.543619 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de51af0-00c3-4a08-a13b-819a118cb604-combined-ca-bundle\") pod \"placement-6c88568bb8-rh6ln\" (UID: \"4de51af0-00c3-4a08-a13b-819a118cb604\") " pod="openstack/placement-6c88568bb8-rh6ln" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.543636 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4de51af0-00c3-4a08-a13b-819a118cb604-scripts\") pod \"placement-6c88568bb8-rh6ln\" (UID: \"4de51af0-00c3-4a08-a13b-819a118cb604\") " pod="openstack/placement-6c88568bb8-rh6ln" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.543660 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de51af0-00c3-4a08-a13b-819a118cb604-internal-tls-certs\") pod \"placement-6c88568bb8-rh6ln\" (UID: \"4de51af0-00c3-4a08-a13b-819a118cb604\") " pod="openstack/placement-6c88568bb8-rh6ln" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.543713 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4de51af0-00c3-4a08-a13b-819a118cb604-config-data\") pod \"placement-6c88568bb8-rh6ln\" (UID: \"4de51af0-00c3-4a08-a13b-819a118cb604\") " pod="openstack/placement-6c88568bb8-rh6ln" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.543750 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxtzq\" 
(UniqueName: \"kubernetes.io/projected/4de51af0-00c3-4a08-a13b-819a118cb604-kube-api-access-fxtzq\") pod \"placement-6c88568bb8-rh6ln\" (UID: \"4de51af0-00c3-4a08-a13b-819a118cb604\") " pod="openstack/placement-6c88568bb8-rh6ln" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.645318 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de51af0-00c3-4a08-a13b-819a118cb604-internal-tls-certs\") pod \"placement-6c88568bb8-rh6ln\" (UID: \"4de51af0-00c3-4a08-a13b-819a118cb604\") " pod="openstack/placement-6c88568bb8-rh6ln" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.645415 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4de51af0-00c3-4a08-a13b-819a118cb604-config-data\") pod \"placement-6c88568bb8-rh6ln\" (UID: \"4de51af0-00c3-4a08-a13b-819a118cb604\") " pod="openstack/placement-6c88568bb8-rh6ln" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.645461 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxtzq\" (UniqueName: \"kubernetes.io/projected/4de51af0-00c3-4a08-a13b-819a118cb604-kube-api-access-fxtzq\") pod \"placement-6c88568bb8-rh6ln\" (UID: \"4de51af0-00c3-4a08-a13b-819a118cb604\") " pod="openstack/placement-6c88568bb8-rh6ln" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.645493 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4de51af0-00c3-4a08-a13b-819a118cb604-logs\") pod \"placement-6c88568bb8-rh6ln\" (UID: \"4de51af0-00c3-4a08-a13b-819a118cb604\") " pod="openstack/placement-6c88568bb8-rh6ln" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.645543 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4de51af0-00c3-4a08-a13b-819a118cb604-public-tls-certs\") pod \"placement-6c88568bb8-rh6ln\" (UID: \"4de51af0-00c3-4a08-a13b-819a118cb604\") " pod="openstack/placement-6c88568bb8-rh6ln" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.645572 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de51af0-00c3-4a08-a13b-819a118cb604-combined-ca-bundle\") pod \"placement-6c88568bb8-rh6ln\" (UID: \"4de51af0-00c3-4a08-a13b-819a118cb604\") " pod="openstack/placement-6c88568bb8-rh6ln" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.645594 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4de51af0-00c3-4a08-a13b-819a118cb604-scripts\") pod \"placement-6c88568bb8-rh6ln\" (UID: \"4de51af0-00c3-4a08-a13b-819a118cb604\") " pod="openstack/placement-6c88568bb8-rh6ln" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.645988 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4de51af0-00c3-4a08-a13b-819a118cb604-logs\") pod \"placement-6c88568bb8-rh6ln\" (UID: \"4de51af0-00c3-4a08-a13b-819a118cb604\") " pod="openstack/placement-6c88568bb8-rh6ln" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.649746 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4de51af0-00c3-4a08-a13b-819a118cb604-scripts\") pod \"placement-6c88568bb8-rh6ln\" (UID: \"4de51af0-00c3-4a08-a13b-819a118cb604\") " pod="openstack/placement-6c88568bb8-rh6ln" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.649806 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de51af0-00c3-4a08-a13b-819a118cb604-internal-tls-certs\") pod \"placement-6c88568bb8-rh6ln\" (UID: 
\"4de51af0-00c3-4a08-a13b-819a118cb604\") " pod="openstack/placement-6c88568bb8-rh6ln" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.651602 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de51af0-00c3-4a08-a13b-819a118cb604-combined-ca-bundle\") pod \"placement-6c88568bb8-rh6ln\" (UID: \"4de51af0-00c3-4a08-a13b-819a118cb604\") " pod="openstack/placement-6c88568bb8-rh6ln" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.653585 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de51af0-00c3-4a08-a13b-819a118cb604-public-tls-certs\") pod \"placement-6c88568bb8-rh6ln\" (UID: \"4de51af0-00c3-4a08-a13b-819a118cb604\") " pod="openstack/placement-6c88568bb8-rh6ln" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.661622 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4de51af0-00c3-4a08-a13b-819a118cb604-config-data\") pod \"placement-6c88568bb8-rh6ln\" (UID: \"4de51af0-00c3-4a08-a13b-819a118cb604\") " pod="openstack/placement-6c88568bb8-rh6ln" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.661914 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxtzq\" (UniqueName: \"kubernetes.io/projected/4de51af0-00c3-4a08-a13b-819a118cb604-kube-api-access-fxtzq\") pod \"placement-6c88568bb8-rh6ln\" (UID: \"4de51af0-00c3-4a08-a13b-819a118cb604\") " pod="openstack/placement-6c88568bb8-rh6ln" Oct 08 22:08:16 crc kubenswrapper[4739]: I1008 22:08:16.753236 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6c88568bb8-rh6ln" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.272798 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-dtlhn" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.355981 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe3031d0-ec15-4a2e-b635-a067472da71d-combined-ca-bundle\") pod \"fe3031d0-ec15-4a2e-b635-a067472da71d\" (UID: \"fe3031d0-ec15-4a2e-b635-a067472da71d\") " Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.356070 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe3031d0-ec15-4a2e-b635-a067472da71d-scripts\") pod \"fe3031d0-ec15-4a2e-b635-a067472da71d\" (UID: \"fe3031d0-ec15-4a2e-b635-a067472da71d\") " Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.356133 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe3031d0-ec15-4a2e-b635-a067472da71d-fernet-keys\") pod \"fe3031d0-ec15-4a2e-b635-a067472da71d\" (UID: \"fe3031d0-ec15-4a2e-b635-a067472da71d\") " Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.356187 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe3031d0-ec15-4a2e-b635-a067472da71d-credential-keys\") pod \"fe3031d0-ec15-4a2e-b635-a067472da71d\" (UID: \"fe3031d0-ec15-4a2e-b635-a067472da71d\") " Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.356214 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe3031d0-ec15-4a2e-b635-a067472da71d-config-data\") pod \"fe3031d0-ec15-4a2e-b635-a067472da71d\" (UID: \"fe3031d0-ec15-4a2e-b635-a067472da71d\") " Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.356249 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgb7n\" (UniqueName: 
\"kubernetes.io/projected/fe3031d0-ec15-4a2e-b635-a067472da71d-kube-api-access-jgb7n\") pod \"fe3031d0-ec15-4a2e-b635-a067472da71d\" (UID: \"fe3031d0-ec15-4a2e-b635-a067472da71d\") " Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.386493 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe3031d0-ec15-4a2e-b635-a067472da71d-scripts" (OuterVolumeSpecName: "scripts") pod "fe3031d0-ec15-4a2e-b635-a067472da71d" (UID: "fe3031d0-ec15-4a2e-b635-a067472da71d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.386573 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe3031d0-ec15-4a2e-b635-a067472da71d-kube-api-access-jgb7n" (OuterVolumeSpecName: "kube-api-access-jgb7n") pod "fe3031d0-ec15-4a2e-b635-a067472da71d" (UID: "fe3031d0-ec15-4a2e-b635-a067472da71d"). InnerVolumeSpecName "kube-api-access-jgb7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.386607 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe3031d0-ec15-4a2e-b635-a067472da71d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "fe3031d0-ec15-4a2e-b635-a067472da71d" (UID: "fe3031d0-ec15-4a2e-b635-a067472da71d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.386774 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe3031d0-ec15-4a2e-b635-a067472da71d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "fe3031d0-ec15-4a2e-b635-a067472da71d" (UID: "fe3031d0-ec15-4a2e-b635-a067472da71d"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.388740 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dtlhn" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.388956 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dtlhn" event={"ID":"fe3031d0-ec15-4a2e-b635-a067472da71d","Type":"ContainerDied","Data":"55de198fa3b1ebe75a2cab628a77f3661a711db4e7d4b8cc113b3f4d24c562ba"} Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.388985 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55de198fa3b1ebe75a2cab628a77f3661a711db4e7d4b8cc113b3f4d24c562ba" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.404336 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe3031d0-ec15-4a2e-b635-a067472da71d-config-data" (OuterVolumeSpecName: "config-data") pod "fe3031d0-ec15-4a2e-b635-a067472da71d" (UID: "fe3031d0-ec15-4a2e-b635-a067472da71d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.454193 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-55468d9c4f-z8pn5"] Oct 08 22:08:17 crc kubenswrapper[4739]: E1008 22:08:17.454688 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe3031d0-ec15-4a2e-b635-a067472da71d" containerName="keystone-bootstrap" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.454772 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe3031d0-ec15-4a2e-b635-a067472da71d" containerName="keystone-bootstrap" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.454999 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe3031d0-ec15-4a2e-b635-a067472da71d" containerName="keystone-bootstrap" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.455679 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-55468d9c4f-z8pn5" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.458270 4739 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe3031d0-ec15-4a2e-b635-a067472da71d-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.462467 4739 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe3031d0-ec15-4a2e-b635-a067472da71d-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.462587 4739 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe3031d0-ec15-4a2e-b635-a067472da71d-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.462655 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fe3031d0-ec15-4a2e-b635-a067472da71d-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.462737 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgb7n\" (UniqueName: \"kubernetes.io/projected/fe3031d0-ec15-4a2e-b635-a067472da71d-kube-api-access-jgb7n\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.461624 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.461664 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.467498 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe3031d0-ec15-4a2e-b635-a067472da71d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe3031d0-ec15-4a2e-b635-a067472da71d" (UID: "fe3031d0-ec15-4a2e-b635-a067472da71d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.471927 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-55468d9c4f-z8pn5"] Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.564680 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7923886e-2cbf-489b-aabd-aa49c710fbf0-config-data\") pod \"keystone-55468d9c4f-z8pn5\" (UID: \"7923886e-2cbf-489b-aabd-aa49c710fbf0\") " pod="openstack/keystone-55468d9c4f-z8pn5" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.564731 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7923886e-2cbf-489b-aabd-aa49c710fbf0-scripts\") pod \"keystone-55468d9c4f-z8pn5\" (UID: \"7923886e-2cbf-489b-aabd-aa49c710fbf0\") " pod="openstack/keystone-55468d9c4f-z8pn5" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.564756 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7923886e-2cbf-489b-aabd-aa49c710fbf0-public-tls-certs\") pod \"keystone-55468d9c4f-z8pn5\" (UID: \"7923886e-2cbf-489b-aabd-aa49c710fbf0\") " pod="openstack/keystone-55468d9c4f-z8pn5" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.564799 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7923886e-2cbf-489b-aabd-aa49c710fbf0-credential-keys\") pod \"keystone-55468d9c4f-z8pn5\" (UID: \"7923886e-2cbf-489b-aabd-aa49c710fbf0\") " pod="openstack/keystone-55468d9c4f-z8pn5" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.564824 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7923886e-2cbf-489b-aabd-aa49c710fbf0-combined-ca-bundle\") pod \"keystone-55468d9c4f-z8pn5\" (UID: \"7923886e-2cbf-489b-aabd-aa49c710fbf0\") " pod="openstack/keystone-55468d9c4f-z8pn5" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.564850 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvnx7\" (UniqueName: \"kubernetes.io/projected/7923886e-2cbf-489b-aabd-aa49c710fbf0-kube-api-access-fvnx7\") pod \"keystone-55468d9c4f-z8pn5\" (UID: \"7923886e-2cbf-489b-aabd-aa49c710fbf0\") " pod="openstack/keystone-55468d9c4f-z8pn5" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.564889 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7923886e-2cbf-489b-aabd-aa49c710fbf0-fernet-keys\") pod \"keystone-55468d9c4f-z8pn5\" (UID: \"7923886e-2cbf-489b-aabd-aa49c710fbf0\") " pod="openstack/keystone-55468d9c4f-z8pn5" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.564916 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7923886e-2cbf-489b-aabd-aa49c710fbf0-internal-tls-certs\") pod \"keystone-55468d9c4f-z8pn5\" (UID: \"7923886e-2cbf-489b-aabd-aa49c710fbf0\") " pod="openstack/keystone-55468d9c4f-z8pn5" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.564983 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe3031d0-ec15-4a2e-b635-a067472da71d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.668286 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7923886e-2cbf-489b-aabd-aa49c710fbf0-fernet-keys\") pod \"keystone-55468d9c4f-z8pn5\" (UID: 
\"7923886e-2cbf-489b-aabd-aa49c710fbf0\") " pod="openstack/keystone-55468d9c4f-z8pn5" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.668389 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7923886e-2cbf-489b-aabd-aa49c710fbf0-internal-tls-certs\") pod \"keystone-55468d9c4f-z8pn5\" (UID: \"7923886e-2cbf-489b-aabd-aa49c710fbf0\") " pod="openstack/keystone-55468d9c4f-z8pn5" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.668577 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7923886e-2cbf-489b-aabd-aa49c710fbf0-config-data\") pod \"keystone-55468d9c4f-z8pn5\" (UID: \"7923886e-2cbf-489b-aabd-aa49c710fbf0\") " pod="openstack/keystone-55468d9c4f-z8pn5" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.668622 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7923886e-2cbf-489b-aabd-aa49c710fbf0-scripts\") pod \"keystone-55468d9c4f-z8pn5\" (UID: \"7923886e-2cbf-489b-aabd-aa49c710fbf0\") " pod="openstack/keystone-55468d9c4f-z8pn5" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.668661 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7923886e-2cbf-489b-aabd-aa49c710fbf0-public-tls-certs\") pod \"keystone-55468d9c4f-z8pn5\" (UID: \"7923886e-2cbf-489b-aabd-aa49c710fbf0\") " pod="openstack/keystone-55468d9c4f-z8pn5" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.668758 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7923886e-2cbf-489b-aabd-aa49c710fbf0-credential-keys\") pod \"keystone-55468d9c4f-z8pn5\" (UID: \"7923886e-2cbf-489b-aabd-aa49c710fbf0\") " pod="openstack/keystone-55468d9c4f-z8pn5" Oct 08 22:08:17 crc 
kubenswrapper[4739]: I1008 22:08:17.668792 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7923886e-2cbf-489b-aabd-aa49c710fbf0-combined-ca-bundle\") pod \"keystone-55468d9c4f-z8pn5\" (UID: \"7923886e-2cbf-489b-aabd-aa49c710fbf0\") " pod="openstack/keystone-55468d9c4f-z8pn5" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.668810 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvnx7\" (UniqueName: \"kubernetes.io/projected/7923886e-2cbf-489b-aabd-aa49c710fbf0-kube-api-access-fvnx7\") pod \"keystone-55468d9c4f-z8pn5\" (UID: \"7923886e-2cbf-489b-aabd-aa49c710fbf0\") " pod="openstack/keystone-55468d9c4f-z8pn5" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.672764 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7923886e-2cbf-489b-aabd-aa49c710fbf0-credential-keys\") pod \"keystone-55468d9c4f-z8pn5\" (UID: \"7923886e-2cbf-489b-aabd-aa49c710fbf0\") " pod="openstack/keystone-55468d9c4f-z8pn5" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.673018 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7923886e-2cbf-489b-aabd-aa49c710fbf0-config-data\") pod \"keystone-55468d9c4f-z8pn5\" (UID: \"7923886e-2cbf-489b-aabd-aa49c710fbf0\") " pod="openstack/keystone-55468d9c4f-z8pn5" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.673089 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7923886e-2cbf-489b-aabd-aa49c710fbf0-combined-ca-bundle\") pod \"keystone-55468d9c4f-z8pn5\" (UID: \"7923886e-2cbf-489b-aabd-aa49c710fbf0\") " pod="openstack/keystone-55468d9c4f-z8pn5" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.673430 4739 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7923886e-2cbf-489b-aabd-aa49c710fbf0-public-tls-certs\") pod \"keystone-55468d9c4f-z8pn5\" (UID: \"7923886e-2cbf-489b-aabd-aa49c710fbf0\") " pod="openstack/keystone-55468d9c4f-z8pn5" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.673863 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7923886e-2cbf-489b-aabd-aa49c710fbf0-internal-tls-certs\") pod \"keystone-55468d9c4f-z8pn5\" (UID: \"7923886e-2cbf-489b-aabd-aa49c710fbf0\") " pod="openstack/keystone-55468d9c4f-z8pn5" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.674662 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7923886e-2cbf-489b-aabd-aa49c710fbf0-scripts\") pod \"keystone-55468d9c4f-z8pn5\" (UID: \"7923886e-2cbf-489b-aabd-aa49c710fbf0\") " pod="openstack/keystone-55468d9c4f-z8pn5" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.683608 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7923886e-2cbf-489b-aabd-aa49c710fbf0-fernet-keys\") pod \"keystone-55468d9c4f-z8pn5\" (UID: \"7923886e-2cbf-489b-aabd-aa49c710fbf0\") " pod="openstack/keystone-55468d9c4f-z8pn5" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.685347 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvnx7\" (UniqueName: \"kubernetes.io/projected/7923886e-2cbf-489b-aabd-aa49c710fbf0-kube-api-access-fvnx7\") pod \"keystone-55468d9c4f-z8pn5\" (UID: \"7923886e-2cbf-489b-aabd-aa49c710fbf0\") " pod="openstack/keystone-55468d9c4f-z8pn5" Oct 08 22:08:17 crc kubenswrapper[4739]: I1008 22:08:17.801929 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-55468d9c4f-z8pn5" Oct 08 22:08:18 crc kubenswrapper[4739]: I1008 22:08:18.005702 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 08 22:08:18 crc kubenswrapper[4739]: I1008 22:08:18.005984 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 08 22:08:18 crc kubenswrapper[4739]: I1008 22:08:18.050400 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 08 22:08:18 crc kubenswrapper[4739]: I1008 22:08:18.056446 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 08 22:08:18 crc kubenswrapper[4739]: I1008 22:08:18.397753 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 08 22:08:18 crc kubenswrapper[4739]: I1008 22:08:18.397801 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 08 22:08:19 crc kubenswrapper[4739]: I1008 22:08:19.187296 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-b64hw" Oct 08 22:08:19 crc kubenswrapper[4739]: I1008 22:08:19.308048 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b1381344-e404-4d04-bd00-667cfc882bcc-config\") pod \"b1381344-e404-4d04-bd00-667cfc882bcc\" (UID: \"b1381344-e404-4d04-bd00-667cfc882bcc\") " Oct 08 22:08:19 crc kubenswrapper[4739]: I1008 22:08:19.308277 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25lff\" (UniqueName: \"kubernetes.io/projected/b1381344-e404-4d04-bd00-667cfc882bcc-kube-api-access-25lff\") pod \"b1381344-e404-4d04-bd00-667cfc882bcc\" (UID: \"b1381344-e404-4d04-bd00-667cfc882bcc\") " Oct 08 22:08:19 crc kubenswrapper[4739]: I1008 22:08:19.308372 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1381344-e404-4d04-bd00-667cfc882bcc-combined-ca-bundle\") pod \"b1381344-e404-4d04-bd00-667cfc882bcc\" (UID: \"b1381344-e404-4d04-bd00-667cfc882bcc\") " Oct 08 22:08:19 crc kubenswrapper[4739]: I1008 22:08:19.314347 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1381344-e404-4d04-bd00-667cfc882bcc-kube-api-access-25lff" (OuterVolumeSpecName: "kube-api-access-25lff") pod "b1381344-e404-4d04-bd00-667cfc882bcc" (UID: "b1381344-e404-4d04-bd00-667cfc882bcc"). InnerVolumeSpecName "kube-api-access-25lff". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:08:19 crc kubenswrapper[4739]: I1008 22:08:19.337398 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1381344-e404-4d04-bd00-667cfc882bcc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1381344-e404-4d04-bd00-667cfc882bcc" (UID: "b1381344-e404-4d04-bd00-667cfc882bcc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:08:19 crc kubenswrapper[4739]: I1008 22:08:19.347251 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1381344-e404-4d04-bd00-667cfc882bcc-config" (OuterVolumeSpecName: "config") pod "b1381344-e404-4d04-bd00-667cfc882bcc" (UID: "b1381344-e404-4d04-bd00-667cfc882bcc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:08:19 crc kubenswrapper[4739]: I1008 22:08:19.408432 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8fe907f-2579-491e-95b8-a71e264e9ed4","Type":"ContainerStarted","Data":"fc001940f5597bf6843b72765a794c9af33a7163075ae447bad6a3cfe28395cd"} Oct 08 22:08:19 crc kubenswrapper[4739]: I1008 22:08:19.410355 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-b64hw" event={"ID":"b1381344-e404-4d04-bd00-667cfc882bcc","Type":"ContainerDied","Data":"1fcdc2f65ff30cc230a2449d1f05bf02d0943dc703fa23bcb25ec314c8aa8c30"} Oct 08 22:08:19 crc kubenswrapper[4739]: I1008 22:08:19.410388 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fcdc2f65ff30cc230a2449d1f05bf02d0943dc703fa23bcb25ec314c8aa8c30" Oct 08 22:08:19 crc kubenswrapper[4739]: I1008 22:08:19.410403 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-b64hw" Oct 08 22:08:19 crc kubenswrapper[4739]: I1008 22:08:19.411632 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1381344-e404-4d04-bd00-667cfc882bcc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:19 crc kubenswrapper[4739]: I1008 22:08:19.411659 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b1381344-e404-4d04-bd00-667cfc882bcc-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:19 crc kubenswrapper[4739]: I1008 22:08:19.411669 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25lff\" (UniqueName: \"kubernetes.io/projected/b1381344-e404-4d04-bd00-667cfc882bcc-kube-api-access-25lff\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:19 crc kubenswrapper[4739]: I1008 22:08:19.526284 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6c88568bb8-rh6ln"] Oct 08 22:08:19 crc kubenswrapper[4739]: I1008 22:08:19.549578 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-55468d9c4f-z8pn5"] Oct 08 22:08:19 crc kubenswrapper[4739]: W1008 22:08:19.553115 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7923886e_2cbf_489b_aabd_aa49c710fbf0.slice/crio-5f38ae0eba63785ff6d81da0786d391f9ca78cbf9ef9890491b0cb979ff5d28d WatchSource:0}: Error finding container 5f38ae0eba63785ff6d81da0786d391f9ca78cbf9ef9890491b0cb979ff5d28d: Status 404 returned error can't find the container with id 5f38ae0eba63785ff6d81da0786d391f9ca78cbf9ef9890491b0cb979ff5d28d Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.401691 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.441481 4739 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/keystone-55468d9c4f-z8pn5" event={"ID":"7923886e-2cbf-489b-aabd-aa49c710fbf0","Type":"ContainerStarted","Data":"ad840c2c1908693346d4ffb797f649f85001ee316a990db0dc0c54feedd44257"} Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.441528 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-55468d9c4f-z8pn5" event={"ID":"7923886e-2cbf-489b-aabd-aa49c710fbf0","Type":"ContainerStarted","Data":"5f38ae0eba63785ff6d81da0786d391f9ca78cbf9ef9890491b0cb979ff5d28d"} Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.441687 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-55468d9c4f-z8pn5" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.443254 4739 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.443360 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6c88568bb8-rh6ln" event={"ID":"4de51af0-00c3-4a08-a13b-819a118cb604","Type":"ContainerStarted","Data":"0dbdea33850f1577f05a30c2160b148d924673fddcb229747143ae52a4424971"} Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.443414 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6c88568bb8-rh6ln" event={"ID":"4de51af0-00c3-4a08-a13b-819a118cb604","Type":"ContainerStarted","Data":"143b5e34bc494ac5fe1c5f0453e1a03dabd32e82c8af5376f45a0a3ac22613b0"} Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.443435 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6c88568bb8-rh6ln" event={"ID":"4de51af0-00c3-4a08-a13b-819a118cb604","Type":"ContainerStarted","Data":"6d574012380d7335b58a2fc1e153f2f088496504a76d7bdb57aa048316262991"} Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.487754 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-6tj2b"] Oct 08 22:08:20 crc kubenswrapper[4739]: 
E1008 22:08:20.488263 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1381344-e404-4d04-bd00-667cfc882bcc" containerName="neutron-db-sync" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.488289 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1381344-e404-4d04-bd00-667cfc882bcc" containerName="neutron-db-sync" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.488463 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1381344-e404-4d04-bd00-667cfc882bcc" containerName="neutron-db-sync" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.491702 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.510854 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-6tj2b"] Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.520221 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-55468d9c4f-z8pn5" podStartSLOduration=3.520196498 podStartE2EDuration="3.520196498s" podCreationTimestamp="2025-10-08 22:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:08:20.497429496 +0000 UTC m=+1200.322815246" watchObservedRunningTime="2025-10-08 22:08:20.520196498 +0000 UTC m=+1200.345582248" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.547628 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/11aeee8b-976d-4802-a571-476642cd56c8-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-6tj2b\" (UID: \"11aeee8b-976d-4802-a571-476642cd56c8\") " pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.547739 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg5mr\" (UniqueName: \"kubernetes.io/projected/11aeee8b-976d-4802-a571-476642cd56c8-kube-api-access-rg5mr\") pod \"dnsmasq-dns-84b966f6c9-6tj2b\" (UID: \"11aeee8b-976d-4802-a571-476642cd56c8\") " pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.547800 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11aeee8b-976d-4802-a571-476642cd56c8-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-6tj2b\" (UID: \"11aeee8b-976d-4802-a571-476642cd56c8\") " pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.547892 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11aeee8b-976d-4802-a571-476642cd56c8-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-6tj2b\" (UID: \"11aeee8b-976d-4802-a571-476642cd56c8\") " pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.547934 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11aeee8b-976d-4802-a571-476642cd56c8-config\") pod \"dnsmasq-dns-84b966f6c9-6tj2b\" (UID: \"11aeee8b-976d-4802-a571-476642cd56c8\") " pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.547951 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11aeee8b-976d-4802-a571-476642cd56c8-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-6tj2b\" (UID: \"11aeee8b-976d-4802-a571-476642cd56c8\") " pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.551772 4739 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6c88568bb8-rh6ln" podStartSLOduration=4.551757004 podStartE2EDuration="4.551757004s" podCreationTimestamp="2025-10-08 22:08:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:08:20.538882547 +0000 UTC m=+1200.364268297" watchObservedRunningTime="2025-10-08 22:08:20.551757004 +0000 UTC m=+1200.377142754" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.608896 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-cf7957586-49vcm"] Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.610357 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-cf7957586-49vcm" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.616809 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-4z5dg" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.616968 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.617160 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.617313 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.635538 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cf7957586-49vcm"] Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.651003 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fe20a8d-2c00-4441-92dd-f92da148433b-ovndb-tls-certs\") pod \"neutron-cf7957586-49vcm\" (UID: 
\"7fe20a8d-2c00-4441-92dd-f92da148433b\") " pod="openstack/neutron-cf7957586-49vcm" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.651080 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fe20a8d-2c00-4441-92dd-f92da148433b-combined-ca-bundle\") pod \"neutron-cf7957586-49vcm\" (UID: \"7fe20a8d-2c00-4441-92dd-f92da148433b\") " pod="openstack/neutron-cf7957586-49vcm" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.651110 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/11aeee8b-976d-4802-a571-476642cd56c8-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-6tj2b\" (UID: \"11aeee8b-976d-4802-a571-476642cd56c8\") " pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.651158 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg5mr\" (UniqueName: \"kubernetes.io/projected/11aeee8b-976d-4802-a571-476642cd56c8-kube-api-access-rg5mr\") pod \"dnsmasq-dns-84b966f6c9-6tj2b\" (UID: \"11aeee8b-976d-4802-a571-476642cd56c8\") " pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.651184 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7fe20a8d-2c00-4441-92dd-f92da148433b-config\") pod \"neutron-cf7957586-49vcm\" (UID: \"7fe20a8d-2c00-4441-92dd-f92da148433b\") " pod="openstack/neutron-cf7957586-49vcm" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.651202 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11aeee8b-976d-4802-a571-476642cd56c8-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-6tj2b\" (UID: 
\"11aeee8b-976d-4802-a571-476642cd56c8\") " pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.651224 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7fe20a8d-2c00-4441-92dd-f92da148433b-httpd-config\") pod \"neutron-cf7957586-49vcm\" (UID: \"7fe20a8d-2c00-4441-92dd-f92da148433b\") " pod="openstack/neutron-cf7957586-49vcm" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.651618 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11aeee8b-976d-4802-a571-476642cd56c8-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-6tj2b\" (UID: \"11aeee8b-976d-4802-a571-476642cd56c8\") " pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.651666 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wszdx\" (UniqueName: \"kubernetes.io/projected/7fe20a8d-2c00-4441-92dd-f92da148433b-kube-api-access-wszdx\") pod \"neutron-cf7957586-49vcm\" (UID: \"7fe20a8d-2c00-4441-92dd-f92da148433b\") " pod="openstack/neutron-cf7957586-49vcm" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.651701 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11aeee8b-976d-4802-a571-476642cd56c8-config\") pod \"dnsmasq-dns-84b966f6c9-6tj2b\" (UID: \"11aeee8b-976d-4802-a571-476642cd56c8\") " pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.651719 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11aeee8b-976d-4802-a571-476642cd56c8-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-6tj2b\" (UID: \"11aeee8b-976d-4802-a571-476642cd56c8\") " 
pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.652228 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/11aeee8b-976d-4802-a571-476642cd56c8-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-6tj2b\" (UID: \"11aeee8b-976d-4802-a571-476642cd56c8\") " pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.652715 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11aeee8b-976d-4802-a571-476642cd56c8-config\") pod \"dnsmasq-dns-84b966f6c9-6tj2b\" (UID: \"11aeee8b-976d-4802-a571-476642cd56c8\") " pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.652972 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11aeee8b-976d-4802-a571-476642cd56c8-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-6tj2b\" (UID: \"11aeee8b-976d-4802-a571-476642cd56c8\") " pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.654006 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11aeee8b-976d-4802-a571-476642cd56c8-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-6tj2b\" (UID: \"11aeee8b-976d-4802-a571-476642cd56c8\") " pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.657702 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11aeee8b-976d-4802-a571-476642cd56c8-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-6tj2b\" (UID: \"11aeee8b-976d-4802-a571-476642cd56c8\") " pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.679859 4739 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg5mr\" (UniqueName: \"kubernetes.io/projected/11aeee8b-976d-4802-a571-476642cd56c8-kube-api-access-rg5mr\") pod \"dnsmasq-dns-84b966f6c9-6tj2b\" (UID: \"11aeee8b-976d-4802-a571-476642cd56c8\") " pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.731171 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.753944 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7fe20a8d-2c00-4441-92dd-f92da148433b-config\") pod \"neutron-cf7957586-49vcm\" (UID: \"7fe20a8d-2c00-4441-92dd-f92da148433b\") " pod="openstack/neutron-cf7957586-49vcm" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.753999 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7fe20a8d-2c00-4441-92dd-f92da148433b-httpd-config\") pod \"neutron-cf7957586-49vcm\" (UID: \"7fe20a8d-2c00-4441-92dd-f92da148433b\") " pod="openstack/neutron-cf7957586-49vcm" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.754061 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wszdx\" (UniqueName: \"kubernetes.io/projected/7fe20a8d-2c00-4441-92dd-f92da148433b-kube-api-access-wszdx\") pod \"neutron-cf7957586-49vcm\" (UID: \"7fe20a8d-2c00-4441-92dd-f92da148433b\") " pod="openstack/neutron-cf7957586-49vcm" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.754088 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fe20a8d-2c00-4441-92dd-f92da148433b-ovndb-tls-certs\") pod \"neutron-cf7957586-49vcm\" (UID: \"7fe20a8d-2c00-4441-92dd-f92da148433b\") " 
pod="openstack/neutron-cf7957586-49vcm" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.754136 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fe20a8d-2c00-4441-92dd-f92da148433b-combined-ca-bundle\") pod \"neutron-cf7957586-49vcm\" (UID: \"7fe20a8d-2c00-4441-92dd-f92da148433b\") " pod="openstack/neutron-cf7957586-49vcm" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.757367 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fe20a8d-2c00-4441-92dd-f92da148433b-combined-ca-bundle\") pod \"neutron-cf7957586-49vcm\" (UID: \"7fe20a8d-2c00-4441-92dd-f92da148433b\") " pod="openstack/neutron-cf7957586-49vcm" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.757422 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7fe20a8d-2c00-4441-92dd-f92da148433b-config\") pod \"neutron-cf7957586-49vcm\" (UID: \"7fe20a8d-2c00-4441-92dd-f92da148433b\") " pod="openstack/neutron-cf7957586-49vcm" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.760852 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7fe20a8d-2c00-4441-92dd-f92da148433b-httpd-config\") pod \"neutron-cf7957586-49vcm\" (UID: \"7fe20a8d-2c00-4441-92dd-f92da148433b\") " pod="openstack/neutron-cf7957586-49vcm" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.764208 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fe20a8d-2c00-4441-92dd-f92da148433b-ovndb-tls-certs\") pod \"neutron-cf7957586-49vcm\" (UID: \"7fe20a8d-2c00-4441-92dd-f92da148433b\") " pod="openstack/neutron-cf7957586-49vcm" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.782888 4739 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wszdx\" (UniqueName: \"kubernetes.io/projected/7fe20a8d-2c00-4441-92dd-f92da148433b-kube-api-access-wszdx\") pod \"neutron-cf7957586-49vcm\" (UID: \"7fe20a8d-2c00-4441-92dd-f92da148433b\") " pod="openstack/neutron-cf7957586-49vcm" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.819721 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" Oct 08 22:08:20 crc kubenswrapper[4739]: I1008 22:08:20.937648 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-cf7957586-49vcm" Oct 08 22:08:21 crc kubenswrapper[4739]: I1008 22:08:21.454338 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6c88568bb8-rh6ln" Oct 08 22:08:21 crc kubenswrapper[4739]: I1008 22:08:21.454379 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6c88568bb8-rh6ln" Oct 08 22:08:21 crc kubenswrapper[4739]: I1008 22:08:21.770749 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:08:21 crc kubenswrapper[4739]: I1008 22:08:21.773321 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:08:22 crc kubenswrapper[4739]: I1008 22:08:22.523905 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dc6d4cfc5-7ks2h"] Oct 08 22:08:22 crc kubenswrapper[4739]: I1008 22:08:22.526037 4739 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/neutron-dc6d4cfc5-7ks2h" Oct 08 22:08:22 crc kubenswrapper[4739]: I1008 22:08:22.530829 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 08 22:08:22 crc kubenswrapper[4739]: I1008 22:08:22.531017 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 08 22:08:22 crc kubenswrapper[4739]: I1008 22:08:22.540130 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dc6d4cfc5-7ks2h"] Oct 08 22:08:22 crc kubenswrapper[4739]: I1008 22:08:22.609041 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2a09a54-dd22-4b47-b5bd-49685c152d9f-ovndb-tls-certs\") pod \"neutron-dc6d4cfc5-7ks2h\" (UID: \"f2a09a54-dd22-4b47-b5bd-49685c152d9f\") " pod="openstack/neutron-dc6d4cfc5-7ks2h" Oct 08 22:08:22 crc kubenswrapper[4739]: I1008 22:08:22.609138 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2a09a54-dd22-4b47-b5bd-49685c152d9f-internal-tls-certs\") pod \"neutron-dc6d4cfc5-7ks2h\" (UID: \"f2a09a54-dd22-4b47-b5bd-49685c152d9f\") " pod="openstack/neutron-dc6d4cfc5-7ks2h" Oct 08 22:08:22 crc kubenswrapper[4739]: I1008 22:08:22.609266 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2a09a54-dd22-4b47-b5bd-49685c152d9f-combined-ca-bundle\") pod \"neutron-dc6d4cfc5-7ks2h\" (UID: \"f2a09a54-dd22-4b47-b5bd-49685c152d9f\") " pod="openstack/neutron-dc6d4cfc5-7ks2h" Oct 08 22:08:22 crc kubenswrapper[4739]: I1008 22:08:22.609288 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/f2a09a54-dd22-4b47-b5bd-49685c152d9f-config\") pod \"neutron-dc6d4cfc5-7ks2h\" (UID: \"f2a09a54-dd22-4b47-b5bd-49685c152d9f\") " pod="openstack/neutron-dc6d4cfc5-7ks2h" Oct 08 22:08:22 crc kubenswrapper[4739]: I1008 22:08:22.609311 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2a09a54-dd22-4b47-b5bd-49685c152d9f-public-tls-certs\") pod \"neutron-dc6d4cfc5-7ks2h\" (UID: \"f2a09a54-dd22-4b47-b5bd-49685c152d9f\") " pod="openstack/neutron-dc6d4cfc5-7ks2h" Oct 08 22:08:22 crc kubenswrapper[4739]: I1008 22:08:22.609355 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f2a09a54-dd22-4b47-b5bd-49685c152d9f-httpd-config\") pod \"neutron-dc6d4cfc5-7ks2h\" (UID: \"f2a09a54-dd22-4b47-b5bd-49685c152d9f\") " pod="openstack/neutron-dc6d4cfc5-7ks2h" Oct 08 22:08:22 crc kubenswrapper[4739]: I1008 22:08:22.609387 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck7fk\" (UniqueName: \"kubernetes.io/projected/f2a09a54-dd22-4b47-b5bd-49685c152d9f-kube-api-access-ck7fk\") pod \"neutron-dc6d4cfc5-7ks2h\" (UID: \"f2a09a54-dd22-4b47-b5bd-49685c152d9f\") " pod="openstack/neutron-dc6d4cfc5-7ks2h" Oct 08 22:08:22 crc kubenswrapper[4739]: I1008 22:08:22.669648 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-6tj2b"] Oct 08 22:08:22 crc kubenswrapper[4739]: I1008 22:08:22.714652 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck7fk\" (UniqueName: \"kubernetes.io/projected/f2a09a54-dd22-4b47-b5bd-49685c152d9f-kube-api-access-ck7fk\") pod \"neutron-dc6d4cfc5-7ks2h\" (UID: \"f2a09a54-dd22-4b47-b5bd-49685c152d9f\") " pod="openstack/neutron-dc6d4cfc5-7ks2h" Oct 08 22:08:22 crc kubenswrapper[4739]: I1008 
22:08:22.714725 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2a09a54-dd22-4b47-b5bd-49685c152d9f-ovndb-tls-certs\") pod \"neutron-dc6d4cfc5-7ks2h\" (UID: \"f2a09a54-dd22-4b47-b5bd-49685c152d9f\") " pod="openstack/neutron-dc6d4cfc5-7ks2h" Oct 08 22:08:22 crc kubenswrapper[4739]: I1008 22:08:22.714760 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2a09a54-dd22-4b47-b5bd-49685c152d9f-internal-tls-certs\") pod \"neutron-dc6d4cfc5-7ks2h\" (UID: \"f2a09a54-dd22-4b47-b5bd-49685c152d9f\") " pod="openstack/neutron-dc6d4cfc5-7ks2h" Oct 08 22:08:22 crc kubenswrapper[4739]: I1008 22:08:22.714829 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2a09a54-dd22-4b47-b5bd-49685c152d9f-combined-ca-bundle\") pod \"neutron-dc6d4cfc5-7ks2h\" (UID: \"f2a09a54-dd22-4b47-b5bd-49685c152d9f\") " pod="openstack/neutron-dc6d4cfc5-7ks2h" Oct 08 22:08:22 crc kubenswrapper[4739]: I1008 22:08:22.714849 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f2a09a54-dd22-4b47-b5bd-49685c152d9f-config\") pod \"neutron-dc6d4cfc5-7ks2h\" (UID: \"f2a09a54-dd22-4b47-b5bd-49685c152d9f\") " pod="openstack/neutron-dc6d4cfc5-7ks2h" Oct 08 22:08:22 crc kubenswrapper[4739]: I1008 22:08:22.714871 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2a09a54-dd22-4b47-b5bd-49685c152d9f-public-tls-certs\") pod \"neutron-dc6d4cfc5-7ks2h\" (UID: \"f2a09a54-dd22-4b47-b5bd-49685c152d9f\") " pod="openstack/neutron-dc6d4cfc5-7ks2h" Oct 08 22:08:22 crc kubenswrapper[4739]: I1008 22:08:22.714895 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" 
(UniqueName: \"kubernetes.io/secret/f2a09a54-dd22-4b47-b5bd-49685c152d9f-httpd-config\") pod \"neutron-dc6d4cfc5-7ks2h\" (UID: \"f2a09a54-dd22-4b47-b5bd-49685c152d9f\") " pod="openstack/neutron-dc6d4cfc5-7ks2h" Oct 08 22:08:22 crc kubenswrapper[4739]: I1008 22:08:22.738442 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f2a09a54-dd22-4b47-b5bd-49685c152d9f-config\") pod \"neutron-dc6d4cfc5-7ks2h\" (UID: \"f2a09a54-dd22-4b47-b5bd-49685c152d9f\") " pod="openstack/neutron-dc6d4cfc5-7ks2h" Oct 08 22:08:22 crc kubenswrapper[4739]: I1008 22:08:22.740772 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2a09a54-dd22-4b47-b5bd-49685c152d9f-public-tls-certs\") pod \"neutron-dc6d4cfc5-7ks2h\" (UID: \"f2a09a54-dd22-4b47-b5bd-49685c152d9f\") " pod="openstack/neutron-dc6d4cfc5-7ks2h" Oct 08 22:08:22 crc kubenswrapper[4739]: I1008 22:08:22.741853 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f2a09a54-dd22-4b47-b5bd-49685c152d9f-httpd-config\") pod \"neutron-dc6d4cfc5-7ks2h\" (UID: \"f2a09a54-dd22-4b47-b5bd-49685c152d9f\") " pod="openstack/neutron-dc6d4cfc5-7ks2h" Oct 08 22:08:22 crc kubenswrapper[4739]: I1008 22:08:22.748493 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2a09a54-dd22-4b47-b5bd-49685c152d9f-combined-ca-bundle\") pod \"neutron-dc6d4cfc5-7ks2h\" (UID: \"f2a09a54-dd22-4b47-b5bd-49685c152d9f\") " pod="openstack/neutron-dc6d4cfc5-7ks2h" Oct 08 22:08:22 crc kubenswrapper[4739]: I1008 22:08:22.749082 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2a09a54-dd22-4b47-b5bd-49685c152d9f-ovndb-tls-certs\") pod \"neutron-dc6d4cfc5-7ks2h\" (UID: 
\"f2a09a54-dd22-4b47-b5bd-49685c152d9f\") " pod="openstack/neutron-dc6d4cfc5-7ks2h" Oct 08 22:08:22 crc kubenswrapper[4739]: I1008 22:08:22.755978 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2a09a54-dd22-4b47-b5bd-49685c152d9f-internal-tls-certs\") pod \"neutron-dc6d4cfc5-7ks2h\" (UID: \"f2a09a54-dd22-4b47-b5bd-49685c152d9f\") " pod="openstack/neutron-dc6d4cfc5-7ks2h" Oct 08 22:08:22 crc kubenswrapper[4739]: I1008 22:08:22.762185 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck7fk\" (UniqueName: \"kubernetes.io/projected/f2a09a54-dd22-4b47-b5bd-49685c152d9f-kube-api-access-ck7fk\") pod \"neutron-dc6d4cfc5-7ks2h\" (UID: \"f2a09a54-dd22-4b47-b5bd-49685c152d9f\") " pod="openstack/neutron-dc6d4cfc5-7ks2h" Oct 08 22:08:22 crc kubenswrapper[4739]: I1008 22:08:22.850056 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dc6d4cfc5-7ks2h" Oct 08 22:08:22 crc kubenswrapper[4739]: I1008 22:08:22.939120 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cf7957586-49vcm"] Oct 08 22:08:23 crc kubenswrapper[4739]: I1008 22:08:23.437887 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dc6d4cfc5-7ks2h"] Oct 08 22:08:23 crc kubenswrapper[4739]: W1008 22:08:23.443301 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2a09a54_dd22_4b47_b5bd_49685c152d9f.slice/crio-820b5150b7b144f3fd1e2e7e849421e9776690df9e3e747e163f7f96348fd053 WatchSource:0}: Error finding container 820b5150b7b144f3fd1e2e7e849421e9776690df9e3e747e163f7f96348fd053: Status 404 returned error can't find the container with id 820b5150b7b144f3fd1e2e7e849421e9776690df9e3e747e163f7f96348fd053 Oct 08 22:08:23 crc kubenswrapper[4739]: I1008 22:08:23.479956 4739 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/neutron-dc6d4cfc5-7ks2h" event={"ID":"f2a09a54-dd22-4b47-b5bd-49685c152d9f","Type":"ContainerStarted","Data":"820b5150b7b144f3fd1e2e7e849421e9776690df9e3e747e163f7f96348fd053"} Oct 08 22:08:23 crc kubenswrapper[4739]: I1008 22:08:23.483875 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cf7957586-49vcm" event={"ID":"7fe20a8d-2c00-4441-92dd-f92da148433b","Type":"ContainerStarted","Data":"9bf91a6876d628402308360b4414bb932f05d5681ecf00e0a542a9c13abe57e0"} Oct 08 22:08:23 crc kubenswrapper[4739]: I1008 22:08:23.483903 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cf7957586-49vcm" event={"ID":"7fe20a8d-2c00-4441-92dd-f92da148433b","Type":"ContainerStarted","Data":"48066fd56d175a613e4a0fe33a8e50ae919b1b5b7746e611b87b000561c2bf00"} Oct 08 22:08:23 crc kubenswrapper[4739]: I1008 22:08:23.483913 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cf7957586-49vcm" event={"ID":"7fe20a8d-2c00-4441-92dd-f92da148433b","Type":"ContainerStarted","Data":"7fcbe14acfab568bf24ac68fd5233c80947b9541b9785b773ea0c5765c5bdd9a"} Oct 08 22:08:23 crc kubenswrapper[4739]: I1008 22:08:23.484034 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-cf7957586-49vcm" Oct 08 22:08:23 crc kubenswrapper[4739]: I1008 22:08:23.488031 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-q5vhf" event={"ID":"836f20c4-8401-4a21-a541-0dbc92430484","Type":"ContainerStarted","Data":"b19346be300cfdbed834b66c2d55b6e0c3ab28352d9bf5313453e5fef1fbc2d1"} Oct 08 22:08:23 crc kubenswrapper[4739]: I1008 22:08:23.489238 4739 generic.go:334] "Generic (PLEG): container finished" podID="11aeee8b-976d-4802-a571-476642cd56c8" containerID="74351b5c357ad1fbc3d11ff34928b2b52201daf4f576a797eaaee41cd07a3b51" exitCode=0 Oct 08 22:08:23 crc kubenswrapper[4739]: I1008 22:08:23.489268 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" event={"ID":"11aeee8b-976d-4802-a571-476642cd56c8","Type":"ContainerDied","Data":"74351b5c357ad1fbc3d11ff34928b2b52201daf4f576a797eaaee41cd07a3b51"} Oct 08 22:08:23 crc kubenswrapper[4739]: I1008 22:08:23.489291 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" event={"ID":"11aeee8b-976d-4802-a571-476642cd56c8","Type":"ContainerStarted","Data":"1278ce6c1ea09312e610f2be90667588487be40795485f86b3dfaaeb4e3f2703"} Oct 08 22:08:23 crc kubenswrapper[4739]: I1008 22:08:23.508777 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-cf7957586-49vcm" podStartSLOduration=3.508758829 podStartE2EDuration="3.508758829s" podCreationTimestamp="2025-10-08 22:08:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:08:23.500054695 +0000 UTC m=+1203.325440465" watchObservedRunningTime="2025-10-08 22:08:23.508758829 +0000 UTC m=+1203.334144589" Oct 08 22:08:23 crc kubenswrapper[4739]: I1008 22:08:23.540648 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-q5vhf" podStartSLOduration=2.066782207 podStartE2EDuration="38.540629614s" podCreationTimestamp="2025-10-08 22:07:45 +0000 UTC" firstStartedPulling="2025-10-08 22:07:46.723269037 +0000 UTC m=+1166.548654787" lastFinishedPulling="2025-10-08 22:08:23.197116444 +0000 UTC m=+1203.022502194" observedRunningTime="2025-10-08 22:08:23.53639431 +0000 UTC m=+1203.361780090" watchObservedRunningTime="2025-10-08 22:08:23.540629614 +0000 UTC m=+1203.366015364" Oct 08 22:08:23 crc kubenswrapper[4739]: I1008 22:08:23.709496 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 08 22:08:23 crc kubenswrapper[4739]: I1008 22:08:23.710340 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 08 22:08:23 crc kubenswrapper[4739]: I1008 22:08:23.760230 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 08 22:08:23 crc kubenswrapper[4739]: I1008 22:08:23.764237 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 08 22:08:24 crc kubenswrapper[4739]: I1008 22:08:24.498324 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" event={"ID":"11aeee8b-976d-4802-a571-476642cd56c8","Type":"ContainerStarted","Data":"3bea46fdb598aa91fd00df601703f1bf73f29a825c8fe2271077335e49246eba"} Oct 08 22:08:24 crc kubenswrapper[4739]: I1008 22:08:24.500943 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" Oct 08 22:08:24 crc kubenswrapper[4739]: I1008 22:08:24.501860 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dc6d4cfc5-7ks2h" event={"ID":"f2a09a54-dd22-4b47-b5bd-49685c152d9f","Type":"ContainerStarted","Data":"7cbe12bb0e9d55ea98dc5ea4997ff076cc58474e1d5ba2b51eef4dbbb8cf7ea9"} Oct 08 22:08:24 crc kubenswrapper[4739]: I1008 22:08:24.501900 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dc6d4cfc5-7ks2h" event={"ID":"f2a09a54-dd22-4b47-b5bd-49685c152d9f","Type":"ContainerStarted","Data":"b2e6fa1ba72bc06b12b5415f0c639be84374f9382b8472dbd5d05fcac457669b"} Oct 08 22:08:24 crc kubenswrapper[4739]: I1008 22:08:24.502788 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 08 22:08:24 crc kubenswrapper[4739]: I1008 22:08:24.502956 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 08 22:08:24 crc kubenswrapper[4739]: I1008 22:08:24.516366 4739 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" podStartSLOduration=4.516351121 podStartE2EDuration="4.516351121s" podCreationTimestamp="2025-10-08 22:08:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:08:24.513388428 +0000 UTC m=+1204.338774178" watchObservedRunningTime="2025-10-08 22:08:24.516351121 +0000 UTC m=+1204.341736871" Oct 08 22:08:24 crc kubenswrapper[4739]: I1008 22:08:24.534466 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dc6d4cfc5-7ks2h" podStartSLOduration=2.534447136 podStartE2EDuration="2.534447136s" podCreationTimestamp="2025-10-08 22:08:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:08:24.530226112 +0000 UTC m=+1204.355611862" watchObservedRunningTime="2025-10-08 22:08:24.534447136 +0000 UTC m=+1204.359832896" Oct 08 22:08:25 crc kubenswrapper[4739]: I1008 22:08:25.510627 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-dc6d4cfc5-7ks2h" Oct 08 22:08:26 crc kubenswrapper[4739]: I1008 22:08:26.403085 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 08 22:08:26 crc kubenswrapper[4739]: I1008 22:08:26.454704 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 08 22:08:26 crc kubenswrapper[4739]: I1008 22:08:26.548164 4739 generic.go:334] "Generic (PLEG): container finished" podID="836f20c4-8401-4a21-a541-0dbc92430484" containerID="b19346be300cfdbed834b66c2d55b6e0c3ab28352d9bf5313453e5fef1fbc2d1" exitCode=0 Oct 08 22:08:26 crc kubenswrapper[4739]: I1008 22:08:26.549162 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-db-sync-q5vhf" event={"ID":"836f20c4-8401-4a21-a541-0dbc92430484","Type":"ContainerDied","Data":"b19346be300cfdbed834b66c2d55b6e0c3ab28352d9bf5313453e5fef1fbc2d1"} Oct 08 22:08:29 crc kubenswrapper[4739]: I1008 22:08:29.046730 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-q5vhf" Oct 08 22:08:29 crc kubenswrapper[4739]: I1008 22:08:29.160936 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwc5r\" (UniqueName: \"kubernetes.io/projected/836f20c4-8401-4a21-a541-0dbc92430484-kube-api-access-kwc5r\") pod \"836f20c4-8401-4a21-a541-0dbc92430484\" (UID: \"836f20c4-8401-4a21-a541-0dbc92430484\") " Oct 08 22:08:29 crc kubenswrapper[4739]: I1008 22:08:29.161493 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/836f20c4-8401-4a21-a541-0dbc92430484-combined-ca-bundle\") pod \"836f20c4-8401-4a21-a541-0dbc92430484\" (UID: \"836f20c4-8401-4a21-a541-0dbc92430484\") " Oct 08 22:08:29 crc kubenswrapper[4739]: I1008 22:08:29.161894 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/836f20c4-8401-4a21-a541-0dbc92430484-db-sync-config-data\") pod \"836f20c4-8401-4a21-a541-0dbc92430484\" (UID: \"836f20c4-8401-4a21-a541-0dbc92430484\") " Oct 08 22:08:29 crc kubenswrapper[4739]: I1008 22:08:29.165783 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/836f20c4-8401-4a21-a541-0dbc92430484-kube-api-access-kwc5r" (OuterVolumeSpecName: "kube-api-access-kwc5r") pod "836f20c4-8401-4a21-a541-0dbc92430484" (UID: "836f20c4-8401-4a21-a541-0dbc92430484"). InnerVolumeSpecName "kube-api-access-kwc5r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:08:29 crc kubenswrapper[4739]: I1008 22:08:29.169886 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/836f20c4-8401-4a21-a541-0dbc92430484-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "836f20c4-8401-4a21-a541-0dbc92430484" (UID: "836f20c4-8401-4a21-a541-0dbc92430484"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:08:29 crc kubenswrapper[4739]: I1008 22:08:29.193433 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/836f20c4-8401-4a21-a541-0dbc92430484-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "836f20c4-8401-4a21-a541-0dbc92430484" (UID: "836f20c4-8401-4a21-a541-0dbc92430484"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:08:29 crc kubenswrapper[4739]: I1008 22:08:29.264508 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwc5r\" (UniqueName: \"kubernetes.io/projected/836f20c4-8401-4a21-a541-0dbc92430484-kube-api-access-kwc5r\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:29 crc kubenswrapper[4739]: I1008 22:08:29.264546 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/836f20c4-8401-4a21-a541-0dbc92430484-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:29 crc kubenswrapper[4739]: I1008 22:08:29.264556 4739 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/836f20c4-8401-4a21-a541-0dbc92430484-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:29 crc kubenswrapper[4739]: I1008 22:08:29.578938 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d8fe907f-2579-491e-95b8-a71e264e9ed4","Type":"ContainerStarted","Data":"5bea2606f8eeb8c0401c72f82d3ad931869a77e9919053abcd5ce20dd567d7d4"} Oct 08 22:08:29 crc kubenswrapper[4739]: I1008 22:08:29.579071 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d8fe907f-2579-491e-95b8-a71e264e9ed4" containerName="ceilometer-central-agent" containerID="cri-o://2105c5cf04f4b7bf905a864df4b0cafe0cc6629d89dc230faa9c2b64ba6009b5" gracePeriod=30 Oct 08 22:08:29 crc kubenswrapper[4739]: I1008 22:08:29.579240 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d8fe907f-2579-491e-95b8-a71e264e9ed4" containerName="proxy-httpd" containerID="cri-o://5bea2606f8eeb8c0401c72f82d3ad931869a77e9919053abcd5ce20dd567d7d4" gracePeriod=30 Oct 08 22:08:29 crc kubenswrapper[4739]: I1008 22:08:29.579281 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d8fe907f-2579-491e-95b8-a71e264e9ed4" containerName="sg-core" containerID="cri-o://fc001940f5597bf6843b72765a794c9af33a7163075ae447bad6a3cfe28395cd" gracePeriod=30 Oct 08 22:08:29 crc kubenswrapper[4739]: I1008 22:08:29.579313 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d8fe907f-2579-491e-95b8-a71e264e9ed4" containerName="ceilometer-notification-agent" containerID="cri-o://93a0d9d2a91e7922ca3f82a71f6212f214805f0ca1d86712b249a96acfd61d74" gracePeriod=30 Oct 08 22:08:29 crc kubenswrapper[4739]: I1008 22:08:29.579095 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 22:08:29 crc kubenswrapper[4739]: I1008 22:08:29.581337 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-q5vhf" 
event={"ID":"836f20c4-8401-4a21-a541-0dbc92430484","Type":"ContainerDied","Data":"ef48ecefa21e34fceb55d5b9cf7a0d387811442e573e73163da5ce72bbe2a4e6"} Oct 08 22:08:29 crc kubenswrapper[4739]: I1008 22:08:29.581373 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef48ecefa21e34fceb55d5b9cf7a0d387811442e573e73163da5ce72bbe2a4e6" Oct 08 22:08:29 crc kubenswrapper[4739]: I1008 22:08:29.581425 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-q5vhf" Oct 08 22:08:29 crc kubenswrapper[4739]: I1008 22:08:29.606583 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.826553388 podStartE2EDuration="45.606566054s" podCreationTimestamp="2025-10-08 22:07:44 +0000 UTC" firstStartedPulling="2025-10-08 22:07:45.308263345 +0000 UTC m=+1165.133649095" lastFinishedPulling="2025-10-08 22:08:29.088276001 +0000 UTC m=+1208.913661761" observedRunningTime="2025-10-08 22:08:29.604824491 +0000 UTC m=+1209.430210241" watchObservedRunningTime="2025-10-08 22:08:29.606566054 +0000 UTC m=+1209.431951814" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.334359 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-66b4c9b85f-r8lds"] Oct 08 22:08:30 crc kubenswrapper[4739]: E1008 22:08:30.335066 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="836f20c4-8401-4a21-a541-0dbc92430484" containerName="barbican-db-sync" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.335082 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="836f20c4-8401-4a21-a541-0dbc92430484" containerName="barbican-db-sync" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.335317 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="836f20c4-8401-4a21-a541-0dbc92430484" containerName="barbican-db-sync" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.336477 4739 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-66b4c9b85f-r8lds" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.339788 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-jgmrn" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.339995 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.345788 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.346629 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-66b4c9b85f-r8lds"] Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.361300 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-648fb84fdb-qmfb7"] Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.362737 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-648fb84fdb-qmfb7" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.368079 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.382129 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mx94\" (UniqueName: \"kubernetes.io/projected/1802465b-168a-449f-b8db-224a426d90ad-kube-api-access-7mx94\") pod \"barbican-worker-66b4c9b85f-r8lds\" (UID: \"1802465b-168a-449f-b8db-224a426d90ad\") " pod="openstack/barbican-worker-66b4c9b85f-r8lds" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.382208 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1802465b-168a-449f-b8db-224a426d90ad-logs\") pod \"barbican-worker-66b4c9b85f-r8lds\" (UID: \"1802465b-168a-449f-b8db-224a426d90ad\") " pod="openstack/barbican-worker-66b4c9b85f-r8lds" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.382292 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1802465b-168a-449f-b8db-224a426d90ad-combined-ca-bundle\") pod \"barbican-worker-66b4c9b85f-r8lds\" (UID: \"1802465b-168a-449f-b8db-224a426d90ad\") " pod="openstack/barbican-worker-66b4c9b85f-r8lds" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.382320 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1802465b-168a-449f-b8db-224a426d90ad-config-data\") pod \"barbican-worker-66b4c9b85f-r8lds\" (UID: \"1802465b-168a-449f-b8db-224a426d90ad\") " pod="openstack/barbican-worker-66b4c9b85f-r8lds" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.382353 4739 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1802465b-168a-449f-b8db-224a426d90ad-config-data-custom\") pod \"barbican-worker-66b4c9b85f-r8lds\" (UID: \"1802465b-168a-449f-b8db-224a426d90ad\") " pod="openstack/barbican-worker-66b4c9b85f-r8lds" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.393266 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-648fb84fdb-qmfb7"] Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.447334 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-6tj2b"] Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.447597 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" podUID="11aeee8b-976d-4802-a571-476642cd56c8" containerName="dnsmasq-dns" containerID="cri-o://3bea46fdb598aa91fd00df601703f1bf73f29a825c8fe2271077335e49246eba" gracePeriod=10 Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.452001 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.484390 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1802465b-168a-449f-b8db-224a426d90ad-combined-ca-bundle\") pod \"barbican-worker-66b4c9b85f-r8lds\" (UID: \"1802465b-168a-449f-b8db-224a426d90ad\") " pod="openstack/barbican-worker-66b4c9b85f-r8lds" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.484437 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1802465b-168a-449f-b8db-224a426d90ad-config-data\") pod \"barbican-worker-66b4c9b85f-r8lds\" (UID: \"1802465b-168a-449f-b8db-224a426d90ad\") " 
pod="openstack/barbican-worker-66b4c9b85f-r8lds" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.484475 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1802465b-168a-449f-b8db-224a426d90ad-config-data-custom\") pod \"barbican-worker-66b4c9b85f-r8lds\" (UID: \"1802465b-168a-449f-b8db-224a426d90ad\") " pod="openstack/barbican-worker-66b4c9b85f-r8lds" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.484517 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8e9e5fe-49e3-4fea-8a8e-b853c479ce94-config-data-custom\") pod \"barbican-keystone-listener-648fb84fdb-qmfb7\" (UID: \"a8e9e5fe-49e3-4fea-8a8e-b853c479ce94\") " pod="openstack/barbican-keystone-listener-648fb84fdb-qmfb7" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.484542 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mx94\" (UniqueName: \"kubernetes.io/projected/1802465b-168a-449f-b8db-224a426d90ad-kube-api-access-7mx94\") pod \"barbican-worker-66b4c9b85f-r8lds\" (UID: \"1802465b-168a-449f-b8db-224a426d90ad\") " pod="openstack/barbican-worker-66b4c9b85f-r8lds" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.484566 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1802465b-168a-449f-b8db-224a426d90ad-logs\") pod \"barbican-worker-66b4c9b85f-r8lds\" (UID: \"1802465b-168a-449f-b8db-224a426d90ad\") " pod="openstack/barbican-worker-66b4c9b85f-r8lds" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.484588 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8e9e5fe-49e3-4fea-8a8e-b853c479ce94-combined-ca-bundle\") pod 
\"barbican-keystone-listener-648fb84fdb-qmfb7\" (UID: \"a8e9e5fe-49e3-4fea-8a8e-b853c479ce94\") " pod="openstack/barbican-keystone-listener-648fb84fdb-qmfb7" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.484628 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mg2k\" (UniqueName: \"kubernetes.io/projected/a8e9e5fe-49e3-4fea-8a8e-b853c479ce94-kube-api-access-4mg2k\") pod \"barbican-keystone-listener-648fb84fdb-qmfb7\" (UID: \"a8e9e5fe-49e3-4fea-8a8e-b853c479ce94\") " pod="openstack/barbican-keystone-listener-648fb84fdb-qmfb7" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.484653 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8e9e5fe-49e3-4fea-8a8e-b853c479ce94-config-data\") pod \"barbican-keystone-listener-648fb84fdb-qmfb7\" (UID: \"a8e9e5fe-49e3-4fea-8a8e-b853c479ce94\") " pod="openstack/barbican-keystone-listener-648fb84fdb-qmfb7" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.484671 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8e9e5fe-49e3-4fea-8a8e-b853c479ce94-logs\") pod \"barbican-keystone-listener-648fb84fdb-qmfb7\" (UID: \"a8e9e5fe-49e3-4fea-8a8e-b853c479ce94\") " pod="openstack/barbican-keystone-listener-648fb84fdb-qmfb7" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.485689 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1802465b-168a-449f-b8db-224a426d90ad-logs\") pod \"barbican-worker-66b4c9b85f-r8lds\" (UID: \"1802465b-168a-449f-b8db-224a426d90ad\") " pod="openstack/barbican-worker-66b4c9b85f-r8lds" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.494340 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1802465b-168a-449f-b8db-224a426d90ad-combined-ca-bundle\") pod \"barbican-worker-66b4c9b85f-r8lds\" (UID: \"1802465b-168a-449f-b8db-224a426d90ad\") " pod="openstack/barbican-worker-66b4c9b85f-r8lds" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.496820 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1802465b-168a-449f-b8db-224a426d90ad-config-data\") pod \"barbican-worker-66b4c9b85f-r8lds\" (UID: \"1802465b-168a-449f-b8db-224a426d90ad\") " pod="openstack/barbican-worker-66b4c9b85f-r8lds" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.499764 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1802465b-168a-449f-b8db-224a426d90ad-config-data-custom\") pod \"barbican-worker-66b4c9b85f-r8lds\" (UID: \"1802465b-168a-449f-b8db-224a426d90ad\") " pod="openstack/barbican-worker-66b4c9b85f-r8lds" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.506239 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-ds8pz"] Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.513400 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.528442 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mx94\" (UniqueName: \"kubernetes.io/projected/1802465b-168a-449f-b8db-224a426d90ad-kube-api-access-7mx94\") pod \"barbican-worker-66b4c9b85f-r8lds\" (UID: \"1802465b-168a-449f-b8db-224a426d90ad\") " pod="openstack/barbican-worker-66b4c9b85f-r8lds" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.536682 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-ds8pz"] Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.575455 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-557d86854-5qdxs"] Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.577088 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-557d86854-5qdxs" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.580885 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.593077 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/439123d3-6874-4694-9790-f3ea65bde3a5-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-ds8pz\" (UID: \"439123d3-6874-4694-9790-f3ea65bde3a5\") " pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.593199 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/439123d3-6874-4694-9790-f3ea65bde3a5-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-ds8pz\" (UID: \"439123d3-6874-4694-9790-f3ea65bde3a5\") " pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz" Oct 08 22:08:30 crc 
kubenswrapper[4739]: I1008 22:08:30.593261 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/439123d3-6874-4694-9790-f3ea65bde3a5-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-ds8pz\" (UID: \"439123d3-6874-4694-9790-f3ea65bde3a5\") " pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.593341 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8e9e5fe-49e3-4fea-8a8e-b853c479ce94-config-data-custom\") pod \"barbican-keystone-listener-648fb84fdb-qmfb7\" (UID: \"a8e9e5fe-49e3-4fea-8a8e-b853c479ce94\") " pod="openstack/barbican-keystone-listener-648fb84fdb-qmfb7" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.593393 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bltf\" (UniqueName: \"kubernetes.io/projected/439123d3-6874-4694-9790-f3ea65bde3a5-kube-api-access-4bltf\") pod \"dnsmasq-dns-75c8ddd69c-ds8pz\" (UID: \"439123d3-6874-4694-9790-f3ea65bde3a5\") " pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.593474 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8e9e5fe-49e3-4fea-8a8e-b853c479ce94-combined-ca-bundle\") pod \"barbican-keystone-listener-648fb84fdb-qmfb7\" (UID: \"a8e9e5fe-49e3-4fea-8a8e-b853c479ce94\") " pod="openstack/barbican-keystone-listener-648fb84fdb-qmfb7" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.593517 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/439123d3-6874-4694-9790-f3ea65bde3a5-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-ds8pz\" (UID: 
\"439123d3-6874-4694-9790-f3ea65bde3a5\") " pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.593576 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439123d3-6874-4694-9790-f3ea65bde3a5-config\") pod \"dnsmasq-dns-75c8ddd69c-ds8pz\" (UID: \"439123d3-6874-4694-9790-f3ea65bde3a5\") " pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.593621 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mg2k\" (UniqueName: \"kubernetes.io/projected/a8e9e5fe-49e3-4fea-8a8e-b853c479ce94-kube-api-access-4mg2k\") pod \"barbican-keystone-listener-648fb84fdb-qmfb7\" (UID: \"a8e9e5fe-49e3-4fea-8a8e-b853c479ce94\") " pod="openstack/barbican-keystone-listener-648fb84fdb-qmfb7" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.593666 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8e9e5fe-49e3-4fea-8a8e-b853c479ce94-config-data\") pod \"barbican-keystone-listener-648fb84fdb-qmfb7\" (UID: \"a8e9e5fe-49e3-4fea-8a8e-b853c479ce94\") " pod="openstack/barbican-keystone-listener-648fb84fdb-qmfb7" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.593711 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8e9e5fe-49e3-4fea-8a8e-b853c479ce94-logs\") pod \"barbican-keystone-listener-648fb84fdb-qmfb7\" (UID: \"a8e9e5fe-49e3-4fea-8a8e-b853c479ce94\") " pod="openstack/barbican-keystone-listener-648fb84fdb-qmfb7" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.594246 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8e9e5fe-49e3-4fea-8a8e-b853c479ce94-logs\") pod 
\"barbican-keystone-listener-648fb84fdb-qmfb7\" (UID: \"a8e9e5fe-49e3-4fea-8a8e-b853c479ce94\") " pod="openstack/barbican-keystone-listener-648fb84fdb-qmfb7" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.609401 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8e9e5fe-49e3-4fea-8a8e-b853c479ce94-config-data\") pod \"barbican-keystone-listener-648fb84fdb-qmfb7\" (UID: \"a8e9e5fe-49e3-4fea-8a8e-b853c479ce94\") " pod="openstack/barbican-keystone-listener-648fb84fdb-qmfb7" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.613299 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8e9e5fe-49e3-4fea-8a8e-b853c479ce94-config-data-custom\") pod \"barbican-keystone-listener-648fb84fdb-qmfb7\" (UID: \"a8e9e5fe-49e3-4fea-8a8e-b853c479ce94\") " pod="openstack/barbican-keystone-listener-648fb84fdb-qmfb7" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.622385 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-557d86854-5qdxs"] Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.624938 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8e9e5fe-49e3-4fea-8a8e-b853c479ce94-combined-ca-bundle\") pod \"barbican-keystone-listener-648fb84fdb-qmfb7\" (UID: \"a8e9e5fe-49e3-4fea-8a8e-b853c479ce94\") " pod="openstack/barbican-keystone-listener-648fb84fdb-qmfb7" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.624973 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mg2k\" (UniqueName: \"kubernetes.io/projected/a8e9e5fe-49e3-4fea-8a8e-b853c479ce94-kube-api-access-4mg2k\") pod \"barbican-keystone-listener-648fb84fdb-qmfb7\" (UID: \"a8e9e5fe-49e3-4fea-8a8e-b853c479ce94\") " pod="openstack/barbican-keystone-listener-648fb84fdb-qmfb7" Oct 08 22:08:30 
crc kubenswrapper[4739]: I1008 22:08:30.636487 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vvmhx" event={"ID":"4da3e49a-b4ae-4375-893f-47d64b4eb0b5","Type":"ContainerStarted","Data":"cfaea9716ef442a82d7d1442e5293c3a41541f51fe6a306462de76aad7abb370"} Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.654555 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8fe907f-2579-491e-95b8-a71e264e9ed4","Type":"ContainerDied","Data":"5bea2606f8eeb8c0401c72f82d3ad931869a77e9919053abcd5ce20dd567d7d4"} Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.654515 4739 generic.go:334] "Generic (PLEG): container finished" podID="d8fe907f-2579-491e-95b8-a71e264e9ed4" containerID="5bea2606f8eeb8c0401c72f82d3ad931869a77e9919053abcd5ce20dd567d7d4" exitCode=0 Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.654623 4739 generic.go:334] "Generic (PLEG): container finished" podID="d8fe907f-2579-491e-95b8-a71e264e9ed4" containerID="fc001940f5597bf6843b72765a794c9af33a7163075ae447bad6a3cfe28395cd" exitCode=2 Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.654632 4739 generic.go:334] "Generic (PLEG): container finished" podID="d8fe907f-2579-491e-95b8-a71e264e9ed4" containerID="2105c5cf04f4b7bf905a864df4b0cafe0cc6629d89dc230faa9c2b64ba6009b5" exitCode=0 Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.654645 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8fe907f-2579-491e-95b8-a71e264e9ed4","Type":"ContainerDied","Data":"fc001940f5597bf6843b72765a794c9af33a7163075ae447bad6a3cfe28395cd"} Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.654656 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8fe907f-2579-491e-95b8-a71e264e9ed4","Type":"ContainerDied","Data":"2105c5cf04f4b7bf905a864df4b0cafe0cc6629d89dc230faa9c2b64ba6009b5"} Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 
22:08:30.665479 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-66b4c9b85f-r8lds" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.667059 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-vvmhx" podStartSLOduration=3.151176809 podStartE2EDuration="45.667021847s" podCreationTimestamp="2025-10-08 22:07:45 +0000 UTC" firstStartedPulling="2025-10-08 22:07:46.53451526 +0000 UTC m=+1166.359901010" lastFinishedPulling="2025-10-08 22:08:29.050360268 +0000 UTC m=+1208.875746048" observedRunningTime="2025-10-08 22:08:30.655455492 +0000 UTC m=+1210.480841242" watchObservedRunningTime="2025-10-08 22:08:30.667021847 +0000 UTC m=+1210.492407597" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.682905 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-648fb84fdb-qmfb7" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.695554 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f8a80f3-ff67-4b19-86ee-82a5198d860a-config-data-custom\") pod \"barbican-api-557d86854-5qdxs\" (UID: \"1f8a80f3-ff67-4b19-86ee-82a5198d860a\") " pod="openstack/barbican-api-557d86854-5qdxs" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.695734 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jknvx\" (UniqueName: \"kubernetes.io/projected/1f8a80f3-ff67-4b19-86ee-82a5198d860a-kube-api-access-jknvx\") pod \"barbican-api-557d86854-5qdxs\" (UID: \"1f8a80f3-ff67-4b19-86ee-82a5198d860a\") " pod="openstack/barbican-api-557d86854-5qdxs" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.695843 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bltf\" (UniqueName: 
\"kubernetes.io/projected/439123d3-6874-4694-9790-f3ea65bde3a5-kube-api-access-4bltf\") pod \"dnsmasq-dns-75c8ddd69c-ds8pz\" (UID: \"439123d3-6874-4694-9790-f3ea65bde3a5\") " pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.695960 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8a80f3-ff67-4b19-86ee-82a5198d860a-config-data\") pod \"barbican-api-557d86854-5qdxs\" (UID: \"1f8a80f3-ff67-4b19-86ee-82a5198d860a\") " pod="openstack/barbican-api-557d86854-5qdxs" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.696123 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8a80f3-ff67-4b19-86ee-82a5198d860a-combined-ca-bundle\") pod \"barbican-api-557d86854-5qdxs\" (UID: \"1f8a80f3-ff67-4b19-86ee-82a5198d860a\") " pod="openstack/barbican-api-557d86854-5qdxs" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.696238 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/439123d3-6874-4694-9790-f3ea65bde3a5-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-ds8pz\" (UID: \"439123d3-6874-4694-9790-f3ea65bde3a5\") " pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.696342 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439123d3-6874-4694-9790-f3ea65bde3a5-config\") pod \"dnsmasq-dns-75c8ddd69c-ds8pz\" (UID: \"439123d3-6874-4694-9790-f3ea65bde3a5\") " pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.696733 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/439123d3-6874-4694-9790-f3ea65bde3a5-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-ds8pz\" (UID: \"439123d3-6874-4694-9790-f3ea65bde3a5\") " pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.696847 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f8a80f3-ff67-4b19-86ee-82a5198d860a-logs\") pod \"barbican-api-557d86854-5qdxs\" (UID: \"1f8a80f3-ff67-4b19-86ee-82a5198d860a\") " pod="openstack/barbican-api-557d86854-5qdxs" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.696973 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/439123d3-6874-4694-9790-f3ea65bde3a5-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-ds8pz\" (UID: \"439123d3-6874-4694-9790-f3ea65bde3a5\") " pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.697127 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/439123d3-6874-4694-9790-f3ea65bde3a5-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-ds8pz\" (UID: \"439123d3-6874-4694-9790-f3ea65bde3a5\") " pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.698576 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/439123d3-6874-4694-9790-f3ea65bde3a5-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-ds8pz\" (UID: \"439123d3-6874-4694-9790-f3ea65bde3a5\") " pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.698871 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/439123d3-6874-4694-9790-f3ea65bde3a5-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-ds8pz\" (UID: \"439123d3-6874-4694-9790-f3ea65bde3a5\") " pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.699637 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/439123d3-6874-4694-9790-f3ea65bde3a5-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-ds8pz\" (UID: \"439123d3-6874-4694-9790-f3ea65bde3a5\") " pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.700139 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/439123d3-6874-4694-9790-f3ea65bde3a5-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-ds8pz\" (UID: \"439123d3-6874-4694-9790-f3ea65bde3a5\") " pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.701972 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439123d3-6874-4694-9790-f3ea65bde3a5-config\") pod \"dnsmasq-dns-75c8ddd69c-ds8pz\" (UID: \"439123d3-6874-4694-9790-f3ea65bde3a5\") " pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.719454 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bltf\" (UniqueName: \"kubernetes.io/projected/439123d3-6874-4694-9790-f3ea65bde3a5-kube-api-access-4bltf\") pod \"dnsmasq-dns-75c8ddd69c-ds8pz\" (UID: \"439123d3-6874-4694-9790-f3ea65bde3a5\") " pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.724737 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.798928 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f8a80f3-ff67-4b19-86ee-82a5198d860a-logs\") pod \"barbican-api-557d86854-5qdxs\" (UID: \"1f8a80f3-ff67-4b19-86ee-82a5198d860a\") " pod="openstack/barbican-api-557d86854-5qdxs" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.799020 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f8a80f3-ff67-4b19-86ee-82a5198d860a-config-data-custom\") pod \"barbican-api-557d86854-5qdxs\" (UID: \"1f8a80f3-ff67-4b19-86ee-82a5198d860a\") " pod="openstack/barbican-api-557d86854-5qdxs" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.799050 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jknvx\" (UniqueName: \"kubernetes.io/projected/1f8a80f3-ff67-4b19-86ee-82a5198d860a-kube-api-access-jknvx\") pod \"barbican-api-557d86854-5qdxs\" (UID: \"1f8a80f3-ff67-4b19-86ee-82a5198d860a\") " pod="openstack/barbican-api-557d86854-5qdxs" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.799095 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8a80f3-ff67-4b19-86ee-82a5198d860a-config-data\") pod \"barbican-api-557d86854-5qdxs\" (UID: \"1f8a80f3-ff67-4b19-86ee-82a5198d860a\") " pod="openstack/barbican-api-557d86854-5qdxs" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.799114 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8a80f3-ff67-4b19-86ee-82a5198d860a-combined-ca-bundle\") pod \"barbican-api-557d86854-5qdxs\" (UID: \"1f8a80f3-ff67-4b19-86ee-82a5198d860a\") " 
pod="openstack/barbican-api-557d86854-5qdxs" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.799551 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f8a80f3-ff67-4b19-86ee-82a5198d860a-logs\") pod \"barbican-api-557d86854-5qdxs\" (UID: \"1f8a80f3-ff67-4b19-86ee-82a5198d860a\") " pod="openstack/barbican-api-557d86854-5qdxs" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.813430 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8a80f3-ff67-4b19-86ee-82a5198d860a-config-data\") pod \"barbican-api-557d86854-5qdxs\" (UID: \"1f8a80f3-ff67-4b19-86ee-82a5198d860a\") " pod="openstack/barbican-api-557d86854-5qdxs" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.817532 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jknvx\" (UniqueName: \"kubernetes.io/projected/1f8a80f3-ff67-4b19-86ee-82a5198d860a-kube-api-access-jknvx\") pod \"barbican-api-557d86854-5qdxs\" (UID: \"1f8a80f3-ff67-4b19-86ee-82a5198d860a\") " pod="openstack/barbican-api-557d86854-5qdxs" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.820024 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8a80f3-ff67-4b19-86ee-82a5198d860a-combined-ca-bundle\") pod \"barbican-api-557d86854-5qdxs\" (UID: \"1f8a80f3-ff67-4b19-86ee-82a5198d860a\") " pod="openstack/barbican-api-557d86854-5qdxs" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 22:08:30.820758 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f8a80f3-ff67-4b19-86ee-82a5198d860a-config-data-custom\") pod \"barbican-api-557d86854-5qdxs\" (UID: \"1f8a80f3-ff67-4b19-86ee-82a5198d860a\") " pod="openstack/barbican-api-557d86854-5qdxs" Oct 08 22:08:30 crc kubenswrapper[4739]: I1008 
22:08:30.905610 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.003706 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11aeee8b-976d-4802-a571-476642cd56c8-dns-svc\") pod \"11aeee8b-976d-4802-a571-476642cd56c8\" (UID: \"11aeee8b-976d-4802-a571-476642cd56c8\") " Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.003791 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg5mr\" (UniqueName: \"kubernetes.io/projected/11aeee8b-976d-4802-a571-476642cd56c8-kube-api-access-rg5mr\") pod \"11aeee8b-976d-4802-a571-476642cd56c8\" (UID: \"11aeee8b-976d-4802-a571-476642cd56c8\") " Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.003824 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11aeee8b-976d-4802-a571-476642cd56c8-config\") pod \"11aeee8b-976d-4802-a571-476642cd56c8\" (UID: \"11aeee8b-976d-4802-a571-476642cd56c8\") " Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.003846 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11aeee8b-976d-4802-a571-476642cd56c8-ovsdbserver-sb\") pod \"11aeee8b-976d-4802-a571-476642cd56c8\" (UID: \"11aeee8b-976d-4802-a571-476642cd56c8\") " Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.003914 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/11aeee8b-976d-4802-a571-476642cd56c8-dns-swift-storage-0\") pod \"11aeee8b-976d-4802-a571-476642cd56c8\" (UID: \"11aeee8b-976d-4802-a571-476642cd56c8\") " Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.003975 4739 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11aeee8b-976d-4802-a571-476642cd56c8-ovsdbserver-nb\") pod \"11aeee8b-976d-4802-a571-476642cd56c8\" (UID: \"11aeee8b-976d-4802-a571-476642cd56c8\") " Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.029394 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11aeee8b-976d-4802-a571-476642cd56c8-kube-api-access-rg5mr" (OuterVolumeSpecName: "kube-api-access-rg5mr") pod "11aeee8b-976d-4802-a571-476642cd56c8" (UID: "11aeee8b-976d-4802-a571-476642cd56c8"). InnerVolumeSpecName "kube-api-access-rg5mr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.032020 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-557d86854-5qdxs" Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.065504 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11aeee8b-976d-4802-a571-476642cd56c8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "11aeee8b-976d-4802-a571-476642cd56c8" (UID: "11aeee8b-976d-4802-a571-476642cd56c8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.069669 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11aeee8b-976d-4802-a571-476642cd56c8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "11aeee8b-976d-4802-a571-476642cd56c8" (UID: "11aeee8b-976d-4802-a571-476642cd56c8"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.082809 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11aeee8b-976d-4802-a571-476642cd56c8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "11aeee8b-976d-4802-a571-476642cd56c8" (UID: "11aeee8b-976d-4802-a571-476642cd56c8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.106793 4739 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11aeee8b-976d-4802-a571-476642cd56c8-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.106832 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rg5mr\" (UniqueName: \"kubernetes.io/projected/11aeee8b-976d-4802-a571-476642cd56c8-kube-api-access-rg5mr\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.106846 4739 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/11aeee8b-976d-4802-a571-476642cd56c8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.106858 4739 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11aeee8b-976d-4802-a571-476642cd56c8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.110453 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11aeee8b-976d-4802-a571-476642cd56c8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "11aeee8b-976d-4802-a571-476642cd56c8" (UID: "11aeee8b-976d-4802-a571-476642cd56c8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.117481 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11aeee8b-976d-4802-a571-476642cd56c8-config" (OuterVolumeSpecName: "config") pod "11aeee8b-976d-4802-a571-476642cd56c8" (UID: "11aeee8b-976d-4802-a571-476642cd56c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.207756 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-66b4c9b85f-r8lds"] Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.208015 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11aeee8b-976d-4802-a571-476642cd56c8-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.208047 4739 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11aeee8b-976d-4802-a571-476642cd56c8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:31 crc kubenswrapper[4739]: W1008 22:08:31.211394 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1802465b_168a_449f_b8db_224a426d90ad.slice/crio-61a5a6d315b15f846e6116b70ee5e768a304b179cc47d57a025abbcbfa9631f4 WatchSource:0}: Error finding container 61a5a6d315b15f846e6116b70ee5e768a304b179cc47d57a025abbcbfa9631f4: Status 404 returned error can't find the container with id 61a5a6d315b15f846e6116b70ee5e768a304b179cc47d57a025abbcbfa9631f4 Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.234829 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-648fb84fdb-qmfb7"] Oct 08 22:08:31 crc kubenswrapper[4739]: W1008 22:08:31.237840 4739 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8e9e5fe_49e3_4fea_8a8e_b853c479ce94.slice/crio-8218db3b4f711e99bd015fb88428d180bc2ae929c064288f3b6433fa1b1a1ba7 WatchSource:0}: Error finding container 8218db3b4f711e99bd015fb88428d180bc2ae929c064288f3b6433fa1b1a1ba7: Status 404 returned error can't find the container with id 8218db3b4f711e99bd015fb88428d180bc2ae929c064288f3b6433fa1b1a1ba7 Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.403581 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-ds8pz"] Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.585667 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-557d86854-5qdxs"] Oct 08 22:08:31 crc kubenswrapper[4739]: W1008 22:08:31.585833 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f8a80f3_ff67_4b19_86ee_82a5198d860a.slice/crio-ae264a9960193bc861b6e37d4812bd1bcbb4c598ad7b6b4b78677a2c050e9474 WatchSource:0}: Error finding container ae264a9960193bc861b6e37d4812bd1bcbb4c598ad7b6b4b78677a2c050e9474: Status 404 returned error can't find the container with id ae264a9960193bc861b6e37d4812bd1bcbb4c598ad7b6b4b78677a2c050e9474 Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.672158 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-557d86854-5qdxs" event={"ID":"1f8a80f3-ff67-4b19-86ee-82a5198d860a","Type":"ContainerStarted","Data":"ae264a9960193bc861b6e37d4812bd1bcbb4c598ad7b6b4b78677a2c050e9474"} Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.673181 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-66b4c9b85f-r8lds" event={"ID":"1802465b-168a-449f-b8db-224a426d90ad","Type":"ContainerStarted","Data":"61a5a6d315b15f846e6116b70ee5e768a304b179cc47d57a025abbcbfa9631f4"} Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.675209 4739 
generic.go:334] "Generic (PLEG): container finished" podID="11aeee8b-976d-4802-a571-476642cd56c8" containerID="3bea46fdb598aa91fd00df601703f1bf73f29a825c8fe2271077335e49246eba" exitCode=0 Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.675254 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" event={"ID":"11aeee8b-976d-4802-a571-476642cd56c8","Type":"ContainerDied","Data":"3bea46fdb598aa91fd00df601703f1bf73f29a825c8fe2271077335e49246eba"} Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.675272 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" event={"ID":"11aeee8b-976d-4802-a571-476642cd56c8","Type":"ContainerDied","Data":"1278ce6c1ea09312e610f2be90667588487be40795485f86b3dfaaeb4e3f2703"} Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.675295 4739 scope.go:117] "RemoveContainer" containerID="3bea46fdb598aa91fd00df601703f1bf73f29a825c8fe2271077335e49246eba" Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.675455 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.681401 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-648fb84fdb-qmfb7" event={"ID":"a8e9e5fe-49e3-4fea-8a8e-b853c479ce94","Type":"ContainerStarted","Data":"8218db3b4f711e99bd015fb88428d180bc2ae929c064288f3b6433fa1b1a1ba7"} Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.688803 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz" event={"ID":"439123d3-6874-4694-9790-f3ea65bde3a5","Type":"ContainerStarted","Data":"17dd81c64960a088b8523839223940a4b3d5b574cb4f65cb7bbb5ddcbe08903a"} Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.710535 4739 scope.go:117] "RemoveContainer" containerID="74351b5c357ad1fbc3d11ff34928b2b52201daf4f576a797eaaee41cd07a3b51" Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.721952 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-6tj2b"] Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.731581 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-6tj2b"] Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.757553 4739 scope.go:117] "RemoveContainer" containerID="3bea46fdb598aa91fd00df601703f1bf73f29a825c8fe2271077335e49246eba" Oct 08 22:08:31 crc kubenswrapper[4739]: E1008 22:08:31.758038 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bea46fdb598aa91fd00df601703f1bf73f29a825c8fe2271077335e49246eba\": container with ID starting with 3bea46fdb598aa91fd00df601703f1bf73f29a825c8fe2271077335e49246eba not found: ID does not exist" containerID="3bea46fdb598aa91fd00df601703f1bf73f29a825c8fe2271077335e49246eba" Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.758085 4739 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3bea46fdb598aa91fd00df601703f1bf73f29a825c8fe2271077335e49246eba"} err="failed to get container status \"3bea46fdb598aa91fd00df601703f1bf73f29a825c8fe2271077335e49246eba\": rpc error: code = NotFound desc = could not find container \"3bea46fdb598aa91fd00df601703f1bf73f29a825c8fe2271077335e49246eba\": container with ID starting with 3bea46fdb598aa91fd00df601703f1bf73f29a825c8fe2271077335e49246eba not found: ID does not exist" Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.758112 4739 scope.go:117] "RemoveContainer" containerID="74351b5c357ad1fbc3d11ff34928b2b52201daf4f576a797eaaee41cd07a3b51" Oct 08 22:08:31 crc kubenswrapper[4739]: E1008 22:08:31.758492 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74351b5c357ad1fbc3d11ff34928b2b52201daf4f576a797eaaee41cd07a3b51\": container with ID starting with 74351b5c357ad1fbc3d11ff34928b2b52201daf4f576a797eaaee41cd07a3b51 not found: ID does not exist" containerID="74351b5c357ad1fbc3d11ff34928b2b52201daf4f576a797eaaee41cd07a3b51" Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.758523 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74351b5c357ad1fbc3d11ff34928b2b52201daf4f576a797eaaee41cd07a3b51"} err="failed to get container status \"74351b5c357ad1fbc3d11ff34928b2b52201daf4f576a797eaaee41cd07a3b51\": rpc error: code = NotFound desc = could not find container \"74351b5c357ad1fbc3d11ff34928b2b52201daf4f576a797eaaee41cd07a3b51\": container with ID starting with 74351b5c357ad1fbc3d11ff34928b2b52201daf4f576a797eaaee41cd07a3b51 not found: ID does not exist" Oct 08 22:08:31 crc kubenswrapper[4739]: I1008 22:08:31.835735 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11aeee8b-976d-4802-a571-476642cd56c8" path="/var/lib/kubelet/pods/11aeee8b-976d-4802-a571-476642cd56c8/volumes" Oct 08 22:08:32 crc kubenswrapper[4739]: I1008 
22:08:32.879856 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-56dcfd46c8-rpb55"] Oct 08 22:08:32 crc kubenswrapper[4739]: E1008 22:08:32.883691 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11aeee8b-976d-4802-a571-476642cd56c8" containerName="init" Oct 08 22:08:32 crc kubenswrapper[4739]: I1008 22:08:32.883709 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="11aeee8b-976d-4802-a571-476642cd56c8" containerName="init" Oct 08 22:08:32 crc kubenswrapper[4739]: E1008 22:08:32.883723 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11aeee8b-976d-4802-a571-476642cd56c8" containerName="dnsmasq-dns" Oct 08 22:08:32 crc kubenswrapper[4739]: I1008 22:08:32.883729 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="11aeee8b-976d-4802-a571-476642cd56c8" containerName="dnsmasq-dns" Oct 08 22:08:32 crc kubenswrapper[4739]: I1008 22:08:32.883945 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="11aeee8b-976d-4802-a571-476642cd56c8" containerName="dnsmasq-dns" Oct 08 22:08:32 crc kubenswrapper[4739]: I1008 22:08:32.884924 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-56dcfd46c8-rpb55" Oct 08 22:08:32 crc kubenswrapper[4739]: I1008 22:08:32.888603 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 08 22:08:32 crc kubenswrapper[4739]: I1008 22:08:32.890638 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 08 22:08:32 crc kubenswrapper[4739]: I1008 22:08:32.896869 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56dcfd46c8-rpb55"] Oct 08 22:08:32 crc kubenswrapper[4739]: I1008 22:08:32.978591 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m7rl\" (UniqueName: \"kubernetes.io/projected/9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59-kube-api-access-4m7rl\") pod \"barbican-api-56dcfd46c8-rpb55\" (UID: \"9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59\") " pod="openstack/barbican-api-56dcfd46c8-rpb55" Oct 08 22:08:32 crc kubenswrapper[4739]: I1008 22:08:32.978655 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59-public-tls-certs\") pod \"barbican-api-56dcfd46c8-rpb55\" (UID: \"9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59\") " pod="openstack/barbican-api-56dcfd46c8-rpb55" Oct 08 22:08:32 crc kubenswrapper[4739]: I1008 22:08:32.978691 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59-internal-tls-certs\") pod \"barbican-api-56dcfd46c8-rpb55\" (UID: \"9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59\") " pod="openstack/barbican-api-56dcfd46c8-rpb55" Oct 08 22:08:32 crc kubenswrapper[4739]: I1008 22:08:32.979021 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59-config-data\") pod \"barbican-api-56dcfd46c8-rpb55\" (UID: \"9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59\") " pod="openstack/barbican-api-56dcfd46c8-rpb55" Oct 08 22:08:32 crc kubenswrapper[4739]: I1008 22:08:32.979174 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59-logs\") pod \"barbican-api-56dcfd46c8-rpb55\" (UID: \"9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59\") " pod="openstack/barbican-api-56dcfd46c8-rpb55" Oct 08 22:08:32 crc kubenswrapper[4739]: I1008 22:08:32.979307 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59-combined-ca-bundle\") pod \"barbican-api-56dcfd46c8-rpb55\" (UID: \"9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59\") " pod="openstack/barbican-api-56dcfd46c8-rpb55" Oct 08 22:08:32 crc kubenswrapper[4739]: I1008 22:08:32.979339 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59-config-data-custom\") pod \"barbican-api-56dcfd46c8-rpb55\" (UID: \"9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59\") " pod="openstack/barbican-api-56dcfd46c8-rpb55" Oct 08 22:08:33 crc kubenswrapper[4739]: I1008 22:08:33.081109 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59-config-data\") pod \"barbican-api-56dcfd46c8-rpb55\" (UID: \"9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59\") " pod="openstack/barbican-api-56dcfd46c8-rpb55" Oct 08 22:08:33 crc kubenswrapper[4739]: I1008 22:08:33.081186 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59-logs\") pod \"barbican-api-56dcfd46c8-rpb55\" (UID: \"9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59\") " pod="openstack/barbican-api-56dcfd46c8-rpb55" Oct 08 22:08:33 crc kubenswrapper[4739]: I1008 22:08:33.081223 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59-combined-ca-bundle\") pod \"barbican-api-56dcfd46c8-rpb55\" (UID: \"9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59\") " pod="openstack/barbican-api-56dcfd46c8-rpb55" Oct 08 22:08:33 crc kubenswrapper[4739]: I1008 22:08:33.081258 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59-config-data-custom\") pod \"barbican-api-56dcfd46c8-rpb55\" (UID: \"9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59\") " pod="openstack/barbican-api-56dcfd46c8-rpb55" Oct 08 22:08:33 crc kubenswrapper[4739]: I1008 22:08:33.081278 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m7rl\" (UniqueName: \"kubernetes.io/projected/9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59-kube-api-access-4m7rl\") pod \"barbican-api-56dcfd46c8-rpb55\" (UID: \"9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59\") " pod="openstack/barbican-api-56dcfd46c8-rpb55" Oct 08 22:08:33 crc kubenswrapper[4739]: I1008 22:08:33.081643 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59-public-tls-certs\") pod \"barbican-api-56dcfd46c8-rpb55\" (UID: \"9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59\") " pod="openstack/barbican-api-56dcfd46c8-rpb55" Oct 08 22:08:33 crc kubenswrapper[4739]: I1008 22:08:33.081735 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59-logs\") pod \"barbican-api-56dcfd46c8-rpb55\" (UID: \"9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59\") " pod="openstack/barbican-api-56dcfd46c8-rpb55" Oct 08 22:08:33 crc kubenswrapper[4739]: I1008 22:08:33.081787 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59-internal-tls-certs\") pod \"barbican-api-56dcfd46c8-rpb55\" (UID: \"9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59\") " pod="openstack/barbican-api-56dcfd46c8-rpb55" Oct 08 22:08:33 crc kubenswrapper[4739]: I1008 22:08:33.088327 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59-config-data\") pod \"barbican-api-56dcfd46c8-rpb55\" (UID: \"9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59\") " pod="openstack/barbican-api-56dcfd46c8-rpb55" Oct 08 22:08:33 crc kubenswrapper[4739]: I1008 22:08:33.091403 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59-public-tls-certs\") pod \"barbican-api-56dcfd46c8-rpb55\" (UID: \"9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59\") " pod="openstack/barbican-api-56dcfd46c8-rpb55" Oct 08 22:08:33 crc kubenswrapper[4739]: I1008 22:08:33.091859 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59-combined-ca-bundle\") pod \"barbican-api-56dcfd46c8-rpb55\" (UID: \"9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59\") " pod="openstack/barbican-api-56dcfd46c8-rpb55" Oct 08 22:08:33 crc kubenswrapper[4739]: I1008 22:08:33.096889 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59-config-data-custom\") pod \"barbican-api-56dcfd46c8-rpb55\" (UID: \"9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59\") " pod="openstack/barbican-api-56dcfd46c8-rpb55" Oct 08 22:08:33 crc kubenswrapper[4739]: I1008 22:08:33.098348 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59-internal-tls-certs\") pod \"barbican-api-56dcfd46c8-rpb55\" (UID: \"9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59\") " pod="openstack/barbican-api-56dcfd46c8-rpb55" Oct 08 22:08:33 crc kubenswrapper[4739]: I1008 22:08:33.108935 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m7rl\" (UniqueName: \"kubernetes.io/projected/9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59-kube-api-access-4m7rl\") pod \"barbican-api-56dcfd46c8-rpb55\" (UID: \"9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59\") " pod="openstack/barbican-api-56dcfd46c8-rpb55" Oct 08 22:08:33 crc kubenswrapper[4739]: I1008 22:08:33.224324 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-56dcfd46c8-rpb55" Oct 08 22:08:33 crc kubenswrapper[4739]: I1008 22:08:33.661080 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56dcfd46c8-rpb55"] Oct 08 22:08:33 crc kubenswrapper[4739]: W1008 22:08:33.663436 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b1e3799_0c6d_4e4c_ac0d_f6b0d9241e59.slice/crio-4e913177d7ab7cc6e989eefd9f31ce5b91dbbd0d1aafeb68af20e5a6764a9b72 WatchSource:0}: Error finding container 4e913177d7ab7cc6e989eefd9f31ce5b91dbbd0d1aafeb68af20e5a6764a9b72: Status 404 returned error can't find the container with id 4e913177d7ab7cc6e989eefd9f31ce5b91dbbd0d1aafeb68af20e5a6764a9b72 Oct 08 22:08:33 crc kubenswrapper[4739]: I1008 22:08:33.710067 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56dcfd46c8-rpb55" event={"ID":"9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59","Type":"ContainerStarted","Data":"4e913177d7ab7cc6e989eefd9f31ce5b91dbbd0d1aafeb68af20e5a6764a9b72"} Oct 08 22:08:34 crc kubenswrapper[4739]: I1008 22:08:34.752894 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-557d86854-5qdxs" event={"ID":"1f8a80f3-ff67-4b19-86ee-82a5198d860a","Type":"ContainerStarted","Data":"67942d1cbdb84656d44a43fcf4a56238e8e51b874b6f72458474e1b52f9b8123"} Oct 08 22:08:34 crc kubenswrapper[4739]: I1008 22:08:34.756424 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz" event={"ID":"439123d3-6874-4694-9790-f3ea65bde3a5","Type":"ContainerStarted","Data":"90a96c604828f6525467ff9c8893643a4f3cd46a097e0f7672d4d059e7a1f6f5"} Oct 08 22:08:34 crc kubenswrapper[4739]: I1008 22:08:34.759086 4739 generic.go:334] "Generic (PLEG): container finished" podID="d8fe907f-2579-491e-95b8-a71e264e9ed4" containerID="93a0d9d2a91e7922ca3f82a71f6212f214805f0ca1d86712b249a96acfd61d74" exitCode=0 Oct 08 22:08:34 crc 
kubenswrapper[4739]: I1008 22:08:34.759116 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8fe907f-2579-491e-95b8-a71e264e9ed4","Type":"ContainerDied","Data":"93a0d9d2a91e7922ca3f82a71f6212f214805f0ca1d86712b249a96acfd61d74"} Oct 08 22:08:35 crc kubenswrapper[4739]: I1008 22:08:35.774815 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56dcfd46c8-rpb55" event={"ID":"9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59","Type":"ContainerStarted","Data":"c59710d248437249df7171f40d13b1df21453ff75c7d9397837d49e9f54b5ba2"} Oct 08 22:08:35 crc kubenswrapper[4739]: I1008 22:08:35.776994 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-557d86854-5qdxs" event={"ID":"1f8a80f3-ff67-4b19-86ee-82a5198d860a","Type":"ContainerStarted","Data":"60b3279bde484979e02287ebf593141e1e965aa13e3bdd6fa962100197c9a2cd"} Oct 08 22:08:35 crc kubenswrapper[4739]: I1008 22:08:35.778228 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-557d86854-5qdxs" Oct 08 22:08:35 crc kubenswrapper[4739]: I1008 22:08:35.778262 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-557d86854-5qdxs" Oct 08 22:08:35 crc kubenswrapper[4739]: I1008 22:08:35.779683 4739 generic.go:334] "Generic (PLEG): container finished" podID="439123d3-6874-4694-9790-f3ea65bde3a5" containerID="90a96c604828f6525467ff9c8893643a4f3cd46a097e0f7672d4d059e7a1f6f5" exitCode=0 Oct 08 22:08:35 crc kubenswrapper[4739]: I1008 22:08:35.779710 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz" event={"ID":"439123d3-6874-4694-9790-f3ea65bde3a5","Type":"ContainerDied","Data":"90a96c604828f6525467ff9c8893643a4f3cd46a097e0f7672d4d059e7a1f6f5"} Oct 08 22:08:35 crc kubenswrapper[4739]: I1008 22:08:35.797910 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-api-557d86854-5qdxs" podStartSLOduration=5.797894291 podStartE2EDuration="5.797894291s" podCreationTimestamp="2025-10-08 22:08:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:08:35.79661911 +0000 UTC m=+1215.622004860" watchObservedRunningTime="2025-10-08 22:08:35.797894291 +0000 UTC m=+1215.623280041" Oct 08 22:08:35 crc kubenswrapper[4739]: I1008 22:08:35.822935 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-84b966f6c9-6tj2b" podUID="11aeee8b-976d-4802-a571-476642cd56c8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.159:5353: i/o timeout" Oct 08 22:08:37 crc kubenswrapper[4739]: I1008 22:08:37.987647 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.078682 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8fe907f-2579-491e-95b8-a71e264e9ed4-run-httpd\") pod \"d8fe907f-2579-491e-95b8-a71e264e9ed4\" (UID: \"d8fe907f-2579-491e-95b8-a71e264e9ed4\") " Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.079090 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8fe907f-2579-491e-95b8-a71e264e9ed4-config-data\") pod \"d8fe907f-2579-491e-95b8-a71e264e9ed4\" (UID: \"d8fe907f-2579-491e-95b8-a71e264e9ed4\") " Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.079129 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8fe907f-2579-491e-95b8-a71e264e9ed4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d8fe907f-2579-491e-95b8-a71e264e9ed4" (UID: "d8fe907f-2579-491e-95b8-a71e264e9ed4"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.079137 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8fe907f-2579-491e-95b8-a71e264e9ed4-sg-core-conf-yaml\") pod \"d8fe907f-2579-491e-95b8-a71e264e9ed4\" (UID: \"d8fe907f-2579-491e-95b8-a71e264e9ed4\") "
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.079229 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8fe907f-2579-491e-95b8-a71e264e9ed4-combined-ca-bundle\") pod \"d8fe907f-2579-491e-95b8-a71e264e9ed4\" (UID: \"d8fe907f-2579-491e-95b8-a71e264e9ed4\") "
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.079295 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8fe907f-2579-491e-95b8-a71e264e9ed4-log-httpd\") pod \"d8fe907f-2579-491e-95b8-a71e264e9ed4\" (UID: \"d8fe907f-2579-491e-95b8-a71e264e9ed4\") "
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.079370 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8fe907f-2579-491e-95b8-a71e264e9ed4-scripts\") pod \"d8fe907f-2579-491e-95b8-a71e264e9ed4\" (UID: \"d8fe907f-2579-491e-95b8-a71e264e9ed4\") "
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.079480 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxw9f\" (UniqueName: \"kubernetes.io/projected/d8fe907f-2579-491e-95b8-a71e264e9ed4-kube-api-access-bxw9f\") pod \"d8fe907f-2579-491e-95b8-a71e264e9ed4\" (UID: \"d8fe907f-2579-491e-95b8-a71e264e9ed4\") "
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.080486 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8fe907f-2579-491e-95b8-a71e264e9ed4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d8fe907f-2579-491e-95b8-a71e264e9ed4" (UID: "d8fe907f-2579-491e-95b8-a71e264e9ed4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.080677 4739 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8fe907f-2579-491e-95b8-a71e264e9ed4-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.080708 4739 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8fe907f-2579-491e-95b8-a71e264e9ed4-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.087766 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8fe907f-2579-491e-95b8-a71e264e9ed4-scripts" (OuterVolumeSpecName: "scripts") pod "d8fe907f-2579-491e-95b8-a71e264e9ed4" (UID: "d8fe907f-2579-491e-95b8-a71e264e9ed4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.088033 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8fe907f-2579-491e-95b8-a71e264e9ed4-kube-api-access-bxw9f" (OuterVolumeSpecName: "kube-api-access-bxw9f") pod "d8fe907f-2579-491e-95b8-a71e264e9ed4" (UID: "d8fe907f-2579-491e-95b8-a71e264e9ed4"). InnerVolumeSpecName "kube-api-access-bxw9f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.159755 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8fe907f-2579-491e-95b8-a71e264e9ed4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d8fe907f-2579-491e-95b8-a71e264e9ed4" (UID: "d8fe907f-2579-491e-95b8-a71e264e9ed4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.182615 4739 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8fe907f-2579-491e-95b8-a71e264e9ed4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.182650 4739 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8fe907f-2579-491e-95b8-a71e264e9ed4-scripts\") on node \"crc\" DevicePath \"\""
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.182675 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxw9f\" (UniqueName: \"kubernetes.io/projected/d8fe907f-2579-491e-95b8-a71e264e9ed4-kube-api-access-bxw9f\") on node \"crc\" DevicePath \"\""
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.226389 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8fe907f-2579-491e-95b8-a71e264e9ed4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8fe907f-2579-491e-95b8-a71e264e9ed4" (UID: "d8fe907f-2579-491e-95b8-a71e264e9ed4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.259170 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8fe907f-2579-491e-95b8-a71e264e9ed4-config-data" (OuterVolumeSpecName: "config-data") pod "d8fe907f-2579-491e-95b8-a71e264e9ed4" (UID: "d8fe907f-2579-491e-95b8-a71e264e9ed4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.294668 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8fe907f-2579-491e-95b8-a71e264e9ed4-config-data\") on node \"crc\" DevicePath \"\""
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.294706 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8fe907f-2579-491e-95b8-a71e264e9ed4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.818916 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56dcfd46c8-rpb55" event={"ID":"9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59","Type":"ContainerStarted","Data":"e1c4c23482cd1d058d6f3641ad3f040e2ff90917cef1105dba5f36ac104e6d21"}
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.819063 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56dcfd46c8-rpb55"
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.822649 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8fe907f-2579-491e-95b8-a71e264e9ed4","Type":"ContainerDied","Data":"a74a3f8f1ed2465766d134c4260414baf3f2db3973f061d25e25714b0f2c6e14"}
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.822714 4739 scope.go:117] "RemoveContainer" containerID="5bea2606f8eeb8c0401c72f82d3ad931869a77e9919053abcd5ce20dd567d7d4"
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.822742 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.847323 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-56dcfd46c8-rpb55" podStartSLOduration=6.847294971 podStartE2EDuration="6.847294971s" podCreationTimestamp="2025-10-08 22:08:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:08:38.845058306 +0000 UTC m=+1218.670444096" watchObservedRunningTime="2025-10-08 22:08:38.847294971 +0000 UTC m=+1218.672680761"
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.911297 4739 scope.go:117] "RemoveContainer" containerID="fc001940f5597bf6843b72765a794c9af33a7163075ae447bad6a3cfe28395cd"
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.925884 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.972423 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.976590 4739 scope.go:117] "RemoveContainer" containerID="93a0d9d2a91e7922ca3f82a71f6212f214805f0ca1d86712b249a96acfd61d74"
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.990367 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 08 22:08:38 crc kubenswrapper[4739]: E1008 22:08:38.994517 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8fe907f-2579-491e-95b8-a71e264e9ed4" containerName="proxy-httpd"
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.994656 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8fe907f-2579-491e-95b8-a71e264e9ed4" containerName="proxy-httpd"
Oct 08 22:08:38 crc kubenswrapper[4739]: E1008 22:08:38.994744 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8fe907f-2579-491e-95b8-a71e264e9ed4" containerName="sg-core"
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.994801 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8fe907f-2579-491e-95b8-a71e264e9ed4" containerName="sg-core"
Oct 08 22:08:38 crc kubenswrapper[4739]: E1008 22:08:38.994875 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8fe907f-2579-491e-95b8-a71e264e9ed4" containerName="ceilometer-central-agent"
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.994930 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8fe907f-2579-491e-95b8-a71e264e9ed4" containerName="ceilometer-central-agent"
Oct 08 22:08:38 crc kubenswrapper[4739]: E1008 22:08:38.994995 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8fe907f-2579-491e-95b8-a71e264e9ed4" containerName="ceilometer-notification-agent"
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.995053 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8fe907f-2579-491e-95b8-a71e264e9ed4" containerName="ceilometer-notification-agent"
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.995964 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8fe907f-2579-491e-95b8-a71e264e9ed4" containerName="ceilometer-notification-agent"
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.996047 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8fe907f-2579-491e-95b8-a71e264e9ed4" containerName="proxy-httpd"
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.996125 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8fe907f-2579-491e-95b8-a71e264e9ed4" containerName="sg-core"
Oct 08 22:08:38 crc kubenswrapper[4739]: I1008 22:08:38.996221 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8fe907f-2579-491e-95b8-a71e264e9ed4" containerName="ceilometer-central-agent"
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.009106 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.012941 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.013281 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.020688 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.065356 4739 scope.go:117] "RemoveContainer" containerID="2105c5cf04f4b7bf905a864df4b0cafe0cc6629d89dc230faa9c2b64ba6009b5"
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.110664 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t62c\" (UniqueName: \"kubernetes.io/projected/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-kube-api-access-4t62c\") pod \"ceilometer-0\" (UID: \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\") " pod="openstack/ceilometer-0"
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.110757 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\") " pod="openstack/ceilometer-0"
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.110792 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-scripts\") pod \"ceilometer-0\" (UID: \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\") " pod="openstack/ceilometer-0"
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.110882 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-config-data\") pod \"ceilometer-0\" (UID: \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\") " pod="openstack/ceilometer-0"
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.110911 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-run-httpd\") pod \"ceilometer-0\" (UID: \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\") " pod="openstack/ceilometer-0"
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.110939 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\") " pod="openstack/ceilometer-0"
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.110986 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-log-httpd\") pod \"ceilometer-0\" (UID: \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\") " pod="openstack/ceilometer-0"
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.213727 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-config-data\") pod \"ceilometer-0\" (UID: \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\") " pod="openstack/ceilometer-0"
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.213774 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-run-httpd\") pod \"ceilometer-0\" (UID: \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\") " pod="openstack/ceilometer-0"
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.213805 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\") " pod="openstack/ceilometer-0"
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.213858 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-log-httpd\") pod \"ceilometer-0\" (UID: \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\") " pod="openstack/ceilometer-0"
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.213892 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t62c\" (UniqueName: \"kubernetes.io/projected/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-kube-api-access-4t62c\") pod \"ceilometer-0\" (UID: \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\") " pod="openstack/ceilometer-0"
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.213941 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\") " pod="openstack/ceilometer-0"
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.213969 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-scripts\") pod \"ceilometer-0\" (UID: \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\") " pod="openstack/ceilometer-0"
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.214453 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-run-httpd\") pod \"ceilometer-0\" (UID: \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\") " pod="openstack/ceilometer-0"
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.214719 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-log-httpd\") pod \"ceilometer-0\" (UID: \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\") " pod="openstack/ceilometer-0"
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.220828 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\") " pod="openstack/ceilometer-0"
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.224107 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-scripts\") pod \"ceilometer-0\" (UID: \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\") " pod="openstack/ceilometer-0"
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.225312 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\") " pod="openstack/ceilometer-0"
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.232897 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-config-data\") pod \"ceilometer-0\" (UID: \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\") " pod="openstack/ceilometer-0"
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.234060 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t62c\" (UniqueName: \"kubernetes.io/projected/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-kube-api-access-4t62c\") pod \"ceilometer-0\" (UID: \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\") " pod="openstack/ceilometer-0"
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.367624 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 08 22:08:39 crc kubenswrapper[4739]: W1008 22:08:39.908941 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda73b1ac2_0e17_4f62_8e4e_e6c41517e2e2.slice/crio-36dc3c4aceab17e0609bf4c8ff4cd6224adcbd3da6c07d3d16a10530af314aaa WatchSource:0}: Error finding container 36dc3c4aceab17e0609bf4c8ff4cd6224adcbd3da6c07d3d16a10530af314aaa: Status 404 returned error can't find the container with id 36dc3c4aceab17e0609bf4c8ff4cd6224adcbd3da6c07d3d16a10530af314aaa
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.913092 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8fe907f-2579-491e-95b8-a71e264e9ed4" path="/var/lib/kubelet/pods/d8fe907f-2579-491e-95b8-a71e264e9ed4/volumes"
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.917222 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz" podStartSLOduration=9.917195477 podStartE2EDuration="9.917195477s" podCreationTimestamp="2025-10-08 22:08:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:08:39.905389126 +0000 UTC m=+1219.730774876" watchObservedRunningTime="2025-10-08 22:08:39.917195477 +0000 UTC m=+1219.742581227"
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.988709 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.988762 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56dcfd46c8-rpb55"
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.988817 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-66b4c9b85f-r8lds" event={"ID":"1802465b-168a-449f-b8db-224a426d90ad","Type":"ContainerStarted","Data":"4efb7e0d5dc7a94bcf56e2e3f18216867eeb6fa03fd1c3e30e344acb2d1f4480"}
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.988846 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-66b4c9b85f-r8lds" event={"ID":"1802465b-168a-449f-b8db-224a426d90ad","Type":"ContainerStarted","Data":"d5bc30a80c2b8fa7a38e8863c3d98bec4532fb8035664fd70835edb7ea94fa34"}
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.988862 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-648fb84fdb-qmfb7" event={"ID":"a8e9e5fe-49e3-4fea-8a8e-b853c479ce94","Type":"ContainerStarted","Data":"56f05ab983a030ce6abab145276083c1dd12597b4e5cddb3a83463c1e34354fc"}
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.988875 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-648fb84fdb-qmfb7" event={"ID":"a8e9e5fe-49e3-4fea-8a8e-b853c479ce94","Type":"ContainerStarted","Data":"f4bbbc86ba0e359765e216ef771bfdeabbfdcfd272fda3e053470b431915c066"}
Oct 08 22:08:39 crc kubenswrapper[4739]: I1008 22:08:39.988885 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz" event={"ID":"439123d3-6874-4694-9790-f3ea65bde3a5","Type":"ContainerStarted","Data":"8e7685c230eb43484652a01ab584bd7ea2118cffc45ce73191a19d84fc060670"}
Oct 08 22:08:40 crc kubenswrapper[4739]: I1008 22:08:40.726290 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz"
Oct 08 22:08:40 crc kubenswrapper[4739]: I1008 22:08:40.887247 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2","Type":"ContainerStarted","Data":"36dc3c4aceab17e0609bf4c8ff4cd6224adcbd3da6c07d3d16a10530af314aaa"}
Oct 08 22:08:40 crc kubenswrapper[4739]: I1008 22:08:40.944943 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-66b4c9b85f-r8lds" podStartSLOduration=3.232844098 podStartE2EDuration="10.944905823s" podCreationTimestamp="2025-10-08 22:08:30 +0000 UTC" firstStartedPulling="2025-10-08 22:08:31.21295653 +0000 UTC m=+1211.038342280" lastFinishedPulling="2025-10-08 22:08:38.925018255 +0000 UTC m=+1218.750404005" observedRunningTime="2025-10-08 22:08:40.912878895 +0000 UTC m=+1220.738264665" watchObservedRunningTime="2025-10-08 22:08:40.944905823 +0000 UTC m=+1220.770291583"
Oct 08 22:08:40 crc kubenswrapper[4739]: I1008 22:08:40.950862 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-648fb84fdb-qmfb7" podStartSLOduration=3.279023707 podStartE2EDuration="10.95085138s" podCreationTimestamp="2025-10-08 22:08:30 +0000 UTC" firstStartedPulling="2025-10-08 22:08:31.240519059 +0000 UTC m=+1211.065904809" lastFinishedPulling="2025-10-08 22:08:38.912346732 +0000 UTC m=+1218.737732482" observedRunningTime="2025-10-08 22:08:40.945862067 +0000 UTC m=+1220.771247817" watchObservedRunningTime="2025-10-08 22:08:40.95085138 +0000 UTC m=+1220.776237140"
Oct 08 22:08:42 crc kubenswrapper[4739]: I1008 22:08:42.396271 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-557d86854-5qdxs"
Oct 08 22:08:42 crc kubenswrapper[4739]: I1008 22:08:42.634123 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56dcfd46c8-rpb55"
Oct 08 22:08:42 crc kubenswrapper[4739]: I1008 22:08:42.853397 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-557d86854-5qdxs"
Oct 08 22:08:43 crc kubenswrapper[4739]: I1008 22:08:43.295054 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-557d86854-5qdxs" podUID="1f8a80f3-ff67-4b19-86ee-82a5198d860a" containerName="barbican-api-log" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 08 22:08:43 crc kubenswrapper[4739]: I1008 22:08:43.928292 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2","Type":"ContainerStarted","Data":"980b204f8c3826b3a9a0d169e801d9838a95431689b81f9b4dafb886464eed2d"}
Oct 08 22:08:44 crc kubenswrapper[4739]: I1008 22:08:44.818054 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56dcfd46c8-rpb55"
Oct 08 22:08:44 crc kubenswrapper[4739]: I1008 22:08:44.937298 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-557d86854-5qdxs"]
Oct 08 22:08:44 crc kubenswrapper[4739]: I1008 22:08:44.937791 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-557d86854-5qdxs" podUID="1f8a80f3-ff67-4b19-86ee-82a5198d860a" containerName="barbican-api-log" containerID="cri-o://67942d1cbdb84656d44a43fcf4a56238e8e51b874b6f72458474e1b52f9b8123" gracePeriod=30
Oct 08 22:08:44 crc kubenswrapper[4739]: I1008 22:08:44.937881 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-557d86854-5qdxs" podUID="1f8a80f3-ff67-4b19-86ee-82a5198d860a" containerName="barbican-api" containerID="cri-o://60b3279bde484979e02287ebf593141e1e965aa13e3bdd6fa962100197c9a2cd" gracePeriod=30
Oct 08 22:08:44 crc kubenswrapper[4739]: I1008 22:08:44.959754 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2","Type":"ContainerStarted","Data":"d479005bfb8a5144571bd125be4d03c5cbdb3a9203eb942815e4c447829826d5"}
Oct 08 22:08:45 crc kubenswrapper[4739]: I1008 22:08:45.727444 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz"
Oct 08 22:08:45 crc kubenswrapper[4739]: I1008 22:08:45.794482 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-f5djz"]
Oct 08 22:08:45 crc kubenswrapper[4739]: I1008 22:08:45.794779 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b5c85b87-f5djz" podUID="3acc7290-6140-4870-a0d7-3bed7ac2b601" containerName="dnsmasq-dns" containerID="cri-o://2648bf9f73403d265860e283a99f9bd21b7511fcc32aa436c08be2965917ee54" gracePeriod=10
Oct 08 22:08:45 crc kubenswrapper[4739]: I1008 22:08:45.983242 4739 generic.go:334] "Generic (PLEG): container finished" podID="3acc7290-6140-4870-a0d7-3bed7ac2b601" containerID="2648bf9f73403d265860e283a99f9bd21b7511fcc32aa436c08be2965917ee54" exitCode=0
Oct 08 22:08:45 crc kubenswrapper[4739]: I1008 22:08:45.983348 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-f5djz" event={"ID":"3acc7290-6140-4870-a0d7-3bed7ac2b601","Type":"ContainerDied","Data":"2648bf9f73403d265860e283a99f9bd21b7511fcc32aa436c08be2965917ee54"}
Oct 08 22:08:45 crc kubenswrapper[4739]: I1008 22:08:45.994793 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2","Type":"ContainerStarted","Data":"bfe77dbb7323c53ba09c65c651c9fbda4042f4b7dd0dfea3371be5cf11896d43"}
Oct 08 22:08:46 crc kubenswrapper[4739]: I1008 22:08:46.008371 4739 generic.go:334] "Generic (PLEG): container finished" podID="1f8a80f3-ff67-4b19-86ee-82a5198d860a" containerID="67942d1cbdb84656d44a43fcf4a56238e8e51b874b6f72458474e1b52f9b8123" exitCode=143
Oct 08 22:08:46 crc kubenswrapper[4739]: I1008 22:08:46.008426 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-557d86854-5qdxs" event={"ID":"1f8a80f3-ff67-4b19-86ee-82a5198d860a","Type":"ContainerDied","Data":"67942d1cbdb84656d44a43fcf4a56238e8e51b874b6f72458474e1b52f9b8123"}
Oct 08 22:08:46 crc kubenswrapper[4739]: I1008 22:08:46.358096 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-f5djz"
Oct 08 22:08:46 crc kubenswrapper[4739]: I1008 22:08:46.477462 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3acc7290-6140-4870-a0d7-3bed7ac2b601-config\") pod \"3acc7290-6140-4870-a0d7-3bed7ac2b601\" (UID: \"3acc7290-6140-4870-a0d7-3bed7ac2b601\") "
Oct 08 22:08:46 crc kubenswrapper[4739]: I1008 22:08:46.477631 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3acc7290-6140-4870-a0d7-3bed7ac2b601-dns-svc\") pod \"3acc7290-6140-4870-a0d7-3bed7ac2b601\" (UID: \"3acc7290-6140-4870-a0d7-3bed7ac2b601\") "
Oct 08 22:08:46 crc kubenswrapper[4739]: I1008 22:08:46.477748 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqcr4\" (UniqueName: \"kubernetes.io/projected/3acc7290-6140-4870-a0d7-3bed7ac2b601-kube-api-access-rqcr4\") pod \"3acc7290-6140-4870-a0d7-3bed7ac2b601\" (UID: \"3acc7290-6140-4870-a0d7-3bed7ac2b601\") "
Oct 08 22:08:46 crc kubenswrapper[4739]: I1008 22:08:46.477801 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3acc7290-6140-4870-a0d7-3bed7ac2b601-ovsdbserver-nb\") pod \"3acc7290-6140-4870-a0d7-3bed7ac2b601\" (UID: \"3acc7290-6140-4870-a0d7-3bed7ac2b601\") "
Oct 08 22:08:46 crc kubenswrapper[4739]: I1008 22:08:46.477875 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3acc7290-6140-4870-a0d7-3bed7ac2b601-dns-swift-storage-0\") pod \"3acc7290-6140-4870-a0d7-3bed7ac2b601\" (UID: \"3acc7290-6140-4870-a0d7-3bed7ac2b601\") "
Oct 08 22:08:46 crc kubenswrapper[4739]: I1008 22:08:46.477922 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3acc7290-6140-4870-a0d7-3bed7ac2b601-ovsdbserver-sb\") pod \"3acc7290-6140-4870-a0d7-3bed7ac2b601\" (UID: \"3acc7290-6140-4870-a0d7-3bed7ac2b601\") "
Oct 08 22:08:46 crc kubenswrapper[4739]: I1008 22:08:46.485262 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3acc7290-6140-4870-a0d7-3bed7ac2b601-kube-api-access-rqcr4" (OuterVolumeSpecName: "kube-api-access-rqcr4") pod "3acc7290-6140-4870-a0d7-3bed7ac2b601" (UID: "3acc7290-6140-4870-a0d7-3bed7ac2b601"). InnerVolumeSpecName "kube-api-access-rqcr4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:08:46 crc kubenswrapper[4739]: I1008 22:08:46.541922 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3acc7290-6140-4870-a0d7-3bed7ac2b601-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3acc7290-6140-4870-a0d7-3bed7ac2b601" (UID: "3acc7290-6140-4870-a0d7-3bed7ac2b601"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:08:46 crc kubenswrapper[4739]: I1008 22:08:46.546779 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3acc7290-6140-4870-a0d7-3bed7ac2b601-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3acc7290-6140-4870-a0d7-3bed7ac2b601" (UID: "3acc7290-6140-4870-a0d7-3bed7ac2b601"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:08:46 crc kubenswrapper[4739]: I1008 22:08:46.548605 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3acc7290-6140-4870-a0d7-3bed7ac2b601-config" (OuterVolumeSpecName: "config") pod "3acc7290-6140-4870-a0d7-3bed7ac2b601" (UID: "3acc7290-6140-4870-a0d7-3bed7ac2b601"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:08:46 crc kubenswrapper[4739]: I1008 22:08:46.581555 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3acc7290-6140-4870-a0d7-3bed7ac2b601-config\") on node \"crc\" DevicePath \"\""
Oct 08 22:08:46 crc kubenswrapper[4739]: I1008 22:08:46.581603 4739 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3acc7290-6140-4870-a0d7-3bed7ac2b601-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 08 22:08:46 crc kubenswrapper[4739]: I1008 22:08:46.581618 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqcr4\" (UniqueName: \"kubernetes.io/projected/3acc7290-6140-4870-a0d7-3bed7ac2b601-kube-api-access-rqcr4\") on node \"crc\" DevicePath \"\""
Oct 08 22:08:46 crc kubenswrapper[4739]: I1008 22:08:46.581634 4739 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3acc7290-6140-4870-a0d7-3bed7ac2b601-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 08 22:08:46 crc kubenswrapper[4739]: I1008 22:08:46.584434 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3acc7290-6140-4870-a0d7-3bed7ac2b601-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3acc7290-6140-4870-a0d7-3bed7ac2b601" (UID: "3acc7290-6140-4870-a0d7-3bed7ac2b601"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:08:46 crc kubenswrapper[4739]: I1008 22:08:46.602225 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3acc7290-6140-4870-a0d7-3bed7ac2b601-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3acc7290-6140-4870-a0d7-3bed7ac2b601" (UID: "3acc7290-6140-4870-a0d7-3bed7ac2b601"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:08:46 crc kubenswrapper[4739]: I1008 22:08:46.683954 4739 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3acc7290-6140-4870-a0d7-3bed7ac2b601-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 08 22:08:46 crc kubenswrapper[4739]: I1008 22:08:46.683999 4739 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3acc7290-6140-4870-a0d7-3bed7ac2b601-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 08 22:08:47 crc kubenswrapper[4739]: I1008 22:08:47.017796 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-f5djz" event={"ID":"3acc7290-6140-4870-a0d7-3bed7ac2b601","Type":"ContainerDied","Data":"996b62a3f42ea9a67bd41116670c0c2219eb143a77e7dfc52d2a18c3f7037a55"}
Oct 08 22:08:47 crc kubenswrapper[4739]: I1008 22:08:47.017846 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-f5djz"
Oct 08 22:08:47 crc kubenswrapper[4739]: I1008 22:08:47.017887 4739 scope.go:117] "RemoveContainer" containerID="2648bf9f73403d265860e283a99f9bd21b7511fcc32aa436c08be2965917ee54"
Oct 08 22:08:47 crc kubenswrapper[4739]: I1008 22:08:47.042752 4739 scope.go:117] "RemoveContainer" containerID="b3bfa9a9e3a072bb275aeab0302013654f9915b28fb334f5fc6ad4758ea004c5"
Oct 08 22:08:47 crc kubenswrapper[4739]: I1008 22:08:47.050474 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-f5djz"]
Oct 08 22:08:47 crc kubenswrapper[4739]: I1008 22:08:47.060690 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-f5djz"]
Oct 08 22:08:47 crc kubenswrapper[4739]: I1008 22:08:47.841654 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3acc7290-6140-4870-a0d7-3bed7ac2b601" path="/var/lib/kubelet/pods/3acc7290-6140-4870-a0d7-3bed7ac2b601/volumes"
Oct 08 22:08:48 crc kubenswrapper[4739]: I1008 22:08:48.128532 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-557d86854-5qdxs" podUID="1f8a80f3-ff67-4b19-86ee-82a5198d860a" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:53526->10.217.0.165:9311: read: connection reset by peer"
Oct 08 22:08:48 crc kubenswrapper[4739]: I1008 22:08:48.128610 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-557d86854-5qdxs" podUID="1f8a80f3-ff67-4b19-86ee-82a5198d860a" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:53532->10.217.0.165:9311: read: connection reset by peer"
Oct 08 22:08:49 crc kubenswrapper[4739]: I1008 22:08:49.076180 4739 generic.go:334] "Generic (PLEG): container finished" podID="1f8a80f3-ff67-4b19-86ee-82a5198d860a" containerID="60b3279bde484979e02287ebf593141e1e965aa13e3bdd6fa962100197c9a2cd" exitCode=0
Oct 08 22:08:49 crc kubenswrapper[4739]: I1008 22:08:49.076271 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-557d86854-5qdxs" event={"ID":"1f8a80f3-ff67-4b19-86ee-82a5198d860a","Type":"ContainerDied","Data":"60b3279bde484979e02287ebf593141e1e965aa13e3bdd6fa962100197c9a2cd"}
Oct 08 22:08:49 crc kubenswrapper[4739]: I1008 22:08:49.089294 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6c88568bb8-rh6ln"
Oct 08 22:08:49 crc kubenswrapper[4739]: I1008 22:08:49.089365 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6c88568bb8-rh6ln"
Oct 08 22:08:49 crc kubenswrapper[4739]: I1008 22:08:49.164736 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-557d86854-5qdxs"
Oct 08 22:08:49 crc kubenswrapper[4739]: I1008 22:08:49.242658 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8a80f3-ff67-4b19-86ee-82a5198d860a-config-data\") pod \"1f8a80f3-ff67-4b19-86ee-82a5198d860a\" (UID: \"1f8a80f3-ff67-4b19-86ee-82a5198d860a\") "
Oct 08 22:08:49 crc kubenswrapper[4739]: I1008 22:08:49.242906 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8a80f3-ff67-4b19-86ee-82a5198d860a-combined-ca-bundle\") pod \"1f8a80f3-ff67-4b19-86ee-82a5198d860a\" (UID: \"1f8a80f3-ff67-4b19-86ee-82a5198d860a\") "
Oct 08 22:08:49 crc kubenswrapper[4739]: I1008 22:08:49.242991 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jknvx\" (UniqueName: \"kubernetes.io/projected/1f8a80f3-ff67-4b19-86ee-82a5198d860a-kube-api-access-jknvx\") pod \"1f8a80f3-ff67-4b19-86ee-82a5198d860a\" (UID:
\"1f8a80f3-ff67-4b19-86ee-82a5198d860a\") " Oct 08 22:08:49 crc kubenswrapper[4739]: I1008 22:08:49.243066 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f8a80f3-ff67-4b19-86ee-82a5198d860a-config-data-custom\") pod \"1f8a80f3-ff67-4b19-86ee-82a5198d860a\" (UID: \"1f8a80f3-ff67-4b19-86ee-82a5198d860a\") " Oct 08 22:08:49 crc kubenswrapper[4739]: I1008 22:08:49.243198 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f8a80f3-ff67-4b19-86ee-82a5198d860a-logs\") pod \"1f8a80f3-ff67-4b19-86ee-82a5198d860a\" (UID: \"1f8a80f3-ff67-4b19-86ee-82a5198d860a\") " Oct 08 22:08:49 crc kubenswrapper[4739]: I1008 22:08:49.244170 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f8a80f3-ff67-4b19-86ee-82a5198d860a-logs" (OuterVolumeSpecName: "logs") pod "1f8a80f3-ff67-4b19-86ee-82a5198d860a" (UID: "1f8a80f3-ff67-4b19-86ee-82a5198d860a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:08:49 crc kubenswrapper[4739]: I1008 22:08:49.253774 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8a80f3-ff67-4b19-86ee-82a5198d860a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1f8a80f3-ff67-4b19-86ee-82a5198d860a" (UID: "1f8a80f3-ff67-4b19-86ee-82a5198d860a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:08:49 crc kubenswrapper[4739]: I1008 22:08:49.253820 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f8a80f3-ff67-4b19-86ee-82a5198d860a-kube-api-access-jknvx" (OuterVolumeSpecName: "kube-api-access-jknvx") pod "1f8a80f3-ff67-4b19-86ee-82a5198d860a" (UID: "1f8a80f3-ff67-4b19-86ee-82a5198d860a"). 
InnerVolumeSpecName "kube-api-access-jknvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:08:49 crc kubenswrapper[4739]: I1008 22:08:49.333287 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8a80f3-ff67-4b19-86ee-82a5198d860a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f8a80f3-ff67-4b19-86ee-82a5198d860a" (UID: "1f8a80f3-ff67-4b19-86ee-82a5198d860a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:08:49 crc kubenswrapper[4739]: I1008 22:08:49.362521 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8a80f3-ff67-4b19-86ee-82a5198d860a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:49 crc kubenswrapper[4739]: I1008 22:08:49.362564 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jknvx\" (UniqueName: \"kubernetes.io/projected/1f8a80f3-ff67-4b19-86ee-82a5198d860a-kube-api-access-jknvx\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:49 crc kubenswrapper[4739]: I1008 22:08:49.362580 4739 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f8a80f3-ff67-4b19-86ee-82a5198d860a-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:49 crc kubenswrapper[4739]: I1008 22:08:49.362593 4739 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f8a80f3-ff67-4b19-86ee-82a5198d860a-logs\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:49 crc kubenswrapper[4739]: I1008 22:08:49.377435 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8a80f3-ff67-4b19-86ee-82a5198d860a-config-data" (OuterVolumeSpecName: "config-data") pod "1f8a80f3-ff67-4b19-86ee-82a5198d860a" (UID: "1f8a80f3-ff67-4b19-86ee-82a5198d860a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:08:49 crc kubenswrapper[4739]: I1008 22:08:49.438253 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-55468d9c4f-z8pn5" Oct 08 22:08:49 crc kubenswrapper[4739]: I1008 22:08:49.464465 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8a80f3-ff67-4b19-86ee-82a5198d860a-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:50 crc kubenswrapper[4739]: I1008 22:08:50.087019 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2","Type":"ContainerStarted","Data":"985befba2e91f8556a1510eb838d925f9b4f9eadc61516d694d9259539f32e9f"} Oct 08 22:08:50 crc kubenswrapper[4739]: I1008 22:08:50.087416 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 22:08:50 crc kubenswrapper[4739]: I1008 22:08:50.089924 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-557d86854-5qdxs" event={"ID":"1f8a80f3-ff67-4b19-86ee-82a5198d860a","Type":"ContainerDied","Data":"ae264a9960193bc861b6e37d4812bd1bcbb4c598ad7b6b4b78677a2c050e9474"} Oct 08 22:08:50 crc kubenswrapper[4739]: I1008 22:08:50.089956 4739 scope.go:117] "RemoveContainer" containerID="60b3279bde484979e02287ebf593141e1e965aa13e3bdd6fa962100197c9a2cd" Oct 08 22:08:50 crc kubenswrapper[4739]: I1008 22:08:50.090074 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-557d86854-5qdxs" Oct 08 22:08:50 crc kubenswrapper[4739]: I1008 22:08:50.121238 4739 scope.go:117] "RemoveContainer" containerID="67942d1cbdb84656d44a43fcf4a56238e8e51b874b6f72458474e1b52f9b8123" Oct 08 22:08:50 crc kubenswrapper[4739]: I1008 22:08:50.129040 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.132453722 podStartE2EDuration="12.129025257s" podCreationTimestamp="2025-10-08 22:08:38 +0000 UTC" firstStartedPulling="2025-10-08 22:08:39.91326405 +0000 UTC m=+1219.738649800" lastFinishedPulling="2025-10-08 22:08:48.909835585 +0000 UTC m=+1228.735221335" observedRunningTime="2025-10-08 22:08:50.122411804 +0000 UTC m=+1229.947797554" watchObservedRunningTime="2025-10-08 22:08:50.129025257 +0000 UTC m=+1229.954411007" Oct 08 22:08:50 crc kubenswrapper[4739]: I1008 22:08:50.150267 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-557d86854-5qdxs"] Oct 08 22:08:50 crc kubenswrapper[4739]: I1008 22:08:50.152751 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-557d86854-5qdxs"] Oct 08 22:08:50 crc kubenswrapper[4739]: I1008 22:08:50.963205 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-cf7957586-49vcm" Oct 08 22:08:51 crc kubenswrapper[4739]: I1008 22:08:51.765859 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:08:51 crc kubenswrapper[4739]: I1008 22:08:51.766526 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:08:51 crc kubenswrapper[4739]: I1008 22:08:51.844374 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f8a80f3-ff67-4b19-86ee-82a5198d860a" path="/var/lib/kubelet/pods/1f8a80f3-ff67-4b19-86ee-82a5198d860a/volumes" Oct 08 22:08:52 crc kubenswrapper[4739]: I1008 22:08:52.867091 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-dc6d4cfc5-7ks2h" Oct 08 22:08:52 crc kubenswrapper[4739]: I1008 22:08:52.948398 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-cf7957586-49vcm"] Oct 08 22:08:52 crc kubenswrapper[4739]: I1008 22:08:52.948753 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-cf7957586-49vcm" podUID="7fe20a8d-2c00-4441-92dd-f92da148433b" containerName="neutron-api" containerID="cri-o://48066fd56d175a613e4a0fe33a8e50ae919b1b5b7746e611b87b000561c2bf00" gracePeriod=30 Oct 08 22:08:52 crc kubenswrapper[4739]: I1008 22:08:52.949574 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-cf7957586-49vcm" podUID="7fe20a8d-2c00-4441-92dd-f92da148433b" containerName="neutron-httpd" containerID="cri-o://9bf91a6876d628402308360b4414bb932f05d5681ecf00e0a542a9c13abe57e0" gracePeriod=30 Oct 08 22:08:53 crc kubenswrapper[4739]: I1008 22:08:53.984590 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 08 22:08:53 crc kubenswrapper[4739]: E1008 22:08:53.985484 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f8a80f3-ff67-4b19-86ee-82a5198d860a" containerName="barbican-api-log" Oct 08 22:08:53 crc kubenswrapper[4739]: I1008 22:08:53.985503 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f8a80f3-ff67-4b19-86ee-82a5198d860a" containerName="barbican-api-log" Oct 08 22:08:53 crc 
kubenswrapper[4739]: E1008 22:08:53.985515 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3acc7290-6140-4870-a0d7-3bed7ac2b601" containerName="init" Oct 08 22:08:53 crc kubenswrapper[4739]: I1008 22:08:53.985523 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="3acc7290-6140-4870-a0d7-3bed7ac2b601" containerName="init" Oct 08 22:08:53 crc kubenswrapper[4739]: E1008 22:08:53.985544 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3acc7290-6140-4870-a0d7-3bed7ac2b601" containerName="dnsmasq-dns" Oct 08 22:08:53 crc kubenswrapper[4739]: I1008 22:08:53.985552 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="3acc7290-6140-4870-a0d7-3bed7ac2b601" containerName="dnsmasq-dns" Oct 08 22:08:53 crc kubenswrapper[4739]: E1008 22:08:53.985575 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f8a80f3-ff67-4b19-86ee-82a5198d860a" containerName="barbican-api" Oct 08 22:08:53 crc kubenswrapper[4739]: I1008 22:08:53.985580 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f8a80f3-ff67-4b19-86ee-82a5198d860a" containerName="barbican-api" Oct 08 22:08:53 crc kubenswrapper[4739]: I1008 22:08:53.985770 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="3acc7290-6140-4870-a0d7-3bed7ac2b601" containerName="dnsmasq-dns" Oct 08 22:08:53 crc kubenswrapper[4739]: I1008 22:08:53.985790 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f8a80f3-ff67-4b19-86ee-82a5198d860a" containerName="barbican-api-log" Oct 08 22:08:53 crc kubenswrapper[4739]: I1008 22:08:53.985804 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f8a80f3-ff67-4b19-86ee-82a5198d860a" containerName="barbican-api" Oct 08 22:08:53 crc kubenswrapper[4739]: I1008 22:08:53.987906 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 08 22:08:53 crc kubenswrapper[4739]: I1008 22:08:53.995597 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 08 22:08:53 crc kubenswrapper[4739]: I1008 22:08:53.995603 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 08 22:08:53 crc kubenswrapper[4739]: I1008 22:08:53.995807 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-rv69g" Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.000332 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.054523 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5-openstack-config\") pod \"openstackclient\" (UID: \"37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5\") " pod="openstack/openstackclient" Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.054617 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5-openstack-config-secret\") pod \"openstackclient\" (UID: \"37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5\") " pod="openstack/openstackclient" Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.054650 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5\") " pod="openstack/openstackclient" Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.054838 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fqt5\" (UniqueName: \"kubernetes.io/projected/37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5-kube-api-access-5fqt5\") pod \"openstackclient\" (UID: \"37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5\") " pod="openstack/openstackclient" Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.132790 4739 generic.go:334] "Generic (PLEG): container finished" podID="4da3e49a-b4ae-4375-893f-47d64b4eb0b5" containerID="cfaea9716ef442a82d7d1442e5293c3a41541f51fe6a306462de76aad7abb370" exitCode=0 Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.132855 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vvmhx" event={"ID":"4da3e49a-b4ae-4375-893f-47d64b4eb0b5","Type":"ContainerDied","Data":"cfaea9716ef442a82d7d1442e5293c3a41541f51fe6a306462de76aad7abb370"} Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.135689 4739 generic.go:334] "Generic (PLEG): container finished" podID="7fe20a8d-2c00-4441-92dd-f92da148433b" containerID="9bf91a6876d628402308360b4414bb932f05d5681ecf00e0a542a9c13abe57e0" exitCode=0 Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.135717 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cf7957586-49vcm" event={"ID":"7fe20a8d-2c00-4441-92dd-f92da148433b","Type":"ContainerDied","Data":"9bf91a6876d628402308360b4414bb932f05d5681ecf00e0a542a9c13abe57e0"} Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.161073 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fqt5\" (UniqueName: \"kubernetes.io/projected/37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5-kube-api-access-5fqt5\") pod \"openstackclient\" (UID: \"37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5\") " pod="openstack/openstackclient" Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.161439 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5-openstack-config\") pod \"openstackclient\" (UID: \"37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5\") " pod="openstack/openstackclient" Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.161601 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5-openstack-config-secret\") pod \"openstackclient\" (UID: \"37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5\") " pod="openstack/openstackclient" Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.161630 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5\") " pod="openstack/openstackclient" Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.162300 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5-openstack-config\") pod \"openstackclient\" (UID: \"37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5\") " pod="openstack/openstackclient" Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.168563 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5\") " pod="openstack/openstackclient" Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.169575 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5-openstack-config-secret\") pod \"openstackclient\" (UID: \"37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5\") " 
pod="openstack/openstackclient" Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.182222 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fqt5\" (UniqueName: \"kubernetes.io/projected/37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5-kube-api-access-5fqt5\") pod \"openstackclient\" (UID: \"37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5\") " pod="openstack/openstackclient" Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.320595 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.794448 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.902403 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5c89ccbcd7-dlxrn"] Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.903915 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.906835 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.907906 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.908262 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.918679 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5c89ccbcd7-dlxrn"] Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.980683 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/04e3fccb-ef13-4d04-9310-e1aec36adefe-log-httpd\") 
pod \"swift-proxy-5c89ccbcd7-dlxrn\" (UID: \"04e3fccb-ef13-4d04-9310-e1aec36adefe\") " pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.980759 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/04e3fccb-ef13-4d04-9310-e1aec36adefe-internal-tls-certs\") pod \"swift-proxy-5c89ccbcd7-dlxrn\" (UID: \"04e3fccb-ef13-4d04-9310-e1aec36adefe\") " pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.980923 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e3fccb-ef13-4d04-9310-e1aec36adefe-combined-ca-bundle\") pod \"swift-proxy-5c89ccbcd7-dlxrn\" (UID: \"04e3fccb-ef13-4d04-9310-e1aec36adefe\") " pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.980968 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/04e3fccb-ef13-4d04-9310-e1aec36adefe-run-httpd\") pod \"swift-proxy-5c89ccbcd7-dlxrn\" (UID: \"04e3fccb-ef13-4d04-9310-e1aec36adefe\") " pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.981088 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82th4\" (UniqueName: \"kubernetes.io/projected/04e3fccb-ef13-4d04-9310-e1aec36adefe-kube-api-access-82th4\") pod \"swift-proxy-5c89ccbcd7-dlxrn\" (UID: \"04e3fccb-ef13-4d04-9310-e1aec36adefe\") " pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.981165 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/04e3fccb-ef13-4d04-9310-e1aec36adefe-etc-swift\") pod \"swift-proxy-5c89ccbcd7-dlxrn\" (UID: \"04e3fccb-ef13-4d04-9310-e1aec36adefe\") " pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.981223 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e3fccb-ef13-4d04-9310-e1aec36adefe-config-data\") pod \"swift-proxy-5c89ccbcd7-dlxrn\" (UID: \"04e3fccb-ef13-4d04-9310-e1aec36adefe\") " pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" Oct 08 22:08:54 crc kubenswrapper[4739]: I1008 22:08:54.981400 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/04e3fccb-ef13-4d04-9310-e1aec36adefe-public-tls-certs\") pod \"swift-proxy-5c89ccbcd7-dlxrn\" (UID: \"04e3fccb-ef13-4d04-9310-e1aec36adefe\") " pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.083925 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e3fccb-ef13-4d04-9310-e1aec36adefe-combined-ca-bundle\") pod \"swift-proxy-5c89ccbcd7-dlxrn\" (UID: \"04e3fccb-ef13-4d04-9310-e1aec36adefe\") " pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.083982 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/04e3fccb-ef13-4d04-9310-e1aec36adefe-run-httpd\") pod \"swift-proxy-5c89ccbcd7-dlxrn\" (UID: \"04e3fccb-ef13-4d04-9310-e1aec36adefe\") " pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.084048 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82th4\" (UniqueName: 
\"kubernetes.io/projected/04e3fccb-ef13-4d04-9310-e1aec36adefe-kube-api-access-82th4\") pod \"swift-proxy-5c89ccbcd7-dlxrn\" (UID: \"04e3fccb-ef13-4d04-9310-e1aec36adefe\") " pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.084084 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/04e3fccb-ef13-4d04-9310-e1aec36adefe-etc-swift\") pod \"swift-proxy-5c89ccbcd7-dlxrn\" (UID: \"04e3fccb-ef13-4d04-9310-e1aec36adefe\") " pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.084120 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e3fccb-ef13-4d04-9310-e1aec36adefe-config-data\") pod \"swift-proxy-5c89ccbcd7-dlxrn\" (UID: \"04e3fccb-ef13-4d04-9310-e1aec36adefe\") " pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.084159 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/04e3fccb-ef13-4d04-9310-e1aec36adefe-public-tls-certs\") pod \"swift-proxy-5c89ccbcd7-dlxrn\" (UID: \"04e3fccb-ef13-4d04-9310-e1aec36adefe\") " pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.084240 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/04e3fccb-ef13-4d04-9310-e1aec36adefe-log-httpd\") pod \"swift-proxy-5c89ccbcd7-dlxrn\" (UID: \"04e3fccb-ef13-4d04-9310-e1aec36adefe\") " pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.084275 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/04e3fccb-ef13-4d04-9310-e1aec36adefe-internal-tls-certs\") pod \"swift-proxy-5c89ccbcd7-dlxrn\" (UID: \"04e3fccb-ef13-4d04-9310-e1aec36adefe\") " pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.084850 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/04e3fccb-ef13-4d04-9310-e1aec36adefe-run-httpd\") pod \"swift-proxy-5c89ccbcd7-dlxrn\" (UID: \"04e3fccb-ef13-4d04-9310-e1aec36adefe\") " pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.085426 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/04e3fccb-ef13-4d04-9310-e1aec36adefe-log-httpd\") pod \"swift-proxy-5c89ccbcd7-dlxrn\" (UID: \"04e3fccb-ef13-4d04-9310-e1aec36adefe\") " pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.093656 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/04e3fccb-ef13-4d04-9310-e1aec36adefe-etc-swift\") pod \"swift-proxy-5c89ccbcd7-dlxrn\" (UID: \"04e3fccb-ef13-4d04-9310-e1aec36adefe\") " pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.094109 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e3fccb-ef13-4d04-9310-e1aec36adefe-config-data\") pod \"swift-proxy-5c89ccbcd7-dlxrn\" (UID: \"04e3fccb-ef13-4d04-9310-e1aec36adefe\") " pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.095432 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/04e3fccb-ef13-4d04-9310-e1aec36adefe-public-tls-certs\") pod \"swift-proxy-5c89ccbcd7-dlxrn\" (UID: 
\"04e3fccb-ef13-4d04-9310-e1aec36adefe\") " pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.102629 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/04e3fccb-ef13-4d04-9310-e1aec36adefe-internal-tls-certs\") pod \"swift-proxy-5c89ccbcd7-dlxrn\" (UID: \"04e3fccb-ef13-4d04-9310-e1aec36adefe\") " pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.103217 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e3fccb-ef13-4d04-9310-e1aec36adefe-combined-ca-bundle\") pod \"swift-proxy-5c89ccbcd7-dlxrn\" (UID: \"04e3fccb-ef13-4d04-9310-e1aec36adefe\") " pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.106679 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82th4\" (UniqueName: \"kubernetes.io/projected/04e3fccb-ef13-4d04-9310-e1aec36adefe-kube-api-access-82th4\") pod \"swift-proxy-5c89ccbcd7-dlxrn\" (UID: \"04e3fccb-ef13-4d04-9310-e1aec36adefe\") " pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.152221 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5","Type":"ContainerStarted","Data":"6266e0698c625121990a82f4a14fa966402a726dc0b81fefc8182353f4a23046"} Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.244587 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.660928 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-vvmhx" Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.803244 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g29mk\" (UniqueName: \"kubernetes.io/projected/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-kube-api-access-g29mk\") pod \"4da3e49a-b4ae-4375-893f-47d64b4eb0b5\" (UID: \"4da3e49a-b4ae-4375-893f-47d64b4eb0b5\") " Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.803342 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-db-sync-config-data\") pod \"4da3e49a-b4ae-4375-893f-47d64b4eb0b5\" (UID: \"4da3e49a-b4ae-4375-893f-47d64b4eb0b5\") " Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.803400 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-scripts\") pod \"4da3e49a-b4ae-4375-893f-47d64b4eb0b5\" (UID: \"4da3e49a-b4ae-4375-893f-47d64b4eb0b5\") " Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.803468 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-config-data\") pod \"4da3e49a-b4ae-4375-893f-47d64b4eb0b5\" (UID: \"4da3e49a-b4ae-4375-893f-47d64b4eb0b5\") " Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.803577 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-combined-ca-bundle\") pod \"4da3e49a-b4ae-4375-893f-47d64b4eb0b5\" (UID: \"4da3e49a-b4ae-4375-893f-47d64b4eb0b5\") " Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.803631 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-etc-machine-id\") pod \"4da3e49a-b4ae-4375-893f-47d64b4eb0b5\" (UID: \"4da3e49a-b4ae-4375-893f-47d64b4eb0b5\") " Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.804017 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4da3e49a-b4ae-4375-893f-47d64b4eb0b5" (UID: "4da3e49a-b4ae-4375-893f-47d64b4eb0b5"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.812299 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-scripts" (OuterVolumeSpecName: "scripts") pod "4da3e49a-b4ae-4375-893f-47d64b4eb0b5" (UID: "4da3e49a-b4ae-4375-893f-47d64b4eb0b5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.812361 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-kube-api-access-g29mk" (OuterVolumeSpecName: "kube-api-access-g29mk") pod "4da3e49a-b4ae-4375-893f-47d64b4eb0b5" (UID: "4da3e49a-b4ae-4375-893f-47d64b4eb0b5"). InnerVolumeSpecName "kube-api-access-g29mk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.812395 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4da3e49a-b4ae-4375-893f-47d64b4eb0b5" (UID: "4da3e49a-b4ae-4375-893f-47d64b4eb0b5"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.856968 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4da3e49a-b4ae-4375-893f-47d64b4eb0b5" (UID: "4da3e49a-b4ae-4375-893f-47d64b4eb0b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.885987 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-config-data" (OuterVolumeSpecName: "config-data") pod "4da3e49a-b4ae-4375-893f-47d64b4eb0b5" (UID: "4da3e49a-b4ae-4375-893f-47d64b4eb0b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.909963 4739 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.910022 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g29mk\" (UniqueName: \"kubernetes.io/projected/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-kube-api-access-g29mk\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.910036 4739 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.910046 4739 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-scripts\") on node \"crc\" DevicePath 
\"\"" Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.910057 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.910069 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4da3e49a-b4ae-4375-893f-47d64b4eb0b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:08:55 crc kubenswrapper[4739]: I1008 22:08:55.947608 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5c89ccbcd7-dlxrn"] Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.182005 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" event={"ID":"04e3fccb-ef13-4d04-9310-e1aec36adefe","Type":"ContainerStarted","Data":"3d3c965a7b615aa7ca6f0dc5a99dc6bb46ad0ad76dd062b60897f3aa5d56e0ab"} Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.185482 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vvmhx" event={"ID":"4da3e49a-b4ae-4375-893f-47d64b4eb0b5","Type":"ContainerDied","Data":"185ce260bf266eccdbfd85ba074228ae9fe027e42f58c642b5fb887d97480db4"} Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.185530 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="185ce260bf266eccdbfd85ba074228ae9fe027e42f58c642b5fb887d97480db4" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.185588 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-vvmhx" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.489050 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 22:08:56 crc kubenswrapper[4739]: E1008 22:08:56.498896 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4da3e49a-b4ae-4375-893f-47d64b4eb0b5" containerName="cinder-db-sync" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.498939 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="4da3e49a-b4ae-4375-893f-47d64b4eb0b5" containerName="cinder-db-sync" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.499220 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="4da3e49a-b4ae-4375-893f-47d64b4eb0b5" containerName="cinder-db-sync" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.500638 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.505056 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.505385 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-q6fzs" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.505588 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.506114 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.527687 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.618817 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-tfvvh"] Oct 08 22:08:56 crc 
kubenswrapper[4739]: I1008 22:08:56.620637 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-tfvvh" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.635811 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d60573d4-e919-482c-aa2d-a46770b6c0ca-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d60573d4-e919-482c-aa2d-a46770b6c0ca\") " pod="openstack/cinder-scheduler-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.635858 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5md2\" (UniqueName: \"kubernetes.io/projected/d60573d4-e919-482c-aa2d-a46770b6c0ca-kube-api-access-k5md2\") pod \"cinder-scheduler-0\" (UID: \"d60573d4-e919-482c-aa2d-a46770b6c0ca\") " pod="openstack/cinder-scheduler-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.635898 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d60573d4-e919-482c-aa2d-a46770b6c0ca-config-data\") pod \"cinder-scheduler-0\" (UID: \"d60573d4-e919-482c-aa2d-a46770b6c0ca\") " pod="openstack/cinder-scheduler-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.635924 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d60573d4-e919-482c-aa2d-a46770b6c0ca-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d60573d4-e919-482c-aa2d-a46770b6c0ca\") " pod="openstack/cinder-scheduler-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.635955 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d60573d4-e919-482c-aa2d-a46770b6c0ca-scripts\") pod 
\"cinder-scheduler-0\" (UID: \"d60573d4-e919-482c-aa2d-a46770b6c0ca\") " pod="openstack/cinder-scheduler-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.636021 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d60573d4-e919-482c-aa2d-a46770b6c0ca-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d60573d4-e919-482c-aa2d-a46770b6c0ca\") " pod="openstack/cinder-scheduler-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.657659 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-tfvvh"] Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.738335 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.738469 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b13865e1-4812-4d07-8b31-e488ab164399-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-tfvvh\" (UID: \"b13865e1-4812-4d07-8b31-e488ab164399\") " pod="openstack/dnsmasq-dns-5784cf869f-tfvvh" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.738583 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b13865e1-4812-4d07-8b31-e488ab164399-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-tfvvh\" (UID: \"b13865e1-4812-4d07-8b31-e488ab164399\") " pod="openstack/dnsmasq-dns-5784cf869f-tfvvh" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.738756 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b13865e1-4812-4d07-8b31-e488ab164399-dns-svc\") pod \"dnsmasq-dns-5784cf869f-tfvvh\" (UID: \"b13865e1-4812-4d07-8b31-e488ab164399\") " 
pod="openstack/dnsmasq-dns-5784cf869f-tfvvh" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.738892 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d60573d4-e919-482c-aa2d-a46770b6c0ca-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d60573d4-e919-482c-aa2d-a46770b6c0ca\") " pod="openstack/cinder-scheduler-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.738936 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5md2\" (UniqueName: \"kubernetes.io/projected/d60573d4-e919-482c-aa2d-a46770b6c0ca-kube-api-access-k5md2\") pod \"cinder-scheduler-0\" (UID: \"d60573d4-e919-482c-aa2d-a46770b6c0ca\") " pod="openstack/cinder-scheduler-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.739015 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d60573d4-e919-482c-aa2d-a46770b6c0ca-config-data\") pod \"cinder-scheduler-0\" (UID: \"d60573d4-e919-482c-aa2d-a46770b6c0ca\") " pod="openstack/cinder-scheduler-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.739071 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d60573d4-e919-482c-aa2d-a46770b6c0ca-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d60573d4-e919-482c-aa2d-a46770b6c0ca\") " pod="openstack/cinder-scheduler-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.739208 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d60573d4-e919-482c-aa2d-a46770b6c0ca-scripts\") pod \"cinder-scheduler-0\" (UID: \"d60573d4-e919-482c-aa2d-a46770b6c0ca\") " pod="openstack/cinder-scheduler-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.739232 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kbwp\" (UniqueName: \"kubernetes.io/projected/b13865e1-4812-4d07-8b31-e488ab164399-kube-api-access-5kbwp\") pod \"dnsmasq-dns-5784cf869f-tfvvh\" (UID: \"b13865e1-4812-4d07-8b31-e488ab164399\") " pod="openstack/dnsmasq-dns-5784cf869f-tfvvh" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.739288 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b13865e1-4812-4d07-8b31-e488ab164399-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-tfvvh\" (UID: \"b13865e1-4812-4d07-8b31-e488ab164399\") " pod="openstack/dnsmasq-dns-5784cf869f-tfvvh" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.739454 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d60573d4-e919-482c-aa2d-a46770b6c0ca-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d60573d4-e919-482c-aa2d-a46770b6c0ca\") " pod="openstack/cinder-scheduler-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.739502 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b13865e1-4812-4d07-8b31-e488ab164399-config\") pod \"dnsmasq-dns-5784cf869f-tfvvh\" (UID: \"b13865e1-4812-4d07-8b31-e488ab164399\") " pod="openstack/dnsmasq-dns-5784cf869f-tfvvh" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.739644 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d60573d4-e919-482c-aa2d-a46770b6c0ca-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d60573d4-e919-482c-aa2d-a46770b6c0ca\") " pod="openstack/cinder-scheduler-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.742002 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.747127 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.755336 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d60573d4-e919-482c-aa2d-a46770b6c0ca-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d60573d4-e919-482c-aa2d-a46770b6c0ca\") " pod="openstack/cinder-scheduler-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.755778 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d60573d4-e919-482c-aa2d-a46770b6c0ca-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d60573d4-e919-482c-aa2d-a46770b6c0ca\") " pod="openstack/cinder-scheduler-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.756016 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d60573d4-e919-482c-aa2d-a46770b6c0ca-config-data\") pod \"cinder-scheduler-0\" (UID: \"d60573d4-e919-482c-aa2d-a46770b6c0ca\") " pod="openstack/cinder-scheduler-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.762624 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5md2\" (UniqueName: \"kubernetes.io/projected/d60573d4-e919-482c-aa2d-a46770b6c0ca-kube-api-access-k5md2\") pod \"cinder-scheduler-0\" (UID: \"d60573d4-e919-482c-aa2d-a46770b6c0ca\") " pod="openstack/cinder-scheduler-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.764561 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d60573d4-e919-482c-aa2d-a46770b6c0ca-scripts\") pod \"cinder-scheduler-0\" (UID: \"d60573d4-e919-482c-aa2d-a46770b6c0ca\") " 
pod="openstack/cinder-scheduler-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.770297 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.841046 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c21e15c2-4676-4335-bf2a-6cbbe6052a91-config-data\") pod \"cinder-api-0\" (UID: \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\") " pod="openstack/cinder-api-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.841101 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c21e15c2-4676-4335-bf2a-6cbbe6052a91-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\") " pod="openstack/cinder-api-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.841133 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kbwp\" (UniqueName: \"kubernetes.io/projected/b13865e1-4812-4d07-8b31-e488ab164399-kube-api-access-5kbwp\") pod \"dnsmasq-dns-5784cf869f-tfvvh\" (UID: \"b13865e1-4812-4d07-8b31-e488ab164399\") " pod="openstack/dnsmasq-dns-5784cf869f-tfvvh" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.841182 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c21e15c2-4676-4335-bf2a-6cbbe6052a91-logs\") pod \"cinder-api-0\" (UID: \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\") " pod="openstack/cinder-api-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.841219 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b13865e1-4812-4d07-8b31-e488ab164399-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-tfvvh\" (UID: 
\"b13865e1-4812-4d07-8b31-e488ab164399\") " pod="openstack/dnsmasq-dns-5784cf869f-tfvvh" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.841243 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c21e15c2-4676-4335-bf2a-6cbbe6052a91-scripts\") pod \"cinder-api-0\" (UID: \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\") " pod="openstack/cinder-api-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.841290 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b13865e1-4812-4d07-8b31-e488ab164399-config\") pod \"dnsmasq-dns-5784cf869f-tfvvh\" (UID: \"b13865e1-4812-4d07-8b31-e488ab164399\") " pod="openstack/dnsmasq-dns-5784cf869f-tfvvh" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.841316 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b13865e1-4812-4d07-8b31-e488ab164399-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-tfvvh\" (UID: \"b13865e1-4812-4d07-8b31-e488ab164399\") " pod="openstack/dnsmasq-dns-5784cf869f-tfvvh" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.841337 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rtqx\" (UniqueName: \"kubernetes.io/projected/c21e15c2-4676-4335-bf2a-6cbbe6052a91-kube-api-access-7rtqx\") pod \"cinder-api-0\" (UID: \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\") " pod="openstack/cinder-api-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.841364 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b13865e1-4812-4d07-8b31-e488ab164399-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-tfvvh\" (UID: \"b13865e1-4812-4d07-8b31-e488ab164399\") " pod="openstack/dnsmasq-dns-5784cf869f-tfvvh" 
Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.841392 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c21e15c2-4676-4335-bf2a-6cbbe6052a91-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\") " pod="openstack/cinder-api-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.841440 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b13865e1-4812-4d07-8b31-e488ab164399-dns-svc\") pod \"dnsmasq-dns-5784cf869f-tfvvh\" (UID: \"b13865e1-4812-4d07-8b31-e488ab164399\") " pod="openstack/dnsmasq-dns-5784cf869f-tfvvh" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.841488 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c21e15c2-4676-4335-bf2a-6cbbe6052a91-config-data-custom\") pod \"cinder-api-0\" (UID: \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\") " pod="openstack/cinder-api-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.842683 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b13865e1-4812-4d07-8b31-e488ab164399-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-tfvvh\" (UID: \"b13865e1-4812-4d07-8b31-e488ab164399\") " pod="openstack/dnsmasq-dns-5784cf869f-tfvvh" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.843264 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b13865e1-4812-4d07-8b31-e488ab164399-config\") pod \"dnsmasq-dns-5784cf869f-tfvvh\" (UID: \"b13865e1-4812-4d07-8b31-e488ab164399\") " pod="openstack/dnsmasq-dns-5784cf869f-tfvvh" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.843764 4739 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b13865e1-4812-4d07-8b31-e488ab164399-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-tfvvh\" (UID: \"b13865e1-4812-4d07-8b31-e488ab164399\") " pod="openstack/dnsmasq-dns-5784cf869f-tfvvh" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.844898 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b13865e1-4812-4d07-8b31-e488ab164399-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-tfvvh\" (UID: \"b13865e1-4812-4d07-8b31-e488ab164399\") " pod="openstack/dnsmasq-dns-5784cf869f-tfvvh" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.845417 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b13865e1-4812-4d07-8b31-e488ab164399-dns-svc\") pod \"dnsmasq-dns-5784cf869f-tfvvh\" (UID: \"b13865e1-4812-4d07-8b31-e488ab164399\") " pod="openstack/dnsmasq-dns-5784cf869f-tfvvh" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.863118 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kbwp\" (UniqueName: \"kubernetes.io/projected/b13865e1-4812-4d07-8b31-e488ab164399-kube-api-access-5kbwp\") pod \"dnsmasq-dns-5784cf869f-tfvvh\" (UID: \"b13865e1-4812-4d07-8b31-e488ab164399\") " pod="openstack/dnsmasq-dns-5784cf869f-tfvvh" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.917761 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.943569 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rtqx\" (UniqueName: \"kubernetes.io/projected/c21e15c2-4676-4335-bf2a-6cbbe6052a91-kube-api-access-7rtqx\") pod \"cinder-api-0\" (UID: \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\") " pod="openstack/cinder-api-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.943653 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c21e15c2-4676-4335-bf2a-6cbbe6052a91-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\") " pod="openstack/cinder-api-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.943727 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c21e15c2-4676-4335-bf2a-6cbbe6052a91-config-data-custom\") pod \"cinder-api-0\" (UID: \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\") " pod="openstack/cinder-api-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.943788 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c21e15c2-4676-4335-bf2a-6cbbe6052a91-config-data\") pod \"cinder-api-0\" (UID: \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\") " pod="openstack/cinder-api-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.943823 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c21e15c2-4676-4335-bf2a-6cbbe6052a91-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\") " pod="openstack/cinder-api-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.943851 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c21e15c2-4676-4335-bf2a-6cbbe6052a91-logs\") pod \"cinder-api-0\" (UID: \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\") " pod="openstack/cinder-api-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.943887 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c21e15c2-4676-4335-bf2a-6cbbe6052a91-scripts\") pod \"cinder-api-0\" (UID: \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\") " pod="openstack/cinder-api-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.944441 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c21e15c2-4676-4335-bf2a-6cbbe6052a91-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\") " pod="openstack/cinder-api-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.947543 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c21e15c2-4676-4335-bf2a-6cbbe6052a91-logs\") pod \"cinder-api-0\" (UID: \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\") " pod="openstack/cinder-api-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.949707 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c21e15c2-4676-4335-bf2a-6cbbe6052a91-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\") " pod="openstack/cinder-api-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.955196 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c21e15c2-4676-4335-bf2a-6cbbe6052a91-config-data\") pod \"cinder-api-0\" (UID: \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\") " pod="openstack/cinder-api-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.955231 4739 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c21e15c2-4676-4335-bf2a-6cbbe6052a91-config-data-custom\") pod \"cinder-api-0\" (UID: \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\") " pod="openstack/cinder-api-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.956593 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c21e15c2-4676-4335-bf2a-6cbbe6052a91-scripts\") pod \"cinder-api-0\" (UID: \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\") " pod="openstack/cinder-api-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.990888 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rtqx\" (UniqueName: \"kubernetes.io/projected/c21e15c2-4676-4335-bf2a-6cbbe6052a91-kube-api-access-7rtqx\") pod \"cinder-api-0\" (UID: \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\") " pod="openstack/cinder-api-0" Oct 08 22:08:56 crc kubenswrapper[4739]: I1008 22:08:56.995757 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-tfvvh" Oct 08 22:08:57 crc kubenswrapper[4739]: I1008 22:08:57.130802 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 08 22:08:57 crc kubenswrapper[4739]: I1008 22:08:57.175515 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:08:57 crc kubenswrapper[4739]: I1008 22:08:57.175826 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2" containerName="ceilometer-central-agent" containerID="cri-o://980b204f8c3826b3a9a0d169e801d9838a95431689b81f9b4dafb886464eed2d" gracePeriod=30 Oct 08 22:08:57 crc kubenswrapper[4739]: I1008 22:08:57.176329 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2" containerName="proxy-httpd" containerID="cri-o://985befba2e91f8556a1510eb838d925f9b4f9eadc61516d694d9259539f32e9f" gracePeriod=30 Oct 08 22:08:57 crc kubenswrapper[4739]: I1008 22:08:57.176395 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2" containerName="sg-core" containerID="cri-o://bfe77dbb7323c53ba09c65c651c9fbda4042f4b7dd0dfea3371be5cf11896d43" gracePeriod=30 Oct 08 22:08:57 crc kubenswrapper[4739]: I1008 22:08:57.176449 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2" containerName="ceilometer-notification-agent" containerID="cri-o://d479005bfb8a5144571bd125be4d03c5cbdb3a9203eb942815e4c447829826d5" gracePeriod=30 Oct 08 22:08:57 crc kubenswrapper[4739]: I1008 22:08:57.231847 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" event={"ID":"04e3fccb-ef13-4d04-9310-e1aec36adefe","Type":"ContainerStarted","Data":"06b03aa7a4af0939484aa8c77926016253c5c2c3e842e1984e7220dff0dab8f5"} Oct 08 22:08:57 crc kubenswrapper[4739]: I1008 22:08:57.231892 4739 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" event={"ID":"04e3fccb-ef13-4d04-9310-e1aec36adefe","Type":"ContainerStarted","Data":"ddf0b4d5b1064c62ac06c77bf6587969236a821c0c281e807360eff168a06048"} Oct 08 22:08:57 crc kubenswrapper[4739]: I1008 22:08:57.233337 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" Oct 08 22:08:57 crc kubenswrapper[4739]: I1008 22:08:57.233375 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" Oct 08 22:08:57 crc kubenswrapper[4739]: I1008 22:08:57.548891 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" podStartSLOduration=3.548866736 podStartE2EDuration="3.548866736s" podCreationTimestamp="2025-10-08 22:08:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:08:57.27207097 +0000 UTC m=+1237.097456720" watchObservedRunningTime="2025-10-08 22:08:57.548866736 +0000 UTC m=+1237.374252486" Oct 08 22:08:57 crc kubenswrapper[4739]: I1008 22:08:57.552564 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 22:08:57 crc kubenswrapper[4739]: I1008 22:08:57.699226 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-tfvvh"] Oct 08 22:08:57 crc kubenswrapper[4739]: W1008 22:08:57.721573 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb13865e1_4812_4d07_8b31_e488ab164399.slice/crio-3992a95efa53abdb0a55ea796466ff860b076db04b890e459acc53d700cd762e WatchSource:0}: Error finding container 3992a95efa53abdb0a55ea796466ff860b076db04b890e459acc53d700cd762e: Status 404 returned error can't find the container with id 
3992a95efa53abdb0a55ea796466ff860b076db04b890e459acc53d700cd762e Oct 08 22:08:57 crc kubenswrapper[4739]: W1008 22:08:57.842495 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc21e15c2_4676_4335_bf2a_6cbbe6052a91.slice/crio-f28ec2d71b74251e05c74b05a8ebd56d8928f50dc0ff6c0a54e8ada1505e966b WatchSource:0}: Error finding container f28ec2d71b74251e05c74b05a8ebd56d8928f50dc0ff6c0a54e8ada1505e966b: Status 404 returned error can't find the container with id f28ec2d71b74251e05c74b05a8ebd56d8928f50dc0ff6c0a54e8ada1505e966b Oct 08 22:08:57 crc kubenswrapper[4739]: I1008 22:08:57.861652 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 08 22:08:58 crc kubenswrapper[4739]: I1008 22:08:58.251640 4739 generic.go:334] "Generic (PLEG): container finished" podID="b13865e1-4812-4d07-8b31-e488ab164399" containerID="360281c6c8de55f2175d23d9e89fd27ac51a8066ce237b9fded99cf9adcfd425" exitCode=0 Oct 08 22:08:58 crc kubenswrapper[4739]: I1008 22:08:58.252238 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-tfvvh" event={"ID":"b13865e1-4812-4d07-8b31-e488ab164399","Type":"ContainerDied","Data":"360281c6c8de55f2175d23d9e89fd27ac51a8066ce237b9fded99cf9adcfd425"} Oct 08 22:08:58 crc kubenswrapper[4739]: I1008 22:08:58.252273 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-tfvvh" event={"ID":"b13865e1-4812-4d07-8b31-e488ab164399","Type":"ContainerStarted","Data":"3992a95efa53abdb0a55ea796466ff860b076db04b890e459acc53d700cd762e"} Oct 08 22:08:58 crc kubenswrapper[4739]: I1008 22:08:58.259224 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d60573d4-e919-482c-aa2d-a46770b6c0ca","Type":"ContainerStarted","Data":"1baf9d660b576839912d35ac02ea2da9dabaccfaba258148211ef06958274904"} Oct 08 22:08:58 crc kubenswrapper[4739]: I1008 
22:08:58.262284 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c21e15c2-4676-4335-bf2a-6cbbe6052a91","Type":"ContainerStarted","Data":"f28ec2d71b74251e05c74b05a8ebd56d8928f50dc0ff6c0a54e8ada1505e966b"} Oct 08 22:08:58 crc kubenswrapper[4739]: I1008 22:08:58.266593 4739 generic.go:334] "Generic (PLEG): container finished" podID="a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2" containerID="985befba2e91f8556a1510eb838d925f9b4f9eadc61516d694d9259539f32e9f" exitCode=0 Oct 08 22:08:58 crc kubenswrapper[4739]: I1008 22:08:58.266622 4739 generic.go:334] "Generic (PLEG): container finished" podID="a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2" containerID="bfe77dbb7323c53ba09c65c651c9fbda4042f4b7dd0dfea3371be5cf11896d43" exitCode=2 Oct 08 22:08:58 crc kubenswrapper[4739]: I1008 22:08:58.266630 4739 generic.go:334] "Generic (PLEG): container finished" podID="a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2" containerID="980b204f8c3826b3a9a0d169e801d9838a95431689b81f9b4dafb886464eed2d" exitCode=0 Oct 08 22:08:58 crc kubenswrapper[4739]: I1008 22:08:58.267573 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2","Type":"ContainerDied","Data":"985befba2e91f8556a1510eb838d925f9b4f9eadc61516d694d9259539f32e9f"} Oct 08 22:08:58 crc kubenswrapper[4739]: I1008 22:08:58.267637 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2","Type":"ContainerDied","Data":"bfe77dbb7323c53ba09c65c651c9fbda4042f4b7dd0dfea3371be5cf11896d43"} Oct 08 22:08:58 crc kubenswrapper[4739]: I1008 22:08:58.267648 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2","Type":"ContainerDied","Data":"980b204f8c3826b3a9a0d169e801d9838a95431689b81f9b4dafb886464eed2d"} Oct 08 22:08:58 crc kubenswrapper[4739]: I1008 22:08:58.934220 4739 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 08 22:09:00 crc kubenswrapper[4739]: I1008 22:09:00.293647 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-tfvvh" event={"ID":"b13865e1-4812-4d07-8b31-e488ab164399","Type":"ContainerStarted","Data":"4c42d0d8942aac862acb37940099ee81c043d7186ac7a77dcde4379b187bc350"} Oct 08 22:09:00 crc kubenswrapper[4739]: I1008 22:09:00.294073 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-tfvvh" Oct 08 22:09:00 crc kubenswrapper[4739]: I1008 22:09:00.301927 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c21e15c2-4676-4335-bf2a-6cbbe6052a91","Type":"ContainerStarted","Data":"d48d264678c3a9a47d63ab56cea47c1f76bd9ebec0ee3e93887ea6f280a7e79c"} Oct 08 22:09:00 crc kubenswrapper[4739]: I1008 22:09:00.328064 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-tfvvh" podStartSLOduration=4.328032081 podStartE2EDuration="4.328032081s" podCreationTimestamp="2025-10-08 22:08:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:09:00.314544359 +0000 UTC m=+1240.139930109" watchObservedRunningTime="2025-10-08 22:09:00.328032081 +0000 UTC m=+1240.153417831" Oct 08 22:09:01 crc kubenswrapper[4739]: E1008 22:09:01.255916 4739 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda73b1ac2_0e17_4f62_8e4e_e6c41517e2e2.slice/crio-conmon-d479005bfb8a5144571bd125be4d03c5cbdb3a9203eb942815e4c447829826d5.scope\": RecentStats: unable to find data in memory cache]" Oct 08 22:09:01 crc kubenswrapper[4739]: I1008 22:09:01.349725 4739 generic.go:334] "Generic (PLEG): container finished" 
podID="a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2" containerID="d479005bfb8a5144571bd125be4d03c5cbdb3a9203eb942815e4c447829826d5" exitCode=0 Oct 08 22:09:01 crc kubenswrapper[4739]: I1008 22:09:01.349789 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2","Type":"ContainerDied","Data":"d479005bfb8a5144571bd125be4d03c5cbdb3a9203eb942815e4c447829826d5"} Oct 08 22:09:01 crc kubenswrapper[4739]: I1008 22:09:01.353964 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c21e15c2-4676-4335-bf2a-6cbbe6052a91","Type":"ContainerStarted","Data":"ceb53030d70d7541abfde236a84670e18b10f89cd23a534a1baedf48373a1bc5"} Oct 08 22:09:01 crc kubenswrapper[4739]: I1008 22:09:01.354021 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="c21e15c2-4676-4335-bf2a-6cbbe6052a91" containerName="cinder-api-log" containerID="cri-o://d48d264678c3a9a47d63ab56cea47c1f76bd9ebec0ee3e93887ea6f280a7e79c" gracePeriod=30 Oct 08 22:09:01 crc kubenswrapper[4739]: I1008 22:09:01.354227 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="c21e15c2-4676-4335-bf2a-6cbbe6052a91" containerName="cinder-api" containerID="cri-o://ceb53030d70d7541abfde236a84670e18b10f89cd23a534a1baedf48373a1bc5" gracePeriod=30 Oct 08 22:09:01 crc kubenswrapper[4739]: I1008 22:09:01.354346 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 08 22:09:01 crc kubenswrapper[4739]: I1008 22:09:01.371595 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.371573428 podStartE2EDuration="5.371573428s" podCreationTimestamp="2025-10-08 22:08:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 
22:09:01.369885906 +0000 UTC m=+1241.195271656" watchObservedRunningTime="2025-10-08 22:09:01.371573428 +0000 UTC m=+1241.196959178" Oct 08 22:09:01 crc kubenswrapper[4739]: I1008 22:09:01.980413 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.060962 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.117803 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-config-data\") pod \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\" (UID: \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\") " Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.117866 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t62c\" (UniqueName: \"kubernetes.io/projected/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-kube-api-access-4t62c\") pod \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\" (UID: \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\") " Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.117914 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-combined-ca-bundle\") pod \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\" (UID: \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\") " Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.117967 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c21e15c2-4676-4335-bf2a-6cbbe6052a91-etc-machine-id\") pod \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\" (UID: \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\") " Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.117992 4739 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-scripts\") pod \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\" (UID: \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\") " Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.118031 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-run-httpd\") pod \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\" (UID: \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\") " Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.118092 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rtqx\" (UniqueName: \"kubernetes.io/projected/c21e15c2-4676-4335-bf2a-6cbbe6052a91-kube-api-access-7rtqx\") pod \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\" (UID: \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\") " Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.118116 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c21e15c2-4676-4335-bf2a-6cbbe6052a91-config-data\") pod \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\" (UID: \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\") " Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.118136 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c21e15c2-4676-4335-bf2a-6cbbe6052a91-logs\") pod \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\" (UID: \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\") " Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.118173 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c21e15c2-4676-4335-bf2a-6cbbe6052a91-config-data-custom\") pod \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\" (UID: 
\"c21e15c2-4676-4335-bf2a-6cbbe6052a91\") " Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.118201 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-log-httpd\") pod \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\" (UID: \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\") " Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.118215 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c21e15c2-4676-4335-bf2a-6cbbe6052a91-combined-ca-bundle\") pod \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\" (UID: \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\") " Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.118286 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c21e15c2-4676-4335-bf2a-6cbbe6052a91-scripts\") pod \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\" (UID: \"c21e15c2-4676-4335-bf2a-6cbbe6052a91\") " Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.118303 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-sg-core-conf-yaml\") pod \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\" (UID: \"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2\") " Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.124222 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c21e15c2-4676-4335-bf2a-6cbbe6052a91-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c21e15c2-4676-4335-bf2a-6cbbe6052a91" (UID: "c21e15c2-4676-4335-bf2a-6cbbe6052a91"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.124568 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2" (UID: "a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.125445 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c21e15c2-4676-4335-bf2a-6cbbe6052a91-kube-api-access-7rtqx" (OuterVolumeSpecName: "kube-api-access-7rtqx") pod "c21e15c2-4676-4335-bf2a-6cbbe6052a91" (UID: "c21e15c2-4676-4335-bf2a-6cbbe6052a91"). InnerVolumeSpecName "kube-api-access-7rtqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.127602 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c21e15c2-4676-4335-bf2a-6cbbe6052a91-logs" (OuterVolumeSpecName: "logs") pod "c21e15c2-4676-4335-bf2a-6cbbe6052a91" (UID: "c21e15c2-4676-4335-bf2a-6cbbe6052a91"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.127685 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2" (UID: "a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.130833 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c21e15c2-4676-4335-bf2a-6cbbe6052a91-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c21e15c2-4676-4335-bf2a-6cbbe6052a91" (UID: "c21e15c2-4676-4335-bf2a-6cbbe6052a91"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.132668 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-scripts" (OuterVolumeSpecName: "scripts") pod "a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2" (UID: "a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.153471 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c21e15c2-4676-4335-bf2a-6cbbe6052a91-scripts" (OuterVolumeSpecName: "scripts") pod "c21e15c2-4676-4335-bf2a-6cbbe6052a91" (UID: "c21e15c2-4676-4335-bf2a-6cbbe6052a91"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.153628 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-kube-api-access-4t62c" (OuterVolumeSpecName: "kube-api-access-4t62c") pod "a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2" (UID: "a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2"). InnerVolumeSpecName "kube-api-access-4t62c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.221340 4739 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c21e15c2-4676-4335-bf2a-6cbbe6052a91-logs\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.221371 4739 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c21e15c2-4676-4335-bf2a-6cbbe6052a91-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.221381 4739 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.221389 4739 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c21e15c2-4676-4335-bf2a-6cbbe6052a91-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.221397 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t62c\" (UniqueName: \"kubernetes.io/projected/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-kube-api-access-4t62c\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.221405 4739 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c21e15c2-4676-4335-bf2a-6cbbe6052a91-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.221414 4739 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.221422 4739 reconciler_common.go:293] "Volume detached 
for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.221430 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rtqx\" (UniqueName: \"kubernetes.io/projected/c21e15c2-4676-4335-bf2a-6cbbe6052a91-kube-api-access-7rtqx\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.249526 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c21e15c2-4676-4335-bf2a-6cbbe6052a91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c21e15c2-4676-4335-bf2a-6cbbe6052a91" (UID: "c21e15c2-4676-4335-bf2a-6cbbe6052a91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.275384 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2" (UID: "a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.308311 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c21e15c2-4676-4335-bf2a-6cbbe6052a91-config-data" (OuterVolumeSpecName: "config-data") pod "c21e15c2-4676-4335-bf2a-6cbbe6052a91" (UID: "c21e15c2-4676-4335-bf2a-6cbbe6052a91"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.319806 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2" (UID: "a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.323261 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c21e15c2-4676-4335-bf2a-6cbbe6052a91-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.323289 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c21e15c2-4676-4335-bf2a-6cbbe6052a91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.323301 4739 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.323314 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.336316 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-config-data" (OuterVolumeSpecName: "config-data") pod "a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2" (UID: "a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.426335 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.427441 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2","Type":"ContainerDied","Data":"36dc3c4aceab17e0609bf4c8ff4cd6224adcbd3da6c07d3d16a10530af314aaa"} Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.427481 4739 scope.go:117] "RemoveContainer" containerID="985befba2e91f8556a1510eb838d925f9b4f9eadc61516d694d9259539f32e9f" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.427630 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.476340 4739 generic.go:334] "Generic (PLEG): container finished" podID="c21e15c2-4676-4335-bf2a-6cbbe6052a91" containerID="ceb53030d70d7541abfde236a84670e18b10f89cd23a534a1baedf48373a1bc5" exitCode=0 Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.476599 4739 generic.go:334] "Generic (PLEG): container finished" podID="c21e15c2-4676-4335-bf2a-6cbbe6052a91" containerID="d48d264678c3a9a47d63ab56cea47c1f76bd9ebec0ee3e93887ea6f280a7e79c" exitCode=143 Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.476597 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c21e15c2-4676-4335-bf2a-6cbbe6052a91","Type":"ContainerDied","Data":"ceb53030d70d7541abfde236a84670e18b10f89cd23a534a1baedf48373a1bc5"} Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.476742 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"c21e15c2-4676-4335-bf2a-6cbbe6052a91","Type":"ContainerDied","Data":"d48d264678c3a9a47d63ab56cea47c1f76bd9ebec0ee3e93887ea6f280a7e79c"} Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.476801 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c21e15c2-4676-4335-bf2a-6cbbe6052a91","Type":"ContainerDied","Data":"f28ec2d71b74251e05c74b05a8ebd56d8928f50dc0ff6c0a54e8ada1505e966b"} Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.476582 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.497198 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.503729 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.544319 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:09:02 crc kubenswrapper[4739]: E1008 22:09:02.544895 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2" containerName="sg-core" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.544989 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2" containerName="sg-core" Oct 08 22:09:02 crc kubenswrapper[4739]: E1008 22:09:02.545054 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2" containerName="ceilometer-notification-agent" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.545102 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2" containerName="ceilometer-notification-agent" Oct 08 22:09:02 crc kubenswrapper[4739]: E1008 22:09:02.545185 4739 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c21e15c2-4676-4335-bf2a-6cbbe6052a91" containerName="cinder-api-log" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.545248 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="c21e15c2-4676-4335-bf2a-6cbbe6052a91" containerName="cinder-api-log" Oct 08 22:09:02 crc kubenswrapper[4739]: E1008 22:09:02.545303 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c21e15c2-4676-4335-bf2a-6cbbe6052a91" containerName="cinder-api" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.545351 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="c21e15c2-4676-4335-bf2a-6cbbe6052a91" containerName="cinder-api" Oct 08 22:09:02 crc kubenswrapper[4739]: E1008 22:09:02.545408 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2" containerName="proxy-httpd" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.545480 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2" containerName="proxy-httpd" Oct 08 22:09:02 crc kubenswrapper[4739]: E1008 22:09:02.545539 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2" containerName="ceilometer-central-agent" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.545587 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2" containerName="ceilometer-central-agent" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.545806 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2" containerName="ceilometer-notification-agent" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.545914 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2" containerName="sg-core" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.545971 4739 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2" containerName="proxy-httpd" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.546024 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="c21e15c2-4676-4335-bf2a-6cbbe6052a91" containerName="cinder-api-log" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.546082 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="c21e15c2-4676-4335-bf2a-6cbbe6052a91" containerName="cinder-api" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.546175 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2" containerName="ceilometer-central-agent" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.552267 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.567911 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.568265 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.578457 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.631655 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ae8b784-8021-4f1d-9b31-d65dce42b007-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2ae8b784-8021-4f1d-9b31-d65dce42b007\") " pod="openstack/ceilometer-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.631714 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ae8b784-8021-4f1d-9b31-d65dce42b007-scripts\") pod \"ceilometer-0\" (UID: 
\"2ae8b784-8021-4f1d-9b31-d65dce42b007\") " pod="openstack/ceilometer-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.631741 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfvdl\" (UniqueName: \"kubernetes.io/projected/2ae8b784-8021-4f1d-9b31-d65dce42b007-kube-api-access-gfvdl\") pod \"ceilometer-0\" (UID: \"2ae8b784-8021-4f1d-9b31-d65dce42b007\") " pod="openstack/ceilometer-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.631772 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ae8b784-8021-4f1d-9b31-d65dce42b007-config-data\") pod \"ceilometer-0\" (UID: \"2ae8b784-8021-4f1d-9b31-d65dce42b007\") " pod="openstack/ceilometer-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.631790 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ae8b784-8021-4f1d-9b31-d65dce42b007-log-httpd\") pod \"ceilometer-0\" (UID: \"2ae8b784-8021-4f1d-9b31-d65dce42b007\") " pod="openstack/ceilometer-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.631878 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ae8b784-8021-4f1d-9b31-d65dce42b007-run-httpd\") pod \"ceilometer-0\" (UID: \"2ae8b784-8021-4f1d-9b31-d65dce42b007\") " pod="openstack/ceilometer-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.631906 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ae8b784-8021-4f1d-9b31-d65dce42b007-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2ae8b784-8021-4f1d-9b31-d65dce42b007\") " pod="openstack/ceilometer-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 
22:09:02.641342 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.646550 4739 scope.go:117] "RemoveContainer" containerID="bfe77dbb7323c53ba09c65c651c9fbda4042f4b7dd0dfea3371be5cf11896d43" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.663455 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.672961 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.681582 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.681713 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.685541 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.685757 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.685862 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.688175 4739 scope.go:117] "RemoveContainer" containerID="d479005bfb8a5144571bd125be4d03c5cbdb3a9203eb942815e4c447829826d5" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.726525 4739 scope.go:117] "RemoveContainer" containerID="980b204f8c3826b3a9a0d169e801d9838a95431689b81f9b4dafb886464eed2d" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.737770 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9-public-tls-certs\") pod \"cinder-api-0\" (UID: \"60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9\") " pod="openstack/cinder-api-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.737821 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9\") " pod="openstack/cinder-api-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.737842 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9-config-data\") pod \"cinder-api-0\" (UID: \"60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9\") " pod="openstack/cinder-api-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.737860 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9-logs\") pod \"cinder-api-0\" (UID: \"60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9\") " pod="openstack/cinder-api-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.737972 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9\") " pod="openstack/cinder-api-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.738219 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7wbg\" (UniqueName: \"kubernetes.io/projected/60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9-kube-api-access-b7wbg\") pod \"cinder-api-0\" (UID: 
\"60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9\") " pod="openstack/cinder-api-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.738284 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ae8b784-8021-4f1d-9b31-d65dce42b007-run-httpd\") pod \"ceilometer-0\" (UID: \"2ae8b784-8021-4f1d-9b31-d65dce42b007\") " pod="openstack/ceilometer-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.738327 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ae8b784-8021-4f1d-9b31-d65dce42b007-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2ae8b784-8021-4f1d-9b31-d65dce42b007\") " pod="openstack/ceilometer-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.738718 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ae8b784-8021-4f1d-9b31-d65dce42b007-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2ae8b784-8021-4f1d-9b31-d65dce42b007\") " pod="openstack/ceilometer-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.738773 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9-config-data-custom\") pod \"cinder-api-0\" (UID: \"60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9\") " pod="openstack/cinder-api-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.738919 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ae8b784-8021-4f1d-9b31-d65dce42b007-run-httpd\") pod \"ceilometer-0\" (UID: \"2ae8b784-8021-4f1d-9b31-d65dce42b007\") " pod="openstack/ceilometer-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.739293 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ae8b784-8021-4f1d-9b31-d65dce42b007-scripts\") pod \"ceilometer-0\" (UID: \"2ae8b784-8021-4f1d-9b31-d65dce42b007\") " pod="openstack/ceilometer-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.739428 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9\") " pod="openstack/cinder-api-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.739773 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfvdl\" (UniqueName: \"kubernetes.io/projected/2ae8b784-8021-4f1d-9b31-d65dce42b007-kube-api-access-gfvdl\") pod \"ceilometer-0\" (UID: \"2ae8b784-8021-4f1d-9b31-d65dce42b007\") " pod="openstack/ceilometer-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.739886 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ae8b784-8021-4f1d-9b31-d65dce42b007-config-data\") pod \"ceilometer-0\" (UID: \"2ae8b784-8021-4f1d-9b31-d65dce42b007\") " pod="openstack/ceilometer-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.739906 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ae8b784-8021-4f1d-9b31-d65dce42b007-log-httpd\") pod \"ceilometer-0\" (UID: \"2ae8b784-8021-4f1d-9b31-d65dce42b007\") " pod="openstack/ceilometer-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.741379 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9-scripts\") pod \"cinder-api-0\" (UID: \"60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9\") " 
pod="openstack/cinder-api-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.741840 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ae8b784-8021-4f1d-9b31-d65dce42b007-log-httpd\") pod \"ceilometer-0\" (UID: \"2ae8b784-8021-4f1d-9b31-d65dce42b007\") " pod="openstack/ceilometer-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.744570 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ae8b784-8021-4f1d-9b31-d65dce42b007-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2ae8b784-8021-4f1d-9b31-d65dce42b007\") " pod="openstack/ceilometer-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.744776 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ae8b784-8021-4f1d-9b31-d65dce42b007-scripts\") pod \"ceilometer-0\" (UID: \"2ae8b784-8021-4f1d-9b31-d65dce42b007\") " pod="openstack/ceilometer-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.745828 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ae8b784-8021-4f1d-9b31-d65dce42b007-config-data\") pod \"ceilometer-0\" (UID: \"2ae8b784-8021-4f1d-9b31-d65dce42b007\") " pod="openstack/ceilometer-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.747988 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ae8b784-8021-4f1d-9b31-d65dce42b007-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2ae8b784-8021-4f1d-9b31-d65dce42b007\") " pod="openstack/ceilometer-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.757022 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfvdl\" (UniqueName: 
\"kubernetes.io/projected/2ae8b784-8021-4f1d-9b31-d65dce42b007-kube-api-access-gfvdl\") pod \"ceilometer-0\" (UID: \"2ae8b784-8021-4f1d-9b31-d65dce42b007\") " pod="openstack/ceilometer-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.841886 4739 scope.go:117] "RemoveContainer" containerID="ceb53030d70d7541abfde236a84670e18b10f89cd23a534a1baedf48373a1bc5" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.843813 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9-config-data-custom\") pod \"cinder-api-0\" (UID: \"60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9\") " pod="openstack/cinder-api-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.843885 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9\") " pod="openstack/cinder-api-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.843928 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9-scripts\") pod \"cinder-api-0\" (UID: \"60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9\") " pod="openstack/cinder-api-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.843958 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9-public-tls-certs\") pod \"cinder-api-0\" (UID: \"60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9\") " pod="openstack/cinder-api-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.843979 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9\") " pod="openstack/cinder-api-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.843995 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9-config-data\") pod \"cinder-api-0\" (UID: \"60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9\") " pod="openstack/cinder-api-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.844013 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9-logs\") pod \"cinder-api-0\" (UID: \"60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9\") " pod="openstack/cinder-api-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.844046 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9\") " pod="openstack/cinder-api-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.844081 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7wbg\" (UniqueName: \"kubernetes.io/projected/60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9-kube-api-access-b7wbg\") pod \"cinder-api-0\" (UID: \"60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9\") " pod="openstack/cinder-api-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.844927 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9-logs\") pod \"cinder-api-0\" (UID: \"60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9\") " pod="openstack/cinder-api-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.845001 4739 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9\") " pod="openstack/cinder-api-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.848426 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9-public-tls-certs\") pod \"cinder-api-0\" (UID: \"60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9\") " pod="openstack/cinder-api-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.850445 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9-config-data\") pod \"cinder-api-0\" (UID: \"60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9\") " pod="openstack/cinder-api-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.850948 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9\") " pod="openstack/cinder-api-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.851168 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9\") " pod="openstack/cinder-api-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.852432 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9\") " pod="openstack/cinder-api-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.864791 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7wbg\" (UniqueName: \"kubernetes.io/projected/60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9-kube-api-access-b7wbg\") pod \"cinder-api-0\" (UID: \"60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9\") " pod="openstack/cinder-api-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.871806 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9-scripts\") pod \"cinder-api-0\" (UID: \"60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9\") " pod="openstack/cinder-api-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.904582 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.955910 4739 scope.go:117] "RemoveContainer" containerID="d48d264678c3a9a47d63ab56cea47c1f76bd9ebec0ee3e93887ea6f280a7e79c" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.988389 4739 scope.go:117] "RemoveContainer" containerID="ceb53030d70d7541abfde236a84670e18b10f89cd23a534a1baedf48373a1bc5" Oct 08 22:09:02 crc kubenswrapper[4739]: E1008 22:09:02.988899 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ceb53030d70d7541abfde236a84670e18b10f89cd23a534a1baedf48373a1bc5\": container with ID starting with ceb53030d70d7541abfde236a84670e18b10f89cd23a534a1baedf48373a1bc5 not found: ID does not exist" containerID="ceb53030d70d7541abfde236a84670e18b10f89cd23a534a1baedf48373a1bc5" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.988962 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceb53030d70d7541abfde236a84670e18b10f89cd23a534a1baedf48373a1bc5"} 
err="failed to get container status \"ceb53030d70d7541abfde236a84670e18b10f89cd23a534a1baedf48373a1bc5\": rpc error: code = NotFound desc = could not find container \"ceb53030d70d7541abfde236a84670e18b10f89cd23a534a1baedf48373a1bc5\": container with ID starting with ceb53030d70d7541abfde236a84670e18b10f89cd23a534a1baedf48373a1bc5 not found: ID does not exist" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.989007 4739 scope.go:117] "RemoveContainer" containerID="d48d264678c3a9a47d63ab56cea47c1f76bd9ebec0ee3e93887ea6f280a7e79c" Oct 08 22:09:02 crc kubenswrapper[4739]: E1008 22:09:02.990026 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d48d264678c3a9a47d63ab56cea47c1f76bd9ebec0ee3e93887ea6f280a7e79c\": container with ID starting with d48d264678c3a9a47d63ab56cea47c1f76bd9ebec0ee3e93887ea6f280a7e79c not found: ID does not exist" containerID="d48d264678c3a9a47d63ab56cea47c1f76bd9ebec0ee3e93887ea6f280a7e79c" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.990057 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d48d264678c3a9a47d63ab56cea47c1f76bd9ebec0ee3e93887ea6f280a7e79c"} err="failed to get container status \"d48d264678c3a9a47d63ab56cea47c1f76bd9ebec0ee3e93887ea6f280a7e79c\": rpc error: code = NotFound desc = could not find container \"d48d264678c3a9a47d63ab56cea47c1f76bd9ebec0ee3e93887ea6f280a7e79c\": container with ID starting with d48d264678c3a9a47d63ab56cea47c1f76bd9ebec0ee3e93887ea6f280a7e79c not found: ID does not exist" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.990081 4739 scope.go:117] "RemoveContainer" containerID="ceb53030d70d7541abfde236a84670e18b10f89cd23a534a1baedf48373a1bc5" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.990440 4739 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ceb53030d70d7541abfde236a84670e18b10f89cd23a534a1baedf48373a1bc5"} err="failed to get container status \"ceb53030d70d7541abfde236a84670e18b10f89cd23a534a1baedf48373a1bc5\": rpc error: code = NotFound desc = could not find container \"ceb53030d70d7541abfde236a84670e18b10f89cd23a534a1baedf48373a1bc5\": container with ID starting with ceb53030d70d7541abfde236a84670e18b10f89cd23a534a1baedf48373a1bc5 not found: ID does not exist" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.990475 4739 scope.go:117] "RemoveContainer" containerID="d48d264678c3a9a47d63ab56cea47c1f76bd9ebec0ee3e93887ea6f280a7e79c" Oct 08 22:09:02 crc kubenswrapper[4739]: I1008 22:09:02.990760 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d48d264678c3a9a47d63ab56cea47c1f76bd9ebec0ee3e93887ea6f280a7e79c"} err="failed to get container status \"d48d264678c3a9a47d63ab56cea47c1f76bd9ebec0ee3e93887ea6f280a7e79c\": rpc error: code = NotFound desc = could not find container \"d48d264678c3a9a47d63ab56cea47c1f76bd9ebec0ee3e93887ea6f280a7e79c\": container with ID starting with d48d264678c3a9a47d63ab56cea47c1f76bd9ebec0ee3e93887ea6f280a7e79c not found: ID does not exist" Oct 08 22:09:03 crc kubenswrapper[4739]: I1008 22:09:03.133107 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 08 22:09:03 crc kubenswrapper[4739]: I1008 22:09:03.417725 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:09:03 crc kubenswrapper[4739]: I1008 22:09:03.536881 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d60573d4-e919-482c-aa2d-a46770b6c0ca","Type":"ContainerStarted","Data":"1b009cfc40f04e17020a23db626ab476ad3f2e130a46a92daa95313c53d149a2"} Oct 08 22:09:03 crc kubenswrapper[4739]: I1008 22:09:03.544433 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ae8b784-8021-4f1d-9b31-d65dce42b007","Type":"ContainerStarted","Data":"f1cf660d6b01df0d9e0186a5a44ab48ae6f3d6033e66f0e4caeb930d9185c88b"} Oct 08 22:09:03 crc kubenswrapper[4739]: I1008 22:09:03.668369 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 08 22:09:03 crc kubenswrapper[4739]: I1008 22:09:03.835523 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2" path="/var/lib/kubelet/pods/a73b1ac2-0e17-4f62-8e4e-e6c41517e2e2/volumes" Oct 08 22:09:03 crc kubenswrapper[4739]: I1008 22:09:03.836874 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c21e15c2-4676-4335-bf2a-6cbbe6052a91" path="/var/lib/kubelet/pods/c21e15c2-4676-4335-bf2a-6cbbe6052a91/volumes" Oct 08 22:09:04 crc kubenswrapper[4739]: I1008 22:09:04.574116 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d60573d4-e919-482c-aa2d-a46770b6c0ca","Type":"ContainerStarted","Data":"485c6f3ec100fbef7b9e1a521ec37d35947837327d0ce9b6e81c8e2d67d5825f"} Oct 08 22:09:04 crc kubenswrapper[4739]: I1008 22:09:04.593009 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.638844135 podStartE2EDuration="8.592996714s" 
podCreationTimestamp="2025-10-08 22:08:56 +0000 UTC" firstStartedPulling="2025-10-08 22:08:57.580494045 +0000 UTC m=+1237.405879795" lastFinishedPulling="2025-10-08 22:09:01.534646624 +0000 UTC m=+1241.360032374" observedRunningTime="2025-10-08 22:09:04.5920546 +0000 UTC m=+1244.417440340" watchObservedRunningTime="2025-10-08 22:09:04.592996714 +0000 UTC m=+1244.418382464" Oct 08 22:09:05 crc kubenswrapper[4739]: I1008 22:09:05.257287 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" Oct 08 22:09:05 crc kubenswrapper[4739]: I1008 22:09:05.257495 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5c89ccbcd7-dlxrn" Oct 08 22:09:05 crc kubenswrapper[4739]: I1008 22:09:05.903946 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:09:06 crc kubenswrapper[4739]: I1008 22:09:06.918281 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 08 22:09:06 crc kubenswrapper[4739]: I1008 22:09:06.998344 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-tfvvh" Oct 08 22:09:07 crc kubenswrapper[4739]: I1008 22:09:07.060485 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-ds8pz"] Oct 08 22:09:07 crc kubenswrapper[4739]: I1008 22:09:07.060811 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz" podUID="439123d3-6874-4694-9790-f3ea65bde3a5" containerName="dnsmasq-dns" containerID="cri-o://8e7685c230eb43484652a01ab584bd7ea2118cffc45ce73191a19d84fc060670" gracePeriod=10 Oct 08 22:09:07 crc kubenswrapper[4739]: I1008 22:09:07.614744 4739 generic.go:334] "Generic (PLEG): container finished" podID="439123d3-6874-4694-9790-f3ea65bde3a5" 
containerID="8e7685c230eb43484652a01ab584bd7ea2118cffc45ce73191a19d84fc060670" exitCode=0 Oct 08 22:09:07 crc kubenswrapper[4739]: I1008 22:09:07.621130 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz" event={"ID":"439123d3-6874-4694-9790-f3ea65bde3a5","Type":"ContainerDied","Data":"8e7685c230eb43484652a01ab584bd7ea2118cffc45ce73191a19d84fc060670"} Oct 08 22:09:07 crc kubenswrapper[4739]: I1008 22:09:07.627076 4739 generic.go:334] "Generic (PLEG): container finished" podID="7fe20a8d-2c00-4441-92dd-f92da148433b" containerID="48066fd56d175a613e4a0fe33a8e50ae919b1b5b7746e611b87b000561c2bf00" exitCode=0 Oct 08 22:09:07 crc kubenswrapper[4739]: I1008 22:09:07.627125 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cf7957586-49vcm" event={"ID":"7fe20a8d-2c00-4441-92dd-f92da148433b","Type":"ContainerDied","Data":"48066fd56d175a613e4a0fe33a8e50ae919b1b5b7746e611b87b000561c2bf00"} Oct 08 22:09:09 crc kubenswrapper[4739]: I1008 22:09:09.335242 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 22:09:09 crc kubenswrapper[4739]: I1008 22:09:09.335721 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fc637b11-4e12-4a6a-a496-1a700d3756c1" containerName="glance-log" containerID="cri-o://092cf9954ec00578e37231da0fc0753387c81d6929ec054a8cc1bd606d0ea2f8" gracePeriod=30 Oct 08 22:09:09 crc kubenswrapper[4739]: I1008 22:09:09.335827 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fc637b11-4e12-4a6a-a496-1a700d3756c1" containerName="glance-httpd" containerID="cri-o://8d5ad7cd87074ce62fba14a4bb0a9a5fcb5831d2c3e87e69819f05c55ff8e2bc" gracePeriod=30 Oct 08 22:09:09 crc kubenswrapper[4739]: I1008 22:09:09.645522 4739 generic.go:334] "Generic (PLEG): container finished" 
podID="fc637b11-4e12-4a6a-a496-1a700d3756c1" containerID="092cf9954ec00578e37231da0fc0753387c81d6929ec054a8cc1bd606d0ea2f8" exitCode=143 Oct 08 22:09:09 crc kubenswrapper[4739]: I1008 22:09:09.645576 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc637b11-4e12-4a6a-a496-1a700d3756c1","Type":"ContainerDied","Data":"092cf9954ec00578e37231da0fc0753387c81d6929ec054a8cc1bd606d0ea2f8"} Oct 08 22:09:10 crc kubenswrapper[4739]: I1008 22:09:10.267486 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 22:09:10 crc kubenswrapper[4739]: I1008 22:09:10.267877 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d1240189-197f-4fc9-98a7-538ffdd522da" containerName="glance-httpd" containerID="cri-o://8752ddc99cd48cd577f2eb560e4f1a5194fef80d51dab2eeeb20d1ab2e4bc013" gracePeriod=30 Oct 08 22:09:10 crc kubenswrapper[4739]: I1008 22:09:10.268175 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d1240189-197f-4fc9-98a7-538ffdd522da" containerName="glance-log" containerID="cri-o://efac728265e15cbfbd6a69f1dea53010d867fff26f1210e7d9fe3880bd92d2da" gracePeriod=30 Oct 08 22:09:10 crc kubenswrapper[4739]: I1008 22:09:10.524248 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-5pbsk"] Oct 08 22:09:10 crc kubenswrapper[4739]: I1008 22:09:10.525663 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-5pbsk" Oct 08 22:09:10 crc kubenswrapper[4739]: I1008 22:09:10.543731 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-5pbsk"] Oct 08 22:09:10 crc kubenswrapper[4739]: I1008 22:09:10.619197 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-vx5hx"] Oct 08 22:09:10 crc kubenswrapper[4739]: I1008 22:09:10.621252 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vx5hx" Oct 08 22:09:10 crc kubenswrapper[4739]: I1008 22:09:10.637661 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s485d\" (UniqueName: \"kubernetes.io/projected/705882a0-d0a4-490f-9377-d9b379c5a9ea-kube-api-access-s485d\") pod \"nova-api-db-create-5pbsk\" (UID: \"705882a0-d0a4-490f-9377-d9b379c5a9ea\") " pod="openstack/nova-api-db-create-5pbsk" Oct 08 22:09:10 crc kubenswrapper[4739]: I1008 22:09:10.645374 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-vx5hx"] Oct 08 22:09:10 crc kubenswrapper[4739]: I1008 22:09:10.661480 4739 generic.go:334] "Generic (PLEG): container finished" podID="d1240189-197f-4fc9-98a7-538ffdd522da" containerID="efac728265e15cbfbd6a69f1dea53010d867fff26f1210e7d9fe3880bd92d2da" exitCode=143 Oct 08 22:09:10 crc kubenswrapper[4739]: I1008 22:09:10.661532 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d1240189-197f-4fc9-98a7-538ffdd522da","Type":"ContainerDied","Data":"efac728265e15cbfbd6a69f1dea53010d867fff26f1210e7d9fe3880bd92d2da"} Oct 08 22:09:10 crc kubenswrapper[4739]: I1008 22:09:10.720600 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-89kb6"] Oct 08 22:09:10 crc kubenswrapper[4739]: I1008 22:09:10.722001 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-89kb6" Oct 08 22:09:10 crc kubenswrapper[4739]: I1008 22:09:10.726124 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz" podUID="439123d3-6874-4694-9790-f3ea65bde3a5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.164:5353: connect: connection refused" Oct 08 22:09:10 crc kubenswrapper[4739]: I1008 22:09:10.734431 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-89kb6"] Oct 08 22:09:10 crc kubenswrapper[4739]: I1008 22:09:10.741337 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpn6c\" (UniqueName: \"kubernetes.io/projected/d67308b3-9336-448a-9399-7db66f43b5aa-kube-api-access-lpn6c\") pod \"nova-cell0-db-create-vx5hx\" (UID: \"d67308b3-9336-448a-9399-7db66f43b5aa\") " pod="openstack/nova-cell0-db-create-vx5hx" Oct 08 22:09:10 crc kubenswrapper[4739]: I1008 22:09:10.741510 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s485d\" (UniqueName: \"kubernetes.io/projected/705882a0-d0a4-490f-9377-d9b379c5a9ea-kube-api-access-s485d\") pod \"nova-api-db-create-5pbsk\" (UID: \"705882a0-d0a4-490f-9377-d9b379c5a9ea\") " pod="openstack/nova-api-db-create-5pbsk" Oct 08 22:09:10 crc kubenswrapper[4739]: I1008 22:09:10.780159 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s485d\" (UniqueName: \"kubernetes.io/projected/705882a0-d0a4-490f-9377-d9b379c5a9ea-kube-api-access-s485d\") pod \"nova-api-db-create-5pbsk\" (UID: \"705882a0-d0a4-490f-9377-d9b379c5a9ea\") " pod="openstack/nova-api-db-create-5pbsk" Oct 08 22:09:10 crc kubenswrapper[4739]: I1008 22:09:10.843279 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffnw4\" (UniqueName: 
\"kubernetes.io/projected/442932da-356f-4a98-97a8-59ee1418fd24-kube-api-access-ffnw4\") pod \"nova-cell1-db-create-89kb6\" (UID: \"442932da-356f-4a98-97a8-59ee1418fd24\") " pod="openstack/nova-cell1-db-create-89kb6" Oct 08 22:09:10 crc kubenswrapper[4739]: I1008 22:09:10.843582 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpn6c\" (UniqueName: \"kubernetes.io/projected/d67308b3-9336-448a-9399-7db66f43b5aa-kube-api-access-lpn6c\") pod \"nova-cell0-db-create-vx5hx\" (UID: \"d67308b3-9336-448a-9399-7db66f43b5aa\") " pod="openstack/nova-cell0-db-create-vx5hx" Oct 08 22:09:10 crc kubenswrapper[4739]: I1008 22:09:10.844190 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-5pbsk" Oct 08 22:09:10 crc kubenswrapper[4739]: I1008 22:09:10.860710 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpn6c\" (UniqueName: \"kubernetes.io/projected/d67308b3-9336-448a-9399-7db66f43b5aa-kube-api-access-lpn6c\") pod \"nova-cell0-db-create-vx5hx\" (UID: \"d67308b3-9336-448a-9399-7db66f43b5aa\") " pod="openstack/nova-cell0-db-create-vx5hx" Oct 08 22:09:10 crc kubenswrapper[4739]: I1008 22:09:10.945306 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-vx5hx" Oct 08 22:09:10 crc kubenswrapper[4739]: I1008 22:09:10.945982 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffnw4\" (UniqueName: \"kubernetes.io/projected/442932da-356f-4a98-97a8-59ee1418fd24-kube-api-access-ffnw4\") pod \"nova-cell1-db-create-89kb6\" (UID: \"442932da-356f-4a98-97a8-59ee1418fd24\") " pod="openstack/nova-cell1-db-create-89kb6" Oct 08 22:09:10 crc kubenswrapper[4739]: I1008 22:09:10.980256 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffnw4\" (UniqueName: \"kubernetes.io/projected/442932da-356f-4a98-97a8-59ee1418fd24-kube-api-access-ffnw4\") pod \"nova-cell1-db-create-89kb6\" (UID: \"442932da-356f-4a98-97a8-59ee1418fd24\") " pod="openstack/nova-cell1-db-create-89kb6" Oct 08 22:09:10 crc kubenswrapper[4739]: W1008 22:09:10.985363 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60e3dd3d_1e46_4b2a_b71b_fbfb5d7a17a9.slice/crio-7ba0a334a880635926446c54bcfe9846a61e75e482d7b4be5e32b1e6e94a905b WatchSource:0}: Error finding container 7ba0a334a880635926446c54bcfe9846a61e75e482d7b4be5e32b1e6e94a905b: Status 404 returned error can't find the container with id 7ba0a334a880635926446c54bcfe9846a61e75e482d7b4be5e32b1e6e94a905b Oct 08 22:09:11 crc kubenswrapper[4739]: I1008 22:09:11.040353 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-89kb6" Oct 08 22:09:11 crc kubenswrapper[4739]: I1008 22:09:11.588125 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz" Oct 08 22:09:11 crc kubenswrapper[4739]: I1008 22:09:11.665095 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/439123d3-6874-4694-9790-f3ea65bde3a5-ovsdbserver-nb\") pod \"439123d3-6874-4694-9790-f3ea65bde3a5\" (UID: \"439123d3-6874-4694-9790-f3ea65bde3a5\") " Oct 08 22:09:11 crc kubenswrapper[4739]: I1008 22:09:11.665198 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439123d3-6874-4694-9790-f3ea65bde3a5-config\") pod \"439123d3-6874-4694-9790-f3ea65bde3a5\" (UID: \"439123d3-6874-4694-9790-f3ea65bde3a5\") " Oct 08 22:09:11 crc kubenswrapper[4739]: I1008 22:09:11.665246 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/439123d3-6874-4694-9790-f3ea65bde3a5-ovsdbserver-sb\") pod \"439123d3-6874-4694-9790-f3ea65bde3a5\" (UID: \"439123d3-6874-4694-9790-f3ea65bde3a5\") " Oct 08 22:09:11 crc kubenswrapper[4739]: I1008 22:09:11.665290 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/439123d3-6874-4694-9790-f3ea65bde3a5-dns-svc\") pod \"439123d3-6874-4694-9790-f3ea65bde3a5\" (UID: \"439123d3-6874-4694-9790-f3ea65bde3a5\") " Oct 08 22:09:11 crc kubenswrapper[4739]: I1008 22:09:11.665309 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/439123d3-6874-4694-9790-f3ea65bde3a5-dns-swift-storage-0\") pod \"439123d3-6874-4694-9790-f3ea65bde3a5\" (UID: \"439123d3-6874-4694-9790-f3ea65bde3a5\") " Oct 08 22:09:11 crc kubenswrapper[4739]: I1008 22:09:11.665328 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bltf\" 
(UniqueName: \"kubernetes.io/projected/439123d3-6874-4694-9790-f3ea65bde3a5-kube-api-access-4bltf\") pod \"439123d3-6874-4694-9790-f3ea65bde3a5\" (UID: \"439123d3-6874-4694-9790-f3ea65bde3a5\") " Oct 08 22:09:11 crc kubenswrapper[4739]: I1008 22:09:11.681268 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-cf7957586-49vcm" Oct 08 22:09:11 crc kubenswrapper[4739]: I1008 22:09:11.693020 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/439123d3-6874-4694-9790-f3ea65bde3a5-kube-api-access-4bltf" (OuterVolumeSpecName: "kube-api-access-4bltf") pod "439123d3-6874-4694-9790-f3ea65bde3a5" (UID: "439123d3-6874-4694-9790-f3ea65bde3a5"). InnerVolumeSpecName "kube-api-access-4bltf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:09:11 crc kubenswrapper[4739]: I1008 22:09:11.736962 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9","Type":"ContainerStarted","Data":"7ba0a334a880635926446c54bcfe9846a61e75e482d7b4be5e32b1e6e94a905b"} Oct 08 22:09:11 crc kubenswrapper[4739]: I1008 22:09:11.750056 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-cf7957586-49vcm" Oct 08 22:09:11 crc kubenswrapper[4739]: I1008 22:09:11.750056 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cf7957586-49vcm" event={"ID":"7fe20a8d-2c00-4441-92dd-f92da148433b","Type":"ContainerDied","Data":"7fcbe14acfab568bf24ac68fd5233c80947b9541b9785b773ea0c5765c5bdd9a"} Oct 08 22:09:11 crc kubenswrapper[4739]: I1008 22:09:11.750231 4739 scope.go:117] "RemoveContainer" containerID="9bf91a6876d628402308360b4414bb932f05d5681ecf00e0a542a9c13abe57e0" Oct 08 22:09:11 crc kubenswrapper[4739]: I1008 22:09:11.757889 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz" event={"ID":"439123d3-6874-4694-9790-f3ea65bde3a5","Type":"ContainerDied","Data":"17dd81c64960a088b8523839223940a4b3d5b574cb4f65cb7bbb5ddcbe08903a"} Oct 08 22:09:11 crc kubenswrapper[4739]: I1008 22:09:11.758018 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-ds8pz" Oct 08 22:09:11 crc kubenswrapper[4739]: I1008 22:09:11.766765 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7fe20a8d-2c00-4441-92dd-f92da148433b-config\") pod \"7fe20a8d-2c00-4441-92dd-f92da148433b\" (UID: \"7fe20a8d-2c00-4441-92dd-f92da148433b\") " Oct 08 22:09:11 crc kubenswrapper[4739]: I1008 22:09:11.766838 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7fe20a8d-2c00-4441-92dd-f92da148433b-httpd-config\") pod \"7fe20a8d-2c00-4441-92dd-f92da148433b\" (UID: \"7fe20a8d-2c00-4441-92dd-f92da148433b\") " Oct 08 22:09:11 crc kubenswrapper[4739]: I1008 22:09:11.766899 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fe20a8d-2c00-4441-92dd-f92da148433b-ovndb-tls-certs\") pod 
\"7fe20a8d-2c00-4441-92dd-f92da148433b\" (UID: \"7fe20a8d-2c00-4441-92dd-f92da148433b\") " Oct 08 22:09:11 crc kubenswrapper[4739]: I1008 22:09:11.766990 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fe20a8d-2c00-4441-92dd-f92da148433b-combined-ca-bundle\") pod \"7fe20a8d-2c00-4441-92dd-f92da148433b\" (UID: \"7fe20a8d-2c00-4441-92dd-f92da148433b\") " Oct 08 22:09:11 crc kubenswrapper[4739]: I1008 22:09:11.768998 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wszdx\" (UniqueName: \"kubernetes.io/projected/7fe20a8d-2c00-4441-92dd-f92da148433b-kube-api-access-wszdx\") pod \"7fe20a8d-2c00-4441-92dd-f92da148433b\" (UID: \"7fe20a8d-2c00-4441-92dd-f92da148433b\") " Oct 08 22:09:11 crc kubenswrapper[4739]: I1008 22:09:11.769694 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bltf\" (UniqueName: \"kubernetes.io/projected/439123d3-6874-4694-9790-f3ea65bde3a5-kube-api-access-4bltf\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:11 crc kubenswrapper[4739]: I1008 22:09:11.777191 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fe20a8d-2c00-4441-92dd-f92da148433b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "7fe20a8d-2c00-4441-92dd-f92da148433b" (UID: "7fe20a8d-2c00-4441-92dd-f92da148433b"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:11 crc kubenswrapper[4739]: I1008 22:09:11.794049 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fe20a8d-2c00-4441-92dd-f92da148433b-kube-api-access-wszdx" (OuterVolumeSpecName: "kube-api-access-wszdx") pod "7fe20a8d-2c00-4441-92dd-f92da148433b" (UID: "7fe20a8d-2c00-4441-92dd-f92da148433b"). InnerVolumeSpecName "kube-api-access-wszdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:09:11 crc kubenswrapper[4739]: I1008 22:09:11.803704 4739 scope.go:117] "RemoveContainer" containerID="48066fd56d175a613e4a0fe33a8e50ae919b1b5b7746e611b87b000561c2bf00" Oct 08 22:09:11 crc kubenswrapper[4739]: I1008 22:09:11.875581 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wszdx\" (UniqueName: \"kubernetes.io/projected/7fe20a8d-2c00-4441-92dd-f92da148433b-kube-api-access-wszdx\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:11 crc kubenswrapper[4739]: I1008 22:09:11.875613 4739 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7fe20a8d-2c00-4441-92dd-f92da148433b-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.007633 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/439123d3-6874-4694-9790-f3ea65bde3a5-config" (OuterVolumeSpecName: "config") pod "439123d3-6874-4694-9790-f3ea65bde3a5" (UID: "439123d3-6874-4694-9790-f3ea65bde3a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.028246 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/439123d3-6874-4694-9790-f3ea65bde3a5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "439123d3-6874-4694-9790-f3ea65bde3a5" (UID: "439123d3-6874-4694-9790-f3ea65bde3a5"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.070862 4739 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.080709 4739 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/439123d3-6874-4694-9790-f3ea65bde3a5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.080749 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439123d3-6874-4694-9790-f3ea65bde3a5-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.088509 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/439123d3-6874-4694-9790-f3ea65bde3a5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "439123d3-6874-4694-9790-f3ea65bde3a5" (UID: "439123d3-6874-4694-9790-f3ea65bde3a5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.095070 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fe20a8d-2c00-4441-92dd-f92da148433b-config" (OuterVolumeSpecName: "config") pod "7fe20a8d-2c00-4441-92dd-f92da148433b" (UID: "7fe20a8d-2c00-4441-92dd-f92da148433b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.112814 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fe20a8d-2c00-4441-92dd-f92da148433b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7fe20a8d-2c00-4441-92dd-f92da148433b" (UID: "7fe20a8d-2c00-4441-92dd-f92da148433b"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.127038 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/439123d3-6874-4694-9790-f3ea65bde3a5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "439123d3-6874-4694-9790-f3ea65bde3a5" (UID: "439123d3-6874-4694-9790-f3ea65bde3a5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.160422 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/439123d3-6874-4694-9790-f3ea65bde3a5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "439123d3-6874-4694-9790-f3ea65bde3a5" (UID: "439123d3-6874-4694-9790-f3ea65bde3a5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.182677 4739 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/439123d3-6874-4694-9790-f3ea65bde3a5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.182716 4739 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/439123d3-6874-4694-9790-f3ea65bde3a5-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.182728 4739 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/439123d3-6874-4694-9790-f3ea65bde3a5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.182739 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7fe20a8d-2c00-4441-92dd-f92da148433b-config\") on node \"crc\" 
DevicePath \"\"" Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.182823 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fe20a8d-2c00-4441-92dd-f92da148433b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.185700 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fe20a8d-2c00-4441-92dd-f92da148433b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "7fe20a8d-2c00-4441-92dd-f92da148433b" (UID: "7fe20a8d-2c00-4441-92dd-f92da148433b"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.283037 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.283356 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-5pbsk"] Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.283447 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-89kb6"] Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.284287 4739 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fe20a8d-2c00-4441-92dd-f92da148433b-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.301631 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-vx5hx"] Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.351366 4739 scope.go:117] "RemoveContainer" containerID="8e7685c230eb43484652a01ab584bd7ea2118cffc45ce73191a19d84fc060670" Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.416222 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 
22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.453208 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-ds8pz"] Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.485254 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-ds8pz"] Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.494694 4739 scope.go:117] "RemoveContainer" containerID="90a96c604828f6525467ff9c8893643a4f3cd46a097e0f7672d4d059e7a1f6f5" Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.554361 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-cf7957586-49vcm"] Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.605710 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-cf7957586-49vcm"] Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.791393 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ae8b784-8021-4f1d-9b31-d65dce42b007","Type":"ContainerStarted","Data":"63c45ac6e32f2e9f889da3c2265a57da991b708a9c5ae653a2abfd15c061bb53"} Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.798057 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-89kb6" event={"ID":"442932da-356f-4a98-97a8-59ee1418fd24","Type":"ContainerStarted","Data":"56d106e5997a15b6d9f8efc2ded2b34a83f60e2ef5dc14b9487c23f0e9e9b80b"} Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.800421 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5pbsk" event={"ID":"705882a0-d0a4-490f-9377-d9b379c5a9ea","Type":"ContainerStarted","Data":"1cb82bd629a1af8d923f69bd43c7df9d954939d49d6077e6045a399fade29479"} Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.824365 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" 
event={"ID":"37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5","Type":"ContainerStarted","Data":"71929e391e11a4120e9c99e893d34aa741fff3481c7a506856ced6d75d507481"} Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.834825 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="d60573d4-e919-482c-aa2d-a46770b6c0ca" containerName="cinder-scheduler" containerID="cri-o://1b009cfc40f04e17020a23db626ab476ad3f2e130a46a92daa95313c53d149a2" gracePeriod=30 Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.834962 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vx5hx" event={"ID":"d67308b3-9336-448a-9399-7db66f43b5aa","Type":"ContainerStarted","Data":"b30085966d8e0e20fb18187c1a8132ea6b999cd98987465779b039d0d29f3d59"} Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.835010 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="d60573d4-e919-482c-aa2d-a46770b6c0ca" containerName="probe" containerID="cri-o://485c6f3ec100fbef7b9e1a521ec37d35947837327d0ce9b6e81c8e2d67d5825f" gracePeriod=30 Oct 08 22:09:12 crc kubenswrapper[4739]: I1008 22:09:12.852895 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.26779039 podStartE2EDuration="19.852878747s" podCreationTimestamp="2025-10-08 22:08:53 +0000 UTC" firstStartedPulling="2025-10-08 22:08:54.8005303 +0000 UTC m=+1234.625916050" lastFinishedPulling="2025-10-08 22:09:11.385618667 +0000 UTC m=+1251.211004407" observedRunningTime="2025-10-08 22:09:12.848077559 +0000 UTC m=+1252.673463299" watchObservedRunningTime="2025-10-08 22:09:12.852878747 +0000 UTC m=+1252.678264497" Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.710745 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="d1240189-197f-4fc9-98a7-538ffdd522da" 
containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.156:9292/healthcheck\": dial tcp 10.217.0.156:9292: connect: connection refused" Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.710765 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="d1240189-197f-4fc9-98a7-538ffdd522da" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.156:9292/healthcheck\": dial tcp 10.217.0.156:9292: connect: connection refused" Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.765725 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.827912 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc637b11-4e12-4a6a-a496-1a700d3756c1-logs\") pod \"fc637b11-4e12-4a6a-a496-1a700d3756c1\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.828039 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc637b11-4e12-4a6a-a496-1a700d3756c1-config-data\") pod \"fc637b11-4e12-4a6a-a496-1a700d3756c1\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.828103 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc637b11-4e12-4a6a-a496-1a700d3756c1-combined-ca-bundle\") pod \"fc637b11-4e12-4a6a-a496-1a700d3756c1\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.828139 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"fc637b11-4e12-4a6a-a496-1a700d3756c1\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.828207 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc637b11-4e12-4a6a-a496-1a700d3756c1-public-tls-certs\") pod \"fc637b11-4e12-4a6a-a496-1a700d3756c1\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.828265 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc637b11-4e12-4a6a-a496-1a700d3756c1-httpd-run\") pod \"fc637b11-4e12-4a6a-a496-1a700d3756c1\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.828297 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kk5jl\" (UniqueName: \"kubernetes.io/projected/fc637b11-4e12-4a6a-a496-1a700d3756c1-kube-api-access-kk5jl\") pod \"fc637b11-4e12-4a6a-a496-1a700d3756c1\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.828335 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc637b11-4e12-4a6a-a496-1a700d3756c1-scripts\") pod \"fc637b11-4e12-4a6a-a496-1a700d3756c1\" (UID: \"fc637b11-4e12-4a6a-a496-1a700d3756c1\") " Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.834314 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc637b11-4e12-4a6a-a496-1a700d3756c1-logs" (OuterVolumeSpecName: "logs") pod "fc637b11-4e12-4a6a-a496-1a700d3756c1" (UID: "fc637b11-4e12-4a6a-a496-1a700d3756c1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.835951 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc637b11-4e12-4a6a-a496-1a700d3756c1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fc637b11-4e12-4a6a-a496-1a700d3756c1" (UID: "fc637b11-4e12-4a6a-a496-1a700d3756c1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.839099 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc637b11-4e12-4a6a-a496-1a700d3756c1-kube-api-access-kk5jl" (OuterVolumeSpecName: "kube-api-access-kk5jl") pod "fc637b11-4e12-4a6a-a496-1a700d3756c1" (UID: "fc637b11-4e12-4a6a-a496-1a700d3756c1"). InnerVolumeSpecName "kube-api-access-kk5jl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.844139 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="439123d3-6874-4694-9790-f3ea65bde3a5" path="/var/lib/kubelet/pods/439123d3-6874-4694-9790-f3ea65bde3a5/volumes" Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.845052 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fe20a8d-2c00-4441-92dd-f92da148433b" path="/var/lib/kubelet/pods/7fe20a8d-2c00-4441-92dd-f92da148433b/volumes" Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.846370 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc637b11-4e12-4a6a-a496-1a700d3756c1-scripts" (OuterVolumeSpecName: "scripts") pod "fc637b11-4e12-4a6a-a496-1a700d3756c1" (UID: "fc637b11-4e12-4a6a-a496-1a700d3756c1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.850124 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "fc637b11-4e12-4a6a-a496-1a700d3756c1" (UID: "fc637b11-4e12-4a6a-a496-1a700d3756c1"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.868858 4739 generic.go:334] "Generic (PLEG): container finished" podID="705882a0-d0a4-490f-9377-d9b379c5a9ea" containerID="9ad019e7acc4f9d8d37fd4b8c1e574900a10d7353202e4edf792b27dd2d49818" exitCode=0 Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.874788 4739 generic.go:334] "Generic (PLEG): container finished" podID="fc637b11-4e12-4a6a-a496-1a700d3756c1" containerID="8d5ad7cd87074ce62fba14a4bb0a9a5fcb5831d2c3e87e69819f05c55ff8e2bc" exitCode=0 Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.874876 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.877796 4739 generic.go:334] "Generic (PLEG): container finished" podID="d60573d4-e919-482c-aa2d-a46770b6c0ca" containerID="485c6f3ec100fbef7b9e1a521ec37d35947837327d0ce9b6e81c8e2d67d5825f" exitCode=0 Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.878960 4739 generic.go:334] "Generic (PLEG): container finished" podID="d67308b3-9336-448a-9399-7db66f43b5aa" containerID="0c00db920a38489ac2087c86988347b6e291b2ea19110d1b86678d226ffe3a1e" exitCode=0 Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.880762 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc637b11-4e12-4a6a-a496-1a700d3756c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc637b11-4e12-4a6a-a496-1a700d3756c1" (UID: "fc637b11-4e12-4a6a-a496-1a700d3756c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.894424 4739 generic.go:334] "Generic (PLEG): container finished" podID="442932da-356f-4a98-97a8-59ee1418fd24" containerID="3cf36dda52dda7ca4486cb1db258872c2e392941ea164018080edbb6c6580e16" exitCode=0 Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.920276 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc637b11-4e12-4a6a-a496-1a700d3756c1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fc637b11-4e12-4a6a-a496-1a700d3756c1" (UID: "fc637b11-4e12-4a6a-a496-1a700d3756c1"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.935451 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc637b11-4e12-4a6a-a496-1a700d3756c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.935489 4739 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.935500 4739 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc637b11-4e12-4a6a-a496-1a700d3756c1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.935509 4739 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc637b11-4e12-4a6a-a496-1a700d3756c1-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.935518 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kk5jl\" (UniqueName: \"kubernetes.io/projected/fc637b11-4e12-4a6a-a496-1a700d3756c1-kube-api-access-kk5jl\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.935526 4739 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc637b11-4e12-4a6a-a496-1a700d3756c1-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.935534 4739 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc637b11-4e12-4a6a-a496-1a700d3756c1-logs\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.939536 4739 generic.go:334] "Generic (PLEG): container 
finished" podID="d1240189-197f-4fc9-98a7-538ffdd522da" containerID="8752ddc99cd48cd577f2eb560e4f1a5194fef80d51dab2eeeb20d1ab2e4bc013" exitCode=0 Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.946675 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=11.946657091 podStartE2EDuration="11.946657091s" podCreationTimestamp="2025-10-08 22:09:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:09:13.934860991 +0000 UTC m=+1253.760246741" watchObservedRunningTime="2025-10-08 22:09:13.946657091 +0000 UTC m=+1253.772042841" Oct 08 22:09:13 crc kubenswrapper[4739]: I1008 22:09:13.970916 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc637b11-4e12-4a6a-a496-1a700d3756c1-config-data" (OuterVolumeSpecName: "config-data") pod "fc637b11-4e12-4a6a-a496-1a700d3756c1" (UID: "fc637b11-4e12-4a6a-a496-1a700d3756c1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.007127 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.007190 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5pbsk" event={"ID":"705882a0-d0a4-490f-9377-d9b379c5a9ea","Type":"ContainerDied","Data":"9ad019e7acc4f9d8d37fd4b8c1e574900a10d7353202e4edf792b27dd2d49818"} Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.007221 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc637b11-4e12-4a6a-a496-1a700d3756c1","Type":"ContainerDied","Data":"8d5ad7cd87074ce62fba14a4bb0a9a5fcb5831d2c3e87e69819f05c55ff8e2bc"} Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.007241 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc637b11-4e12-4a6a-a496-1a700d3756c1","Type":"ContainerDied","Data":"3c6b45b07b850409ed0af4d10714732c555012559f60dfb074cfe4edfbf8018e"} Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.007254 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d60573d4-e919-482c-aa2d-a46770b6c0ca","Type":"ContainerDied","Data":"485c6f3ec100fbef7b9e1a521ec37d35947837327d0ce9b6e81c8e2d67d5825f"} Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.007268 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vx5hx" event={"ID":"d67308b3-9336-448a-9399-7db66f43b5aa","Type":"ContainerDied","Data":"0c00db920a38489ac2087c86988347b6e291b2ea19110d1b86678d226ffe3a1e"} Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.007283 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9","Type":"ContainerStarted","Data":"8b99eeb5840586973c06cf789deeb1f488d7fad9788fc45e7e3ba7c41ab8b481"} Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.007297 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9","Type":"ContainerStarted","Data":"c30bcef7111602c3a20303e3b9c9f52b0dc35eed8caefbaef260d04d585cd0b4"} Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.007309 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ae8b784-8021-4f1d-9b31-d65dce42b007","Type":"ContainerStarted","Data":"d79f3d99f4be1d76d1d418791f7367c6f69cfe3aafc94a9a0a31c08172d03ff0"} Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.007320 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-89kb6" event={"ID":"442932da-356f-4a98-97a8-59ee1418fd24","Type":"ContainerDied","Data":"3cf36dda52dda7ca4486cb1db258872c2e392941ea164018080edbb6c6580e16"} Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.007332 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d1240189-197f-4fc9-98a7-538ffdd522da","Type":"ContainerDied","Data":"8752ddc99cd48cd577f2eb560e4f1a5194fef80d51dab2eeeb20d1ab2e4bc013"} Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.007352 4739 scope.go:117] "RemoveContainer" containerID="8d5ad7cd87074ce62fba14a4bb0a9a5fcb5831d2c3e87e69819f05c55ff8e2bc" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.012759 4739 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.037730 4739 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on 
node \"crc\" DevicePath \"\"" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.037765 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc637b11-4e12-4a6a-a496-1a700d3756c1-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.062027 4739 scope.go:117] "RemoveContainer" containerID="092cf9954ec00578e37231da0fc0753387c81d6929ec054a8cc1bd606d0ea2f8" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.094441 4739 scope.go:117] "RemoveContainer" containerID="8d5ad7cd87074ce62fba14a4bb0a9a5fcb5831d2c3e87e69819f05c55ff8e2bc" Oct 08 22:09:14 crc kubenswrapper[4739]: E1008 22:09:14.097699 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d5ad7cd87074ce62fba14a4bb0a9a5fcb5831d2c3e87e69819f05c55ff8e2bc\": container with ID starting with 8d5ad7cd87074ce62fba14a4bb0a9a5fcb5831d2c3e87e69819f05c55ff8e2bc not found: ID does not exist" containerID="8d5ad7cd87074ce62fba14a4bb0a9a5fcb5831d2c3e87e69819f05c55ff8e2bc" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.097746 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d5ad7cd87074ce62fba14a4bb0a9a5fcb5831d2c3e87e69819f05c55ff8e2bc"} err="failed to get container status \"8d5ad7cd87074ce62fba14a4bb0a9a5fcb5831d2c3e87e69819f05c55ff8e2bc\": rpc error: code = NotFound desc = could not find container \"8d5ad7cd87074ce62fba14a4bb0a9a5fcb5831d2c3e87e69819f05c55ff8e2bc\": container with ID starting with 8d5ad7cd87074ce62fba14a4bb0a9a5fcb5831d2c3e87e69819f05c55ff8e2bc not found: ID does not exist" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.097775 4739 scope.go:117] "RemoveContainer" containerID="092cf9954ec00578e37231da0fc0753387c81d6929ec054a8cc1bd606d0ea2f8" Oct 08 22:09:14 crc kubenswrapper[4739]: E1008 22:09:14.098356 4739 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"092cf9954ec00578e37231da0fc0753387c81d6929ec054a8cc1bd606d0ea2f8\": container with ID starting with 092cf9954ec00578e37231da0fc0753387c81d6929ec054a8cc1bd606d0ea2f8 not found: ID does not exist" containerID="092cf9954ec00578e37231da0fc0753387c81d6929ec054a8cc1bd606d0ea2f8" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.098386 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"092cf9954ec00578e37231da0fc0753387c81d6929ec054a8cc1bd606d0ea2f8"} err="failed to get container status \"092cf9954ec00578e37231da0fc0753387c81d6929ec054a8cc1bd606d0ea2f8\": rpc error: code = NotFound desc = could not find container \"092cf9954ec00578e37231da0fc0753387c81d6929ec054a8cc1bd606d0ea2f8\": container with ID starting with 092cf9954ec00578e37231da0fc0753387c81d6929ec054a8cc1bd606d0ea2f8 not found: ID does not exist" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.118550 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.139312 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztxtr\" (UniqueName: \"kubernetes.io/projected/d1240189-197f-4fc9-98a7-538ffdd522da-kube-api-access-ztxtr\") pod \"d1240189-197f-4fc9-98a7-538ffdd522da\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.139388 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1240189-197f-4fc9-98a7-538ffdd522da-logs\") pod \"d1240189-197f-4fc9-98a7-538ffdd522da\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.139446 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1240189-197f-4fc9-98a7-538ffdd522da-config-data\") pod \"d1240189-197f-4fc9-98a7-538ffdd522da\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.139480 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1240189-197f-4fc9-98a7-538ffdd522da-scripts\") pod \"d1240189-197f-4fc9-98a7-538ffdd522da\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.139532 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1240189-197f-4fc9-98a7-538ffdd522da-internal-tls-certs\") pod \"d1240189-197f-4fc9-98a7-538ffdd522da\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.139550 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/d1240189-197f-4fc9-98a7-538ffdd522da-httpd-run\") pod \"d1240189-197f-4fc9-98a7-538ffdd522da\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.139611 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"d1240189-197f-4fc9-98a7-538ffdd522da\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.139685 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1240189-197f-4fc9-98a7-538ffdd522da-combined-ca-bundle\") pod \"d1240189-197f-4fc9-98a7-538ffdd522da\" (UID: \"d1240189-197f-4fc9-98a7-538ffdd522da\") " Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.142017 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1240189-197f-4fc9-98a7-538ffdd522da-logs" (OuterVolumeSpecName: "logs") pod "d1240189-197f-4fc9-98a7-538ffdd522da" (UID: "d1240189-197f-4fc9-98a7-538ffdd522da"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.142442 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1240189-197f-4fc9-98a7-538ffdd522da-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d1240189-197f-4fc9-98a7-538ffdd522da" (UID: "d1240189-197f-4fc9-98a7-538ffdd522da"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.151330 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1240189-197f-4fc9-98a7-538ffdd522da-scripts" (OuterVolumeSpecName: "scripts") pod "d1240189-197f-4fc9-98a7-538ffdd522da" (UID: "d1240189-197f-4fc9-98a7-538ffdd522da"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.151344 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1240189-197f-4fc9-98a7-538ffdd522da-kube-api-access-ztxtr" (OuterVolumeSpecName: "kube-api-access-ztxtr") pod "d1240189-197f-4fc9-98a7-538ffdd522da" (UID: "d1240189-197f-4fc9-98a7-538ffdd522da"). InnerVolumeSpecName "kube-api-access-ztxtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.159339 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "d1240189-197f-4fc9-98a7-538ffdd522da" (UID: "d1240189-197f-4fc9-98a7-538ffdd522da"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.224243 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1240189-197f-4fc9-98a7-538ffdd522da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1240189-197f-4fc9-98a7-538ffdd522da" (UID: "d1240189-197f-4fc9-98a7-538ffdd522da"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.228291 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1240189-197f-4fc9-98a7-538ffdd522da-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d1240189-197f-4fc9-98a7-538ffdd522da" (UID: "d1240189-197f-4fc9-98a7-538ffdd522da"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.237647 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.242476 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1240189-197f-4fc9-98a7-538ffdd522da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.242504 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztxtr\" (UniqueName: \"kubernetes.io/projected/d1240189-197f-4fc9-98a7-538ffdd522da-kube-api-access-ztxtr\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.242516 4739 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1240189-197f-4fc9-98a7-538ffdd522da-logs\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.242526 4739 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1240189-197f-4fc9-98a7-538ffdd522da-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.242535 4739 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1240189-197f-4fc9-98a7-538ffdd522da-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 
22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.242543 4739 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1240189-197f-4fc9-98a7-538ffdd522da-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.242565 4739 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.254235 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.259201 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 22:09:14 crc kubenswrapper[4739]: E1008 22:09:14.259597 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="439123d3-6874-4694-9790-f3ea65bde3a5" containerName="dnsmasq-dns" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.259617 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="439123d3-6874-4694-9790-f3ea65bde3a5" containerName="dnsmasq-dns" Oct 08 22:09:14 crc kubenswrapper[4739]: E1008 22:09:14.259630 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc637b11-4e12-4a6a-a496-1a700d3756c1" containerName="glance-log" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.259639 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc637b11-4e12-4a6a-a496-1a700d3756c1" containerName="glance-log" Oct 08 22:09:14 crc kubenswrapper[4739]: E1008 22:09:14.259654 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1240189-197f-4fc9-98a7-538ffdd522da" containerName="glance-httpd" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.259660 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1240189-197f-4fc9-98a7-538ffdd522da" containerName="glance-httpd" 
Oct 08 22:09:14 crc kubenswrapper[4739]: E1008 22:09:14.259673 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="439123d3-6874-4694-9790-f3ea65bde3a5" containerName="init" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.259679 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="439123d3-6874-4694-9790-f3ea65bde3a5" containerName="init" Oct 08 22:09:14 crc kubenswrapper[4739]: E1008 22:09:14.259691 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fe20a8d-2c00-4441-92dd-f92da148433b" containerName="neutron-api" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.259697 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fe20a8d-2c00-4441-92dd-f92da148433b" containerName="neutron-api" Oct 08 22:09:14 crc kubenswrapper[4739]: E1008 22:09:14.259709 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc637b11-4e12-4a6a-a496-1a700d3756c1" containerName="glance-httpd" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.259714 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc637b11-4e12-4a6a-a496-1a700d3756c1" containerName="glance-httpd" Oct 08 22:09:14 crc kubenswrapper[4739]: E1008 22:09:14.259727 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fe20a8d-2c00-4441-92dd-f92da148433b" containerName="neutron-httpd" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.259733 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fe20a8d-2c00-4441-92dd-f92da148433b" containerName="neutron-httpd" Oct 08 22:09:14 crc kubenswrapper[4739]: E1008 22:09:14.259750 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1240189-197f-4fc9-98a7-538ffdd522da" containerName="glance-log" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.259755 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1240189-197f-4fc9-98a7-538ffdd522da" containerName="glance-log" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.259936 
4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1240189-197f-4fc9-98a7-538ffdd522da" containerName="glance-httpd" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.259948 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fe20a8d-2c00-4441-92dd-f92da148433b" containerName="neutron-api" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.259956 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc637b11-4e12-4a6a-a496-1a700d3756c1" containerName="glance-httpd" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.259969 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc637b11-4e12-4a6a-a496-1a700d3756c1" containerName="glance-log" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.259983 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1240189-197f-4fc9-98a7-538ffdd522da" containerName="glance-log" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.259991 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fe20a8d-2c00-4441-92dd-f92da148433b" containerName="neutron-httpd" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.260001 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="439123d3-6874-4694-9790-f3ea65bde3a5" containerName="dnsmasq-dns" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.260924 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.267976 4739 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.268419 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1240189-197f-4fc9-98a7-538ffdd522da-config-data" (OuterVolumeSpecName: "config-data") pod "d1240189-197f-4fc9-98a7-538ffdd522da" (UID: "d1240189-197f-4fc9-98a7-538ffdd522da"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.268674 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.268814 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.268910 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.343850 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66c23737-b27f-4ba2-9291-b2d0f3aa5020-logs\") pod \"glance-default-external-api-0\" (UID: \"66c23737-b27f-4ba2-9291-b2d0f3aa5020\") " pod="openstack/glance-default-external-api-0" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.344375 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"66c23737-b27f-4ba2-9291-b2d0f3aa5020\") " 
pod="openstack/glance-default-external-api-0" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.344398 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66c23737-b27f-4ba2-9291-b2d0f3aa5020-scripts\") pod \"glance-default-external-api-0\" (UID: \"66c23737-b27f-4ba2-9291-b2d0f3aa5020\") " pod="openstack/glance-default-external-api-0" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.344419 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/66c23737-b27f-4ba2-9291-b2d0f3aa5020-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"66c23737-b27f-4ba2-9291-b2d0f3aa5020\") " pod="openstack/glance-default-external-api-0" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.344492 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t8f9\" (UniqueName: \"kubernetes.io/projected/66c23737-b27f-4ba2-9291-b2d0f3aa5020-kube-api-access-5t8f9\") pod \"glance-default-external-api-0\" (UID: \"66c23737-b27f-4ba2-9291-b2d0f3aa5020\") " pod="openstack/glance-default-external-api-0" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.344508 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/66c23737-b27f-4ba2-9291-b2d0f3aa5020-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"66c23737-b27f-4ba2-9291-b2d0f3aa5020\") " pod="openstack/glance-default-external-api-0" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.344528 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66c23737-b27f-4ba2-9291-b2d0f3aa5020-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"66c23737-b27f-4ba2-9291-b2d0f3aa5020\") " pod="openstack/glance-default-external-api-0" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.344549 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66c23737-b27f-4ba2-9291-b2d0f3aa5020-config-data\") pod \"glance-default-external-api-0\" (UID: \"66c23737-b27f-4ba2-9291-b2d0f3aa5020\") " pod="openstack/glance-default-external-api-0" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.344606 4739 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.344617 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1240189-197f-4fc9-98a7-538ffdd522da-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.446219 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66c23737-b27f-4ba2-9291-b2d0f3aa5020-logs\") pod \"glance-default-external-api-0\" (UID: \"66c23737-b27f-4ba2-9291-b2d0f3aa5020\") " pod="openstack/glance-default-external-api-0" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.446281 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"66c23737-b27f-4ba2-9291-b2d0f3aa5020\") " pod="openstack/glance-default-external-api-0" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.446311 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66c23737-b27f-4ba2-9291-b2d0f3aa5020-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"66c23737-b27f-4ba2-9291-b2d0f3aa5020\") " pod="openstack/glance-default-external-api-0" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.446335 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/66c23737-b27f-4ba2-9291-b2d0f3aa5020-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"66c23737-b27f-4ba2-9291-b2d0f3aa5020\") " pod="openstack/glance-default-external-api-0" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.446390 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t8f9\" (UniqueName: \"kubernetes.io/projected/66c23737-b27f-4ba2-9291-b2d0f3aa5020-kube-api-access-5t8f9\") pod \"glance-default-external-api-0\" (UID: \"66c23737-b27f-4ba2-9291-b2d0f3aa5020\") " pod="openstack/glance-default-external-api-0" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.446405 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/66c23737-b27f-4ba2-9291-b2d0f3aa5020-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"66c23737-b27f-4ba2-9291-b2d0f3aa5020\") " pod="openstack/glance-default-external-api-0" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.446424 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66c23737-b27f-4ba2-9291-b2d0f3aa5020-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"66c23737-b27f-4ba2-9291-b2d0f3aa5020\") " pod="openstack/glance-default-external-api-0" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.446444 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66c23737-b27f-4ba2-9291-b2d0f3aa5020-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"66c23737-b27f-4ba2-9291-b2d0f3aa5020\") " pod="openstack/glance-default-external-api-0" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.446770 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66c23737-b27f-4ba2-9291-b2d0f3aa5020-logs\") pod \"glance-default-external-api-0\" (UID: \"66c23737-b27f-4ba2-9291-b2d0f3aa5020\") " pod="openstack/glance-default-external-api-0" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.447182 4739 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"66c23737-b27f-4ba2-9291-b2d0f3aa5020\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.447289 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/66c23737-b27f-4ba2-9291-b2d0f3aa5020-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"66c23737-b27f-4ba2-9291-b2d0f3aa5020\") " pod="openstack/glance-default-external-api-0" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.452201 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66c23737-b27f-4ba2-9291-b2d0f3aa5020-config-data\") pod \"glance-default-external-api-0\" (UID: \"66c23737-b27f-4ba2-9291-b2d0f3aa5020\") " pod="openstack/glance-default-external-api-0" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.452880 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/66c23737-b27f-4ba2-9291-b2d0f3aa5020-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"66c23737-b27f-4ba2-9291-b2d0f3aa5020\") " pod="openstack/glance-default-external-api-0" Oct 08 
22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.452997 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66c23737-b27f-4ba2-9291-b2d0f3aa5020-scripts\") pod \"glance-default-external-api-0\" (UID: \"66c23737-b27f-4ba2-9291-b2d0f3aa5020\") " pod="openstack/glance-default-external-api-0" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.465726 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t8f9\" (UniqueName: \"kubernetes.io/projected/66c23737-b27f-4ba2-9291-b2d0f3aa5020-kube-api-access-5t8f9\") pod \"glance-default-external-api-0\" (UID: \"66c23737-b27f-4ba2-9291-b2d0f3aa5020\") " pod="openstack/glance-default-external-api-0" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.469397 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66c23737-b27f-4ba2-9291-b2d0f3aa5020-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"66c23737-b27f-4ba2-9291-b2d0f3aa5020\") " pod="openstack/glance-default-external-api-0" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.489720 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"66c23737-b27f-4ba2-9291-b2d0f3aa5020\") " pod="openstack/glance-default-external-api-0" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.585551 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.949577 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d1240189-197f-4fc9-98a7-538ffdd522da","Type":"ContainerDied","Data":"b8833c8a1529d44580ea3d2cf6a1bc3672160a5169797c894a80ee7234cf3c7e"} Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.950269 4739 scope.go:117] "RemoveContainer" containerID="8752ddc99cd48cd577f2eb560e4f1a5194fef80d51dab2eeeb20d1ab2e4bc013" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.950574 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.965913 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ae8b784-8021-4f1d-9b31-d65dce42b007","Type":"ContainerStarted","Data":"b0ac310e5d85d6868254c2f560a6df4c7a3a1316d81e0a8ba7d907e7eb971577"} Oct 08 22:09:14 crc kubenswrapper[4739]: I1008 22:09:14.981824 4739 scope.go:117] "RemoveContainer" containerID="efac728265e15cbfbd6a69f1dea53010d867fff26f1210e7d9fe3880bd92d2da" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.005675 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.036939 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.053947 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.055926 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.064057 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.064751 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.064944 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.186902 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6bs6\" (UniqueName: \"kubernetes.io/projected/a00e6724-633b-4d60-9781-206e078a6dca-kube-api-access-r6bs6\") pod \"glance-default-internal-api-0\" (UID: \"a00e6724-633b-4d60-9781-206e078a6dca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.187366 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a00e6724-633b-4d60-9781-206e078a6dca-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a00e6724-633b-4d60-9781-206e078a6dca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.187412 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a00e6724-633b-4d60-9781-206e078a6dca-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a00e6724-633b-4d60-9781-206e078a6dca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.187434 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/a00e6724-633b-4d60-9781-206e078a6dca-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a00e6724-633b-4d60-9781-206e078a6dca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.187465 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"a00e6724-633b-4d60-9781-206e078a6dca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.187505 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a00e6724-633b-4d60-9781-206e078a6dca-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a00e6724-633b-4d60-9781-206e078a6dca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.187570 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a00e6724-633b-4d60-9781-206e078a6dca-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a00e6724-633b-4d60-9781-206e078a6dca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.187614 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a00e6724-633b-4d60-9781-206e078a6dca-logs\") pod \"glance-default-internal-api-0\" (UID: \"a00e6724-633b-4d60-9781-206e078a6dca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.303893 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a00e6724-633b-4d60-9781-206e078a6dca-logs\") pod \"glance-default-internal-api-0\" (UID: \"a00e6724-633b-4d60-9781-206e078a6dca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.303942 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6bs6\" (UniqueName: \"kubernetes.io/projected/a00e6724-633b-4d60-9781-206e078a6dca-kube-api-access-r6bs6\") pod \"glance-default-internal-api-0\" (UID: \"a00e6724-633b-4d60-9781-206e078a6dca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.303969 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a00e6724-633b-4d60-9781-206e078a6dca-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a00e6724-633b-4d60-9781-206e078a6dca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.304001 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a00e6724-633b-4d60-9781-206e078a6dca-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a00e6724-633b-4d60-9781-206e078a6dca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.304016 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a00e6724-633b-4d60-9781-206e078a6dca-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a00e6724-633b-4d60-9781-206e078a6dca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.304042 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"a00e6724-633b-4d60-9781-206e078a6dca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.304083 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a00e6724-633b-4d60-9781-206e078a6dca-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a00e6724-633b-4d60-9781-206e078a6dca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.304156 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a00e6724-633b-4d60-9781-206e078a6dca-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a00e6724-633b-4d60-9781-206e078a6dca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.308741 4739 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"a00e6724-633b-4d60-9781-206e078a6dca\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.309045 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a00e6724-633b-4d60-9781-206e078a6dca-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a00e6724-633b-4d60-9781-206e078a6dca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.309673 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a00e6724-633b-4d60-9781-206e078a6dca-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"a00e6724-633b-4d60-9781-206e078a6dca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.324972 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a00e6724-633b-4d60-9781-206e078a6dca-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a00e6724-633b-4d60-9781-206e078a6dca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.334451 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a00e6724-633b-4d60-9781-206e078a6dca-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a00e6724-633b-4d60-9781-206e078a6dca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.338722 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6bs6\" (UniqueName: \"kubernetes.io/projected/a00e6724-633b-4d60-9781-206e078a6dca-kube-api-access-r6bs6\") pod \"glance-default-internal-api-0\" (UID: \"a00e6724-633b-4d60-9781-206e078a6dca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.356893 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a00e6724-633b-4d60-9781-206e078a6dca-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a00e6724-633b-4d60-9781-206e078a6dca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.385878 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a00e6724-633b-4d60-9781-206e078a6dca-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a00e6724-633b-4d60-9781-206e078a6dca\") " 
pod="openstack/glance-default-internal-api-0" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.417211 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"a00e6724-633b-4d60-9781-206e078a6dca\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.438610 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.563290 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.573668 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-89kb6" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.618096 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffnw4\" (UniqueName: \"kubernetes.io/projected/442932da-356f-4a98-97a8-59ee1418fd24-kube-api-access-ffnw4\") pod \"442932da-356f-4a98-97a8-59ee1418fd24\" (UID: \"442932da-356f-4a98-97a8-59ee1418fd24\") " Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.634365 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/442932da-356f-4a98-97a8-59ee1418fd24-kube-api-access-ffnw4" (OuterVolumeSpecName: "kube-api-access-ffnw4") pod "442932da-356f-4a98-97a8-59ee1418fd24" (UID: "442932da-356f-4a98-97a8-59ee1418fd24"). InnerVolumeSpecName "kube-api-access-ffnw4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.727438 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffnw4\" (UniqueName: \"kubernetes.io/projected/442932da-356f-4a98-97a8-59ee1418fd24-kube-api-access-ffnw4\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.752742 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vx5hx" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.766076 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-5pbsk" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.833103 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpn6c\" (UniqueName: \"kubernetes.io/projected/d67308b3-9336-448a-9399-7db66f43b5aa-kube-api-access-lpn6c\") pod \"d67308b3-9336-448a-9399-7db66f43b5aa\" (UID: \"d67308b3-9336-448a-9399-7db66f43b5aa\") " Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.833239 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s485d\" (UniqueName: \"kubernetes.io/projected/705882a0-d0a4-490f-9377-d9b379c5a9ea-kube-api-access-s485d\") pod \"705882a0-d0a4-490f-9377-d9b379c5a9ea\" (UID: \"705882a0-d0a4-490f-9377-d9b379c5a9ea\") " Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.849486 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1240189-197f-4fc9-98a7-538ffdd522da" path="/var/lib/kubelet/pods/d1240189-197f-4fc9-98a7-538ffdd522da/volumes" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.851006 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc637b11-4e12-4a6a-a496-1a700d3756c1" path="/var/lib/kubelet/pods/fc637b11-4e12-4a6a-a496-1a700d3756c1/volumes" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 
22:09:15.852477 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/705882a0-d0a4-490f-9377-d9b379c5a9ea-kube-api-access-s485d" (OuterVolumeSpecName: "kube-api-access-s485d") pod "705882a0-d0a4-490f-9377-d9b379c5a9ea" (UID: "705882a0-d0a4-490f-9377-d9b379c5a9ea"). InnerVolumeSpecName "kube-api-access-s485d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.855321 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d67308b3-9336-448a-9399-7db66f43b5aa-kube-api-access-lpn6c" (OuterVolumeSpecName: "kube-api-access-lpn6c") pod "d67308b3-9336-448a-9399-7db66f43b5aa" (UID: "d67308b3-9336-448a-9399-7db66f43b5aa"). InnerVolumeSpecName "kube-api-access-lpn6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.939178 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s485d\" (UniqueName: \"kubernetes.io/projected/705882a0-d0a4-490f-9377-d9b379c5a9ea-kube-api-access-s485d\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.939205 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpn6c\" (UniqueName: \"kubernetes.io/projected/d67308b3-9336-448a-9399-7db66f43b5aa-kube-api-access-lpn6c\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.976920 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5pbsk" event={"ID":"705882a0-d0a4-490f-9377-d9b379c5a9ea","Type":"ContainerDied","Data":"1cb82bd629a1af8d923f69bd43c7df9d954939d49d6077e6045a399fade29479"} Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.976963 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cb82bd629a1af8d923f69bd43c7df9d954939d49d6077e6045a399fade29479" Oct 08 22:09:15 crc 
kubenswrapper[4739]: I1008 22:09:15.977009 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-5pbsk" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.985210 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vx5hx" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.985234 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vx5hx" event={"ID":"d67308b3-9336-448a-9399-7db66f43b5aa","Type":"ContainerDied","Data":"b30085966d8e0e20fb18187c1a8132ea6b999cd98987465779b039d0d29f3d59"} Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.985351 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b30085966d8e0e20fb18187c1a8132ea6b999cd98987465779b039d0d29f3d59" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.988135 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-89kb6" event={"ID":"442932da-356f-4a98-97a8-59ee1418fd24","Type":"ContainerDied","Data":"56d106e5997a15b6d9f8efc2ded2b34a83f60e2ef5dc14b9487c23f0e9e9b80b"} Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.988177 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56d106e5997a15b6d9f8efc2ded2b34a83f60e2ef5dc14b9487c23f0e9e9b80b" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.988185 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-89kb6" Oct 08 22:09:15 crc kubenswrapper[4739]: I1008 22:09:15.990630 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"66c23737-b27f-4ba2-9291-b2d0f3aa5020","Type":"ContainerStarted","Data":"43c9359bda7ecfee0ac2b5ced9c2408d830797cc278774492ac247d5c6c0d723"} Oct 08 22:09:16 crc kubenswrapper[4739]: I1008 22:09:16.260435 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 22:09:16 crc kubenswrapper[4739]: I1008 22:09:16.461707 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 22:09:16 crc kubenswrapper[4739]: I1008 22:09:16.549792 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d60573d4-e919-482c-aa2d-a46770b6c0ca-config-data\") pod \"d60573d4-e919-482c-aa2d-a46770b6c0ca\" (UID: \"d60573d4-e919-482c-aa2d-a46770b6c0ca\") " Oct 08 22:09:16 crc kubenswrapper[4739]: I1008 22:09:16.549928 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d60573d4-e919-482c-aa2d-a46770b6c0ca-scripts\") pod \"d60573d4-e919-482c-aa2d-a46770b6c0ca\" (UID: \"d60573d4-e919-482c-aa2d-a46770b6c0ca\") " Oct 08 22:09:16 crc kubenswrapper[4739]: I1008 22:09:16.550020 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d60573d4-e919-482c-aa2d-a46770b6c0ca-config-data-custom\") pod \"d60573d4-e919-482c-aa2d-a46770b6c0ca\" (UID: \"d60573d4-e919-482c-aa2d-a46770b6c0ca\") " Oct 08 22:09:16 crc kubenswrapper[4739]: I1008 22:09:16.550078 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/d60573d4-e919-482c-aa2d-a46770b6c0ca-etc-machine-id\") pod \"d60573d4-e919-482c-aa2d-a46770b6c0ca\" (UID: \"d60573d4-e919-482c-aa2d-a46770b6c0ca\") " Oct 08 22:09:16 crc kubenswrapper[4739]: I1008 22:09:16.550136 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5md2\" (UniqueName: \"kubernetes.io/projected/d60573d4-e919-482c-aa2d-a46770b6c0ca-kube-api-access-k5md2\") pod \"d60573d4-e919-482c-aa2d-a46770b6c0ca\" (UID: \"d60573d4-e919-482c-aa2d-a46770b6c0ca\") " Oct 08 22:09:16 crc kubenswrapper[4739]: I1008 22:09:16.550248 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d60573d4-e919-482c-aa2d-a46770b6c0ca-combined-ca-bundle\") pod \"d60573d4-e919-482c-aa2d-a46770b6c0ca\" (UID: \"d60573d4-e919-482c-aa2d-a46770b6c0ca\") " Oct 08 22:09:16 crc kubenswrapper[4739]: I1008 22:09:16.551653 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d60573d4-e919-482c-aa2d-a46770b6c0ca-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d60573d4-e919-482c-aa2d-a46770b6c0ca" (UID: "d60573d4-e919-482c-aa2d-a46770b6c0ca"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:09:16 crc kubenswrapper[4739]: I1008 22:09:16.562267 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d60573d4-e919-482c-aa2d-a46770b6c0ca-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d60573d4-e919-482c-aa2d-a46770b6c0ca" (UID: "d60573d4-e919-482c-aa2d-a46770b6c0ca"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:16 crc kubenswrapper[4739]: I1008 22:09:16.562476 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d60573d4-e919-482c-aa2d-a46770b6c0ca-kube-api-access-k5md2" (OuterVolumeSpecName: "kube-api-access-k5md2") pod "d60573d4-e919-482c-aa2d-a46770b6c0ca" (UID: "d60573d4-e919-482c-aa2d-a46770b6c0ca"). InnerVolumeSpecName "kube-api-access-k5md2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:09:16 crc kubenswrapper[4739]: I1008 22:09:16.567537 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d60573d4-e919-482c-aa2d-a46770b6c0ca-scripts" (OuterVolumeSpecName: "scripts") pod "d60573d4-e919-482c-aa2d-a46770b6c0ca" (UID: "d60573d4-e919-482c-aa2d-a46770b6c0ca"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:16 crc kubenswrapper[4739]: I1008 22:09:16.650310 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d60573d4-e919-482c-aa2d-a46770b6c0ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d60573d4-e919-482c-aa2d-a46770b6c0ca" (UID: "d60573d4-e919-482c-aa2d-a46770b6c0ca"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:16 crc kubenswrapper[4739]: I1008 22:09:16.652562 4739 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d60573d4-e919-482c-aa2d-a46770b6c0ca-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:16 crc kubenswrapper[4739]: I1008 22:09:16.652610 4739 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d60573d4-e919-482c-aa2d-a46770b6c0ca-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:16 crc kubenswrapper[4739]: I1008 22:09:16.652628 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5md2\" (UniqueName: \"kubernetes.io/projected/d60573d4-e919-482c-aa2d-a46770b6c0ca-kube-api-access-k5md2\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:16 crc kubenswrapper[4739]: I1008 22:09:16.652640 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d60573d4-e919-482c-aa2d-a46770b6c0ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:16 crc kubenswrapper[4739]: I1008 22:09:16.652648 4739 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d60573d4-e919-482c-aa2d-a46770b6c0ca-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:16 crc kubenswrapper[4739]: I1008 22:09:16.675000 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d60573d4-e919-482c-aa2d-a46770b6c0ca-config-data" (OuterVolumeSpecName: "config-data") pod "d60573d4-e919-482c-aa2d-a46770b6c0ca" (UID: "d60573d4-e919-482c-aa2d-a46770b6c0ca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:16 crc kubenswrapper[4739]: I1008 22:09:16.753862 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d60573d4-e919-482c-aa2d-a46770b6c0ca-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.022344 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a00e6724-633b-4d60-9781-206e078a6dca","Type":"ContainerStarted","Data":"ee1cd0fa9c4236f49d171e4e7d6f327f9d5ce14d481c3c4200c4e98dd65fae3a"} Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.030030 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"66c23737-b27f-4ba2-9291-b2d0f3aa5020","Type":"ContainerStarted","Data":"f7d5ebd5637339bdaded5d8df68fe01b3645564b5d6b5dfe63def278bf9c2b0f"} Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.039256 4739 generic.go:334] "Generic (PLEG): container finished" podID="d60573d4-e919-482c-aa2d-a46770b6c0ca" containerID="1b009cfc40f04e17020a23db626ab476ad3f2e130a46a92daa95313c53d149a2" exitCode=0 Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.039350 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d60573d4-e919-482c-aa2d-a46770b6c0ca","Type":"ContainerDied","Data":"1b009cfc40f04e17020a23db626ab476ad3f2e130a46a92daa95313c53d149a2"} Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.039379 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d60573d4-e919-482c-aa2d-a46770b6c0ca","Type":"ContainerDied","Data":"1baf9d660b576839912d35ac02ea2da9dabaccfaba258148211ef06958274904"} Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.039397 4739 scope.go:117] "RemoveContainer" containerID="485c6f3ec100fbef7b9e1a521ec37d35947837327d0ce9b6e81c8e2d67d5825f" Oct 08 22:09:17 
crc kubenswrapper[4739]: I1008 22:09:17.039351 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.047125 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ae8b784-8021-4f1d-9b31-d65dce42b007","Type":"ContainerStarted","Data":"f952ef4515c0ec913f611d0905d133559f4ead9a2bd7b4ba01973962630f8f03"} Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.047335 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2ae8b784-8021-4f1d-9b31-d65dce42b007" containerName="ceilometer-central-agent" containerID="cri-o://63c45ac6e32f2e9f889da3c2265a57da991b708a9c5ae653a2abfd15c061bb53" gracePeriod=30 Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.047458 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2ae8b784-8021-4f1d-9b31-d65dce42b007" containerName="sg-core" containerID="cri-o://b0ac310e5d85d6868254c2f560a6df4c7a3a1316d81e0a8ba7d907e7eb971577" gracePeriod=30 Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.047470 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2ae8b784-8021-4f1d-9b31-d65dce42b007" containerName="ceilometer-notification-agent" containerID="cri-o://d79f3d99f4be1d76d1d418791f7367c6f69cfe3aafc94a9a0a31c08172d03ff0" gracePeriod=30 Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.047525 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.047458 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2ae8b784-8021-4f1d-9b31-d65dce42b007" containerName="proxy-httpd" 
containerID="cri-o://f952ef4515c0ec913f611d0905d133559f4ead9a2bd7b4ba01973962630f8f03" gracePeriod=30 Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.074465 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.4082159340000002 podStartE2EDuration="15.074442401s" podCreationTimestamp="2025-10-08 22:09:02 +0000 UTC" firstStartedPulling="2025-10-08 22:09:03.49230477 +0000 UTC m=+1243.317690520" lastFinishedPulling="2025-10-08 22:09:16.158531237 +0000 UTC m=+1255.983916987" observedRunningTime="2025-10-08 22:09:17.070199826 +0000 UTC m=+1256.895585576" watchObservedRunningTime="2025-10-08 22:09:17.074442401 +0000 UTC m=+1256.899828151" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.075880 4739 scope.go:117] "RemoveContainer" containerID="1b009cfc40f04e17020a23db626ab476ad3f2e130a46a92daa95313c53d149a2" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.102263 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.119352 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.120519 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 22:09:17 crc kubenswrapper[4739]: E1008 22:09:17.120889 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d60573d4-e919-482c-aa2d-a46770b6c0ca" containerName="probe" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.120908 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="d60573d4-e919-482c-aa2d-a46770b6c0ca" containerName="probe" Oct 08 22:09:17 crc kubenswrapper[4739]: E1008 22:09:17.120924 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d67308b3-9336-448a-9399-7db66f43b5aa" containerName="mariadb-database-create" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 
22:09:17.120930 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="d67308b3-9336-448a-9399-7db66f43b5aa" containerName="mariadb-database-create" Oct 08 22:09:17 crc kubenswrapper[4739]: E1008 22:09:17.120950 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d60573d4-e919-482c-aa2d-a46770b6c0ca" containerName="cinder-scheduler" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.120956 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="d60573d4-e919-482c-aa2d-a46770b6c0ca" containerName="cinder-scheduler" Oct 08 22:09:17 crc kubenswrapper[4739]: E1008 22:09:17.120978 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="442932da-356f-4a98-97a8-59ee1418fd24" containerName="mariadb-database-create" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.120984 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="442932da-356f-4a98-97a8-59ee1418fd24" containerName="mariadb-database-create" Oct 08 22:09:17 crc kubenswrapper[4739]: E1008 22:09:17.120991 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="705882a0-d0a4-490f-9377-d9b379c5a9ea" containerName="mariadb-database-create" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.120996 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="705882a0-d0a4-490f-9377-d9b379c5a9ea" containerName="mariadb-database-create" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.121181 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="d60573d4-e919-482c-aa2d-a46770b6c0ca" containerName="cinder-scheduler" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.121200 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="442932da-356f-4a98-97a8-59ee1418fd24" containerName="mariadb-database-create" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.121212 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="d60573d4-e919-482c-aa2d-a46770b6c0ca" containerName="probe" Oct 08 
22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.121223 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="705882a0-d0a4-490f-9377-d9b379c5a9ea" containerName="mariadb-database-create" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.121234 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="d67308b3-9336-448a-9399-7db66f43b5aa" containerName="mariadb-database-create" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.123079 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.125308 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.128942 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.164330 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c5f9170-35c8-4e75-ba48-955a58e56e3f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2c5f9170-35c8-4e75-ba48-955a58e56e3f\") " pod="openstack/cinder-scheduler-0" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.164398 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c5f9170-35c8-4e75-ba48-955a58e56e3f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2c5f9170-35c8-4e75-ba48-955a58e56e3f\") " pod="openstack/cinder-scheduler-0" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.164417 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c5f9170-35c8-4e75-ba48-955a58e56e3f-etc-machine-id\") pod 
\"cinder-scheduler-0\" (UID: \"2c5f9170-35c8-4e75-ba48-955a58e56e3f\") " pod="openstack/cinder-scheduler-0" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.164431 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h6rb\" (UniqueName: \"kubernetes.io/projected/2c5f9170-35c8-4e75-ba48-955a58e56e3f-kube-api-access-2h6rb\") pod \"cinder-scheduler-0\" (UID: \"2c5f9170-35c8-4e75-ba48-955a58e56e3f\") " pod="openstack/cinder-scheduler-0" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.164454 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c5f9170-35c8-4e75-ba48-955a58e56e3f-config-data\") pod \"cinder-scheduler-0\" (UID: \"2c5f9170-35c8-4e75-ba48-955a58e56e3f\") " pod="openstack/cinder-scheduler-0" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.164480 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c5f9170-35c8-4e75-ba48-955a58e56e3f-scripts\") pod \"cinder-scheduler-0\" (UID: \"2c5f9170-35c8-4e75-ba48-955a58e56e3f\") " pod="openstack/cinder-scheduler-0" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.266233 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c5f9170-35c8-4e75-ba48-955a58e56e3f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2c5f9170-35c8-4e75-ba48-955a58e56e3f\") " pod="openstack/cinder-scheduler-0" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.266307 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c5f9170-35c8-4e75-ba48-955a58e56e3f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2c5f9170-35c8-4e75-ba48-955a58e56e3f\") " 
pod="openstack/cinder-scheduler-0" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.266329 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c5f9170-35c8-4e75-ba48-955a58e56e3f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2c5f9170-35c8-4e75-ba48-955a58e56e3f\") " pod="openstack/cinder-scheduler-0" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.266343 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h6rb\" (UniqueName: \"kubernetes.io/projected/2c5f9170-35c8-4e75-ba48-955a58e56e3f-kube-api-access-2h6rb\") pod \"cinder-scheduler-0\" (UID: \"2c5f9170-35c8-4e75-ba48-955a58e56e3f\") " pod="openstack/cinder-scheduler-0" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.266365 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c5f9170-35c8-4e75-ba48-955a58e56e3f-config-data\") pod \"cinder-scheduler-0\" (UID: \"2c5f9170-35c8-4e75-ba48-955a58e56e3f\") " pod="openstack/cinder-scheduler-0" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.266389 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c5f9170-35c8-4e75-ba48-955a58e56e3f-scripts\") pod \"cinder-scheduler-0\" (UID: \"2c5f9170-35c8-4e75-ba48-955a58e56e3f\") " pod="openstack/cinder-scheduler-0" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.266925 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c5f9170-35c8-4e75-ba48-955a58e56e3f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2c5f9170-35c8-4e75-ba48-955a58e56e3f\") " pod="openstack/cinder-scheduler-0" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.272973 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c5f9170-35c8-4e75-ba48-955a58e56e3f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2c5f9170-35c8-4e75-ba48-955a58e56e3f\") " pod="openstack/cinder-scheduler-0" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.273512 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c5f9170-35c8-4e75-ba48-955a58e56e3f-scripts\") pod \"cinder-scheduler-0\" (UID: \"2c5f9170-35c8-4e75-ba48-955a58e56e3f\") " pod="openstack/cinder-scheduler-0" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.276504 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c5f9170-35c8-4e75-ba48-955a58e56e3f-config-data\") pod \"cinder-scheduler-0\" (UID: \"2c5f9170-35c8-4e75-ba48-955a58e56e3f\") " pod="openstack/cinder-scheduler-0" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.276756 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c5f9170-35c8-4e75-ba48-955a58e56e3f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2c5f9170-35c8-4e75-ba48-955a58e56e3f\") " pod="openstack/cinder-scheduler-0" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.288719 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h6rb\" (UniqueName: \"kubernetes.io/projected/2c5f9170-35c8-4e75-ba48-955a58e56e3f-kube-api-access-2h6rb\") pod \"cinder-scheduler-0\" (UID: \"2c5f9170-35c8-4e75-ba48-955a58e56e3f\") " pod="openstack/cinder-scheduler-0" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.315474 4739 scope.go:117] "RemoveContainer" containerID="485c6f3ec100fbef7b9e1a521ec37d35947837327d0ce9b6e81c8e2d67d5825f" Oct 08 22:09:17 crc kubenswrapper[4739]: E1008 22:09:17.315809 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"485c6f3ec100fbef7b9e1a521ec37d35947837327d0ce9b6e81c8e2d67d5825f\": container with ID starting with 485c6f3ec100fbef7b9e1a521ec37d35947837327d0ce9b6e81c8e2d67d5825f not found: ID does not exist" containerID="485c6f3ec100fbef7b9e1a521ec37d35947837327d0ce9b6e81c8e2d67d5825f" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.315844 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"485c6f3ec100fbef7b9e1a521ec37d35947837327d0ce9b6e81c8e2d67d5825f"} err="failed to get container status \"485c6f3ec100fbef7b9e1a521ec37d35947837327d0ce9b6e81c8e2d67d5825f\": rpc error: code = NotFound desc = could not find container \"485c6f3ec100fbef7b9e1a521ec37d35947837327d0ce9b6e81c8e2d67d5825f\": container with ID starting with 485c6f3ec100fbef7b9e1a521ec37d35947837327d0ce9b6e81c8e2d67d5825f not found: ID does not exist" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.315869 4739 scope.go:117] "RemoveContainer" containerID="1b009cfc40f04e17020a23db626ab476ad3f2e130a46a92daa95313c53d149a2" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.316004 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 22:09:17 crc kubenswrapper[4739]: E1008 22:09:17.316123 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b009cfc40f04e17020a23db626ab476ad3f2e130a46a92daa95313c53d149a2\": container with ID starting with 1b009cfc40f04e17020a23db626ab476ad3f2e130a46a92daa95313c53d149a2 not found: ID does not exist" containerID="1b009cfc40f04e17020a23db626ab476ad3f2e130a46a92daa95313c53d149a2" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.316164 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b009cfc40f04e17020a23db626ab476ad3f2e130a46a92daa95313c53d149a2"} err="failed to get container status \"1b009cfc40f04e17020a23db626ab476ad3f2e130a46a92daa95313c53d149a2\": rpc error: code = NotFound desc = could not find container \"1b009cfc40f04e17020a23db626ab476ad3f2e130a46a92daa95313c53d149a2\": container with ID starting with 1b009cfc40f04e17020a23db626ab476ad3f2e130a46a92daa95313c53d149a2 not found: ID does not exist" Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.811882 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 22:09:17 crc kubenswrapper[4739]: I1008 22:09:17.837677 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d60573d4-e919-482c-aa2d-a46770b6c0ca" path="/var/lib/kubelet/pods/d60573d4-e919-482c-aa2d-a46770b6c0ca/volumes" Oct 08 22:09:18 crc kubenswrapper[4739]: I1008 22:09:18.068576 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2c5f9170-35c8-4e75-ba48-955a58e56e3f","Type":"ContainerStarted","Data":"c9d7c542d7d7e335ff32ea38b70688e54fa80e67976acecec75669f124b74edc"} Oct 08 22:09:18 crc kubenswrapper[4739]: I1008 22:09:18.072435 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"66c23737-b27f-4ba2-9291-b2d0f3aa5020","Type":"ContainerStarted","Data":"b20f06d467aa0899f06e66f97f45163e9d6c3cbf253d6a567555d946d54368c2"} Oct 08 22:09:18 crc kubenswrapper[4739]: I1008 22:09:18.082077 4739 generic.go:334] "Generic (PLEG): container finished" podID="2ae8b784-8021-4f1d-9b31-d65dce42b007" containerID="f952ef4515c0ec913f611d0905d133559f4ead9a2bd7b4ba01973962630f8f03" exitCode=0 Oct 08 22:09:18 crc kubenswrapper[4739]: I1008 22:09:18.082105 4739 generic.go:334] "Generic (PLEG): container finished" podID="2ae8b784-8021-4f1d-9b31-d65dce42b007" containerID="b0ac310e5d85d6868254c2f560a6df4c7a3a1316d81e0a8ba7d907e7eb971577" exitCode=2 Oct 08 22:09:18 crc kubenswrapper[4739]: I1008 22:09:18.082113 4739 generic.go:334] "Generic (PLEG): container finished" podID="2ae8b784-8021-4f1d-9b31-d65dce42b007" containerID="d79f3d99f4be1d76d1d418791f7367c6f69cfe3aafc94a9a0a31c08172d03ff0" exitCode=0 Oct 08 22:09:18 crc kubenswrapper[4739]: I1008 22:09:18.082163 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ae8b784-8021-4f1d-9b31-d65dce42b007","Type":"ContainerDied","Data":"f952ef4515c0ec913f611d0905d133559f4ead9a2bd7b4ba01973962630f8f03"} Oct 08 22:09:18 crc kubenswrapper[4739]: I1008 22:09:18.082184 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ae8b784-8021-4f1d-9b31-d65dce42b007","Type":"ContainerDied","Data":"b0ac310e5d85d6868254c2f560a6df4c7a3a1316d81e0a8ba7d907e7eb971577"} Oct 08 22:09:18 crc kubenswrapper[4739]: I1008 22:09:18.082197 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ae8b784-8021-4f1d-9b31-d65dce42b007","Type":"ContainerDied","Data":"d79f3d99f4be1d76d1d418791f7367c6f69cfe3aafc94a9a0a31c08172d03ff0"} Oct 08 22:09:18 crc kubenswrapper[4739]: I1008 22:09:18.085727 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"a00e6724-633b-4d60-9781-206e078a6dca","Type":"ContainerStarted","Data":"6e1064d3a131609a01673e3f1813952f792c2da6bbb94c423640bb111af6b684"} Oct 08 22:09:18 crc kubenswrapper[4739]: I1008 22:09:18.101743 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.101696126 podStartE2EDuration="4.101696126s" podCreationTimestamp="2025-10-08 22:09:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:09:18.094774976 +0000 UTC m=+1257.920160746" watchObservedRunningTime="2025-10-08 22:09:18.101696126 +0000 UTC m=+1257.927081876" Oct 08 22:09:19 crc kubenswrapper[4739]: I1008 22:09:19.102065 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2c5f9170-35c8-4e75-ba48-955a58e56e3f","Type":"ContainerStarted","Data":"cef90e03df391437f73b500113e0477e6b7880535b88d971f455f21cb229c60e"} Oct 08 22:09:19 crc kubenswrapper[4739]: I1008 22:09:19.103018 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2c5f9170-35c8-4e75-ba48-955a58e56e3f","Type":"ContainerStarted","Data":"7dea1d09222b96aef93fbdfe44e6953af6069347d94843095a680d83e5895516"} Oct 08 22:09:19 crc kubenswrapper[4739]: I1008 22:09:19.106714 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a00e6724-633b-4d60-9781-206e078a6dca","Type":"ContainerStarted","Data":"9f134c653d86c46ed297a3bc50f0a104beb91dfe5a7faab47db81c87d2901b26"} Oct 08 22:09:19 crc kubenswrapper[4739]: I1008 22:09:19.131440 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.131422522 podStartE2EDuration="2.131422522s" podCreationTimestamp="2025-10-08 22:09:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:09:19.12763288 +0000 UTC m=+1258.953018630" watchObservedRunningTime="2025-10-08 22:09:19.131422522 +0000 UTC m=+1258.956808262" Oct 08 22:09:19 crc kubenswrapper[4739]: I1008 22:09:19.153875 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.153853206 podStartE2EDuration="4.153853206s" podCreationTimestamp="2025-10-08 22:09:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:09:19.148974445 +0000 UTC m=+1258.974360195" watchObservedRunningTime="2025-10-08 22:09:19.153853206 +0000 UTC m=+1258.979238956" Oct 08 22:09:19 crc kubenswrapper[4739]: I1008 22:09:19.615808 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:09:19 crc kubenswrapper[4739]: I1008 22:09:19.715946 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ae8b784-8021-4f1d-9b31-d65dce42b007-config-data\") pod \"2ae8b784-8021-4f1d-9b31-d65dce42b007\" (UID: \"2ae8b784-8021-4f1d-9b31-d65dce42b007\") " Oct 08 22:09:19 crc kubenswrapper[4739]: I1008 22:09:19.716008 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ae8b784-8021-4f1d-9b31-d65dce42b007-run-httpd\") pod \"2ae8b784-8021-4f1d-9b31-d65dce42b007\" (UID: \"2ae8b784-8021-4f1d-9b31-d65dce42b007\") " Oct 08 22:09:19 crc kubenswrapper[4739]: I1008 22:09:19.716045 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ae8b784-8021-4f1d-9b31-d65dce42b007-log-httpd\") pod \"2ae8b784-8021-4f1d-9b31-d65dce42b007\" (UID: \"2ae8b784-8021-4f1d-9b31-d65dce42b007\") " Oct 
08 22:09:19 crc kubenswrapper[4739]: I1008 22:09:19.716092 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ae8b784-8021-4f1d-9b31-d65dce42b007-sg-core-conf-yaml\") pod \"2ae8b784-8021-4f1d-9b31-d65dce42b007\" (UID: \"2ae8b784-8021-4f1d-9b31-d65dce42b007\") " Oct 08 22:09:19 crc kubenswrapper[4739]: I1008 22:09:19.716111 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfvdl\" (UniqueName: \"kubernetes.io/projected/2ae8b784-8021-4f1d-9b31-d65dce42b007-kube-api-access-gfvdl\") pod \"2ae8b784-8021-4f1d-9b31-d65dce42b007\" (UID: \"2ae8b784-8021-4f1d-9b31-d65dce42b007\") " Oct 08 22:09:19 crc kubenswrapper[4739]: I1008 22:09:19.716139 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ae8b784-8021-4f1d-9b31-d65dce42b007-combined-ca-bundle\") pod \"2ae8b784-8021-4f1d-9b31-d65dce42b007\" (UID: \"2ae8b784-8021-4f1d-9b31-d65dce42b007\") " Oct 08 22:09:19 crc kubenswrapper[4739]: I1008 22:09:19.716285 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ae8b784-8021-4f1d-9b31-d65dce42b007-scripts\") pod \"2ae8b784-8021-4f1d-9b31-d65dce42b007\" (UID: \"2ae8b784-8021-4f1d-9b31-d65dce42b007\") " Oct 08 22:09:19 crc kubenswrapper[4739]: I1008 22:09:19.716770 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ae8b784-8021-4f1d-9b31-d65dce42b007-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2ae8b784-8021-4f1d-9b31-d65dce42b007" (UID: "2ae8b784-8021-4f1d-9b31-d65dce42b007"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:09:19 crc kubenswrapper[4739]: I1008 22:09:19.717354 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ae8b784-8021-4f1d-9b31-d65dce42b007-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2ae8b784-8021-4f1d-9b31-d65dce42b007" (UID: "2ae8b784-8021-4f1d-9b31-d65dce42b007"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:09:19 crc kubenswrapper[4739]: I1008 22:09:19.731966 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ae8b784-8021-4f1d-9b31-d65dce42b007-scripts" (OuterVolumeSpecName: "scripts") pod "2ae8b784-8021-4f1d-9b31-d65dce42b007" (UID: "2ae8b784-8021-4f1d-9b31-d65dce42b007"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:19 crc kubenswrapper[4739]: I1008 22:09:19.748305 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ae8b784-8021-4f1d-9b31-d65dce42b007-kube-api-access-gfvdl" (OuterVolumeSpecName: "kube-api-access-gfvdl") pod "2ae8b784-8021-4f1d-9b31-d65dce42b007" (UID: "2ae8b784-8021-4f1d-9b31-d65dce42b007"). InnerVolumeSpecName "kube-api-access-gfvdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:09:19 crc kubenswrapper[4739]: I1008 22:09:19.753235 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ae8b784-8021-4f1d-9b31-d65dce42b007-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2ae8b784-8021-4f1d-9b31-d65dce42b007" (UID: "2ae8b784-8021-4f1d-9b31-d65dce42b007"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:19 crc kubenswrapper[4739]: I1008 22:09:19.817665 4739 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ae8b784-8021-4f1d-9b31-d65dce42b007-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:19 crc kubenswrapper[4739]: I1008 22:09:19.817697 4739 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ae8b784-8021-4f1d-9b31-d65dce42b007-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:19 crc kubenswrapper[4739]: I1008 22:09:19.817705 4739 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ae8b784-8021-4f1d-9b31-d65dce42b007-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:19 crc kubenswrapper[4739]: I1008 22:09:19.817716 4739 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ae8b784-8021-4f1d-9b31-d65dce42b007-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:19 crc kubenswrapper[4739]: I1008 22:09:19.817725 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfvdl\" (UniqueName: \"kubernetes.io/projected/2ae8b784-8021-4f1d-9b31-d65dce42b007-kube-api-access-gfvdl\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:19 crc kubenswrapper[4739]: I1008 22:09:19.823455 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ae8b784-8021-4f1d-9b31-d65dce42b007-config-data" (OuterVolumeSpecName: "config-data") pod "2ae8b784-8021-4f1d-9b31-d65dce42b007" (UID: "2ae8b784-8021-4f1d-9b31-d65dce42b007"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:19 crc kubenswrapper[4739]: I1008 22:09:19.864601 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ae8b784-8021-4f1d-9b31-d65dce42b007-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ae8b784-8021-4f1d-9b31-d65dce42b007" (UID: "2ae8b784-8021-4f1d-9b31-d65dce42b007"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:19 crc kubenswrapper[4739]: I1008 22:09:19.918760 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ae8b784-8021-4f1d-9b31-d65dce42b007-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:19 crc kubenswrapper[4739]: I1008 22:09:19.918794 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ae8b784-8021-4f1d-9b31-d65dce42b007-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.135293 4739 generic.go:334] "Generic (PLEG): container finished" podID="2ae8b784-8021-4f1d-9b31-d65dce42b007" containerID="63c45ac6e32f2e9f889da3c2265a57da991b708a9c5ae653a2abfd15c061bb53" exitCode=0 Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.135324 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.135328 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ae8b784-8021-4f1d-9b31-d65dce42b007","Type":"ContainerDied","Data":"63c45ac6e32f2e9f889da3c2265a57da991b708a9c5ae653a2abfd15c061bb53"} Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.135377 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ae8b784-8021-4f1d-9b31-d65dce42b007","Type":"ContainerDied","Data":"f1cf660d6b01df0d9e0186a5a44ab48ae6f3d6033e66f0e4caeb930d9185c88b"} Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.135394 4739 scope.go:117] "RemoveContainer" containerID="f952ef4515c0ec913f611d0905d133559f4ead9a2bd7b4ba01973962630f8f03" Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.175090 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.176126 4739 scope.go:117] "RemoveContainer" containerID="b0ac310e5d85d6868254c2f560a6df4c7a3a1316d81e0a8ba7d907e7eb971577" Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.178956 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.190925 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:09:20 crc kubenswrapper[4739]: E1008 22:09:20.191294 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ae8b784-8021-4f1d-9b31-d65dce42b007" containerName="ceilometer-central-agent" Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.191309 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ae8b784-8021-4f1d-9b31-d65dce42b007" containerName="ceilometer-central-agent" Oct 08 22:09:20 crc kubenswrapper[4739]: E1008 22:09:20.191331 4739 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2ae8b784-8021-4f1d-9b31-d65dce42b007" containerName="sg-core"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.191337 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ae8b784-8021-4f1d-9b31-d65dce42b007" containerName="sg-core"
Oct 08 22:09:20 crc kubenswrapper[4739]: E1008 22:09:20.191359 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ae8b784-8021-4f1d-9b31-d65dce42b007" containerName="ceilometer-notification-agent"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.191366 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ae8b784-8021-4f1d-9b31-d65dce42b007" containerName="ceilometer-notification-agent"
Oct 08 22:09:20 crc kubenswrapper[4739]: E1008 22:09:20.191374 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ae8b784-8021-4f1d-9b31-d65dce42b007" containerName="proxy-httpd"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.191380 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ae8b784-8021-4f1d-9b31-d65dce42b007" containerName="proxy-httpd"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.191537 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ae8b784-8021-4f1d-9b31-d65dce42b007" containerName="sg-core"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.191552 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ae8b784-8021-4f1d-9b31-d65dce42b007" containerName="proxy-httpd"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.191562 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ae8b784-8021-4f1d-9b31-d65dce42b007" containerName="ceilometer-notification-agent"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.191575 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ae8b784-8021-4f1d-9b31-d65dce42b007" containerName="ceilometer-central-agent"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.193063 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.194828 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.195659 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.200538 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.207253 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.226353 4739 scope.go:117] "RemoveContainer" containerID="d79f3d99f4be1d76d1d418791f7367c6f69cfe3aafc94a9a0a31c08172d03ff0"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.285569 4739 scope.go:117] "RemoveContainer" containerID="63c45ac6e32f2e9f889da3c2265a57da991b708a9c5ae653a2abfd15c061bb53"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.313556 4739 scope.go:117] "RemoveContainer" containerID="f952ef4515c0ec913f611d0905d133559f4ead9a2bd7b4ba01973962630f8f03"
Oct 08 22:09:20 crc kubenswrapper[4739]: E1008 22:09:20.314006 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f952ef4515c0ec913f611d0905d133559f4ead9a2bd7b4ba01973962630f8f03\": container with ID starting with f952ef4515c0ec913f611d0905d133559f4ead9a2bd7b4ba01973962630f8f03 not found: ID does not exist" containerID="f952ef4515c0ec913f611d0905d133559f4ead9a2bd7b4ba01973962630f8f03"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.314051 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f952ef4515c0ec913f611d0905d133559f4ead9a2bd7b4ba01973962630f8f03"} err="failed to get container status \"f952ef4515c0ec913f611d0905d133559f4ead9a2bd7b4ba01973962630f8f03\": rpc error: code = NotFound desc = could not find container \"f952ef4515c0ec913f611d0905d133559f4ead9a2bd7b4ba01973962630f8f03\": container with ID starting with f952ef4515c0ec913f611d0905d133559f4ead9a2bd7b4ba01973962630f8f03 not found: ID does not exist"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.314079 4739 scope.go:117] "RemoveContainer" containerID="b0ac310e5d85d6868254c2f560a6df4c7a3a1316d81e0a8ba7d907e7eb971577"
Oct 08 22:09:20 crc kubenswrapper[4739]: E1008 22:09:20.314396 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0ac310e5d85d6868254c2f560a6df4c7a3a1316d81e0a8ba7d907e7eb971577\": container with ID starting with b0ac310e5d85d6868254c2f560a6df4c7a3a1316d81e0a8ba7d907e7eb971577 not found: ID does not exist" containerID="b0ac310e5d85d6868254c2f560a6df4c7a3a1316d81e0a8ba7d907e7eb971577"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.314421 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0ac310e5d85d6868254c2f560a6df4c7a3a1316d81e0a8ba7d907e7eb971577"} err="failed to get container status \"b0ac310e5d85d6868254c2f560a6df4c7a3a1316d81e0a8ba7d907e7eb971577\": rpc error: code = NotFound desc = could not find container \"b0ac310e5d85d6868254c2f560a6df4c7a3a1316d81e0a8ba7d907e7eb971577\": container with ID starting with b0ac310e5d85d6868254c2f560a6df4c7a3a1316d81e0a8ba7d907e7eb971577 not found: ID does not exist"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.314434 4739 scope.go:117] "RemoveContainer" containerID="d79f3d99f4be1d76d1d418791f7367c6f69cfe3aafc94a9a0a31c08172d03ff0"
Oct 08 22:09:20 crc kubenswrapper[4739]: E1008 22:09:20.314706 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d79f3d99f4be1d76d1d418791f7367c6f69cfe3aafc94a9a0a31c08172d03ff0\": container with ID starting with d79f3d99f4be1d76d1d418791f7367c6f69cfe3aafc94a9a0a31c08172d03ff0 not found: ID does not exist" containerID="d79f3d99f4be1d76d1d418791f7367c6f69cfe3aafc94a9a0a31c08172d03ff0"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.314724 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d79f3d99f4be1d76d1d418791f7367c6f69cfe3aafc94a9a0a31c08172d03ff0"} err="failed to get container status \"d79f3d99f4be1d76d1d418791f7367c6f69cfe3aafc94a9a0a31c08172d03ff0\": rpc error: code = NotFound desc = could not find container \"d79f3d99f4be1d76d1d418791f7367c6f69cfe3aafc94a9a0a31c08172d03ff0\": container with ID starting with d79f3d99f4be1d76d1d418791f7367c6f69cfe3aafc94a9a0a31c08172d03ff0 not found: ID does not exist"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.314735 4739 scope.go:117] "RemoveContainer" containerID="63c45ac6e32f2e9f889da3c2265a57da991b708a9c5ae653a2abfd15c061bb53"
Oct 08 22:09:20 crc kubenswrapper[4739]: E1008 22:09:20.314953 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63c45ac6e32f2e9f889da3c2265a57da991b708a9c5ae653a2abfd15c061bb53\": container with ID starting with 63c45ac6e32f2e9f889da3c2265a57da991b708a9c5ae653a2abfd15c061bb53 not found: ID does not exist" containerID="63c45ac6e32f2e9f889da3c2265a57da991b708a9c5ae653a2abfd15c061bb53"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.314974 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63c45ac6e32f2e9f889da3c2265a57da991b708a9c5ae653a2abfd15c061bb53"} err="failed to get container status \"63c45ac6e32f2e9f889da3c2265a57da991b708a9c5ae653a2abfd15c061bb53\": rpc error: code = NotFound desc = could not find container \"63c45ac6e32f2e9f889da3c2265a57da991b708a9c5ae653a2abfd15c061bb53\": container with ID starting with 63c45ac6e32f2e9f889da3c2265a57da991b708a9c5ae653a2abfd15c061bb53 not found: ID does not exist"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.328128 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c56b1bf5-c307-4cfc-9d75-af27f8a34537-run-httpd\") pod \"ceilometer-0\" (UID: \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\") " pod="openstack/ceilometer-0"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.333482 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c56b1bf5-c307-4cfc-9d75-af27f8a34537-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\") " pod="openstack/ceilometer-0"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.333559 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c56b1bf5-c307-4cfc-9d75-af27f8a34537-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\") " pod="openstack/ceilometer-0"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.333615 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9flp\" (UniqueName: \"kubernetes.io/projected/c56b1bf5-c307-4cfc-9d75-af27f8a34537-kube-api-access-f9flp\") pod \"ceilometer-0\" (UID: \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\") " pod="openstack/ceilometer-0"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.333761 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c56b1bf5-c307-4cfc-9d75-af27f8a34537-scripts\") pod \"ceilometer-0\" (UID: \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\") " pod="openstack/ceilometer-0"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.333904 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c56b1bf5-c307-4cfc-9d75-af27f8a34537-log-httpd\") pod \"ceilometer-0\" (UID: \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\") " pod="openstack/ceilometer-0"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.333930 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c56b1bf5-c307-4cfc-9d75-af27f8a34537-config-data\") pod \"ceilometer-0\" (UID: \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\") " pod="openstack/ceilometer-0"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.436055 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c56b1bf5-c307-4cfc-9d75-af27f8a34537-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\") " pod="openstack/ceilometer-0"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.436109 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c56b1bf5-c307-4cfc-9d75-af27f8a34537-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\") " pod="openstack/ceilometer-0"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.436294 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9flp\" (UniqueName: \"kubernetes.io/projected/c56b1bf5-c307-4cfc-9d75-af27f8a34537-kube-api-access-f9flp\") pod \"ceilometer-0\" (UID: \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\") " pod="openstack/ceilometer-0"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.436350 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c56b1bf5-c307-4cfc-9d75-af27f8a34537-scripts\") pod \"ceilometer-0\" (UID: \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\") " pod="openstack/ceilometer-0"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.436399 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c56b1bf5-c307-4cfc-9d75-af27f8a34537-log-httpd\") pod \"ceilometer-0\" (UID: \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\") " pod="openstack/ceilometer-0"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.436416 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c56b1bf5-c307-4cfc-9d75-af27f8a34537-config-data\") pod \"ceilometer-0\" (UID: \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\") " pod="openstack/ceilometer-0"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.436474 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c56b1bf5-c307-4cfc-9d75-af27f8a34537-run-httpd\") pod \"ceilometer-0\" (UID: \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\") " pod="openstack/ceilometer-0"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.436894 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c56b1bf5-c307-4cfc-9d75-af27f8a34537-run-httpd\") pod \"ceilometer-0\" (UID: \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\") " pod="openstack/ceilometer-0"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.436988 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c56b1bf5-c307-4cfc-9d75-af27f8a34537-log-httpd\") pod \"ceilometer-0\" (UID: \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\") " pod="openstack/ceilometer-0"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.444550 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c56b1bf5-c307-4cfc-9d75-af27f8a34537-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\") " pod="openstack/ceilometer-0"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.444724 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c56b1bf5-c307-4cfc-9d75-af27f8a34537-scripts\") pod \"ceilometer-0\" (UID: \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\") " pod="openstack/ceilometer-0"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.445208 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c56b1bf5-c307-4cfc-9d75-af27f8a34537-config-data\") pod \"ceilometer-0\" (UID: \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\") " pod="openstack/ceilometer-0"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.455138 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c56b1bf5-c307-4cfc-9d75-af27f8a34537-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\") " pod="openstack/ceilometer-0"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.455328 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9flp\" (UniqueName: \"kubernetes.io/projected/c56b1bf5-c307-4cfc-9d75-af27f8a34537-kube-api-access-f9flp\") pod \"ceilometer-0\" (UID: \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\") " pod="openstack/ceilometer-0"
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.514732 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 08 22:09:20 crc kubenswrapper[4739]: W1008 22:09:20.987628 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc56b1bf5_c307_4cfc_9d75_af27f8a34537.slice/crio-8892c3893164f7c71fe9bd87371bc7330d396d31d8b76a4f1255b8fd22f9f652 WatchSource:0}: Error finding container 8892c3893164f7c71fe9bd87371bc7330d396d31d8b76a4f1255b8fd22f9f652: Status 404 returned error can't find the container with id 8892c3893164f7c71fe9bd87371bc7330d396d31d8b76a4f1255b8fd22f9f652
Oct 08 22:09:20 crc kubenswrapper[4739]: I1008 22:09:20.992037 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 08 22:09:21 crc kubenswrapper[4739]: I1008 22:09:21.147286 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c56b1bf5-c307-4cfc-9d75-af27f8a34537","Type":"ContainerStarted","Data":"8892c3893164f7c71fe9bd87371bc7330d396d31d8b76a4f1255b8fd22f9f652"}
Oct 08 22:09:21 crc kubenswrapper[4739]: I1008 22:09:21.766833 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 08 22:09:21 crc kubenswrapper[4739]: I1008 22:09:21.767322 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 08 22:09:21 crc kubenswrapper[4739]: I1008 22:09:21.767372 4739 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2"
Oct 08 22:09:21 crc kubenswrapper[4739]: I1008 22:09:21.768264 4739 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f263d906c5336884f5cafca08187af555d27f85843b3fe64b88ee6f01fe93ba9"} pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 08 22:09:21 crc kubenswrapper[4739]: I1008 22:09:21.768341 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" containerID="cri-o://f263d906c5336884f5cafca08187af555d27f85843b3fe64b88ee6f01fe93ba9" gracePeriod=600
Oct 08 22:09:21 crc kubenswrapper[4739]: I1008 22:09:21.840443 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ae8b784-8021-4f1d-9b31-d65dce42b007" path="/var/lib/kubelet/pods/2ae8b784-8021-4f1d-9b31-d65dce42b007/volumes"
Oct 08 22:09:22 crc kubenswrapper[4739]: I1008 22:09:22.160830 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c56b1bf5-c307-4cfc-9d75-af27f8a34537","Type":"ContainerStarted","Data":"4ed9b4a2f2a04df8a35cc4669fbeb81975d5ec2993943455bacadcb385e68c7b"}
Oct 08 22:09:22 crc kubenswrapper[4739]: I1008 22:09:22.165227 4739 generic.go:334] "Generic (PLEG): container finished" podID="9707b708-016c-4e06-86db-0332e2ca37db" containerID="f263d906c5336884f5cafca08187af555d27f85843b3fe64b88ee6f01fe93ba9" exitCode=0
Oct 08 22:09:22 crc kubenswrapper[4739]: I1008 22:09:22.165267 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerDied","Data":"f263d906c5336884f5cafca08187af555d27f85843b3fe64b88ee6f01fe93ba9"}
Oct 08 22:09:22 crc kubenswrapper[4739]: I1008 22:09:22.165321 4739 scope.go:117] "RemoveContainer" containerID="d72a751240c9e64050ad684bd757cf43d33579885b0db0ae42dad5cf5bb4da84"
Oct 08 22:09:22 crc kubenswrapper[4739]: I1008 22:09:22.316903 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Oct 08 22:09:23 crc kubenswrapper[4739]: I1008 22:09:23.197035 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerStarted","Data":"f50fee8537e3d72c7912a6fb5efc59ba4c94366883a0b151d7314411b277cabf"}
Oct 08 22:09:23 crc kubenswrapper[4739]: I1008 22:09:23.556167 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 08 22:09:24 crc kubenswrapper[4739]: I1008 22:09:24.213397 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c56b1bf5-c307-4cfc-9d75-af27f8a34537","Type":"ContainerStarted","Data":"912bd49712e469aacf158579604cc56bee2facb9f5265068f50d782d4a146fdd"}
Oct 08 22:09:24 crc kubenswrapper[4739]: I1008 22:09:24.585778 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Oct 08 22:09:24 crc kubenswrapper[4739]: I1008 22:09:24.585843 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Oct 08 22:09:24 crc kubenswrapper[4739]: I1008 22:09:24.634433 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Oct 08 22:09:24 crc kubenswrapper[4739]: I1008 22:09:24.648073 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Oct 08 22:09:25 crc kubenswrapper[4739]: I1008 22:09:25.240792 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c56b1bf5-c307-4cfc-9d75-af27f8a34537","Type":"ContainerStarted","Data":"75d89f1111b4e1f077a814640e6d67dc29602cb8b9e29c9dd06567ea80802097"}
Oct 08 22:09:25 crc kubenswrapper[4739]: I1008 22:09:25.241373 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Oct 08 22:09:25 crc kubenswrapper[4739]: I1008 22:09:25.241971 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Oct 08 22:09:25 crc kubenswrapper[4739]: I1008 22:09:25.441409 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Oct 08 22:09:25 crc kubenswrapper[4739]: I1008 22:09:25.441458 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Oct 08 22:09:25 crc kubenswrapper[4739]: I1008 22:09:25.499096 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Oct 08 22:09:25 crc kubenswrapper[4739]: I1008 22:09:25.515102 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Oct 08 22:09:26 crc kubenswrapper[4739]: I1008 22:09:26.251575 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Oct 08 22:09:26 crc kubenswrapper[4739]: I1008 22:09:26.252060 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Oct 08 22:09:27 crc kubenswrapper[4739]: I1008 22:09:27.118964 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Oct 08 22:09:27 crc kubenswrapper[4739]: I1008 22:09:27.119394 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Oct 08 22:09:27 crc kubenswrapper[4739]: I1008 22:09:27.263315 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c56b1bf5-c307-4cfc-9d75-af27f8a34537","Type":"ContainerStarted","Data":"d2bfc4e4ee0ec794257969c61ee67faa297515e9f732290ecd836d9ba51abb63"}
Oct 08 22:09:27 crc kubenswrapper[4739]: I1008 22:09:27.264325 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c56b1bf5-c307-4cfc-9d75-af27f8a34537" containerName="ceilometer-central-agent" containerID="cri-o://4ed9b4a2f2a04df8a35cc4669fbeb81975d5ec2993943455bacadcb385e68c7b" gracePeriod=30
Oct 08 22:09:27 crc kubenswrapper[4739]: I1008 22:09:27.264388 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c56b1bf5-c307-4cfc-9d75-af27f8a34537" containerName="proxy-httpd" containerID="cri-o://d2bfc4e4ee0ec794257969c61ee67faa297515e9f732290ecd836d9ba51abb63" gracePeriod=30
Oct 08 22:09:27 crc kubenswrapper[4739]: I1008 22:09:27.264432 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c56b1bf5-c307-4cfc-9d75-af27f8a34537" containerName="ceilometer-notification-agent" containerID="cri-o://912bd49712e469aacf158579604cc56bee2facb9f5265068f50d782d4a146fdd" gracePeriod=30
Oct 08 22:09:27 crc kubenswrapper[4739]: I1008 22:09:27.264420 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c56b1bf5-c307-4cfc-9d75-af27f8a34537" containerName="sg-core" containerID="cri-o://75d89f1111b4e1f077a814640e6d67dc29602cb8b9e29c9dd06567ea80802097" gracePeriod=30
Oct 08 22:09:27 crc kubenswrapper[4739]: I1008 22:09:27.264476 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 08 22:09:27 crc kubenswrapper[4739]: I1008 22:09:27.306499 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.779573542 podStartE2EDuration="7.306480118s" podCreationTimestamp="2025-10-08 22:09:20 +0000 UTC" firstStartedPulling="2025-10-08 22:09:20.990202124 +0000 UTC m=+1260.815587874" lastFinishedPulling="2025-10-08 22:09:26.5171087 +0000 UTC m=+1266.342494450" observedRunningTime="2025-10-08 22:09:27.302799518 +0000 UTC m=+1267.128185268" watchObservedRunningTime="2025-10-08 22:09:27.306480118 +0000 UTC m=+1267.131865868"
Oct 08 22:09:27 crc kubenswrapper[4739]: I1008 22:09:27.547816 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.049401 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.194798 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c56b1bf5-c307-4cfc-9d75-af27f8a34537-sg-core-conf-yaml\") pod \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\" (UID: \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\") "
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.194886 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9flp\" (UniqueName: \"kubernetes.io/projected/c56b1bf5-c307-4cfc-9d75-af27f8a34537-kube-api-access-f9flp\") pod \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\" (UID: \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\") "
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.194946 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c56b1bf5-c307-4cfc-9d75-af27f8a34537-config-data\") pod \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\" (UID: \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\") "
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.194991 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c56b1bf5-c307-4cfc-9d75-af27f8a34537-combined-ca-bundle\") pod \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\" (UID: \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\") "
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.195101 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c56b1bf5-c307-4cfc-9d75-af27f8a34537-log-httpd\") pod \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\" (UID: \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\") "
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.195199 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c56b1bf5-c307-4cfc-9d75-af27f8a34537-scripts\") pod \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\" (UID: \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\") "
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.195614 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c56b1bf5-c307-4cfc-9d75-af27f8a34537-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c56b1bf5-c307-4cfc-9d75-af27f8a34537" (UID: "c56b1bf5-c307-4cfc-9d75-af27f8a34537"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.195650 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c56b1bf5-c307-4cfc-9d75-af27f8a34537-run-httpd\") pod \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\" (UID: \"c56b1bf5-c307-4cfc-9d75-af27f8a34537\") "
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.196023 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c56b1bf5-c307-4cfc-9d75-af27f8a34537-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c56b1bf5-c307-4cfc-9d75-af27f8a34537" (UID: "c56b1bf5-c307-4cfc-9d75-af27f8a34537"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.196788 4739 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c56b1bf5-c307-4cfc-9d75-af27f8a34537-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.196805 4739 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c56b1bf5-c307-4cfc-9d75-af27f8a34537-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.210595 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c56b1bf5-c307-4cfc-9d75-af27f8a34537-scripts" (OuterVolumeSpecName: "scripts") pod "c56b1bf5-c307-4cfc-9d75-af27f8a34537" (UID: "c56b1bf5-c307-4cfc-9d75-af27f8a34537"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.211630 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c56b1bf5-c307-4cfc-9d75-af27f8a34537-kube-api-access-f9flp" (OuterVolumeSpecName: "kube-api-access-f9flp") pod "c56b1bf5-c307-4cfc-9d75-af27f8a34537" (UID: "c56b1bf5-c307-4cfc-9d75-af27f8a34537"). InnerVolumeSpecName "kube-api-access-f9flp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.233449 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c56b1bf5-c307-4cfc-9d75-af27f8a34537-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c56b1bf5-c307-4cfc-9d75-af27f8a34537" (UID: "c56b1bf5-c307-4cfc-9d75-af27f8a34537"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.277389 4739 generic.go:334] "Generic (PLEG): container finished" podID="c56b1bf5-c307-4cfc-9d75-af27f8a34537" containerID="d2bfc4e4ee0ec794257969c61ee67faa297515e9f732290ecd836d9ba51abb63" exitCode=0
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.277426 4739 generic.go:334] "Generic (PLEG): container finished" podID="c56b1bf5-c307-4cfc-9d75-af27f8a34537" containerID="75d89f1111b4e1f077a814640e6d67dc29602cb8b9e29c9dd06567ea80802097" exitCode=2
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.277434 4739 generic.go:334] "Generic (PLEG): container finished" podID="c56b1bf5-c307-4cfc-9d75-af27f8a34537" containerID="912bd49712e469aacf158579604cc56bee2facb9f5265068f50d782d4a146fdd" exitCode=0
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.277441 4739 generic.go:334] "Generic (PLEG): container finished" podID="c56b1bf5-c307-4cfc-9d75-af27f8a34537" containerID="4ed9b4a2f2a04df8a35cc4669fbeb81975d5ec2993943455bacadcb385e68c7b" exitCode=0
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.277446 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.277501 4739 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.277511 4739 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.277473 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c56b1bf5-c307-4cfc-9d75-af27f8a34537","Type":"ContainerDied","Data":"d2bfc4e4ee0ec794257969c61ee67faa297515e9f732290ecd836d9ba51abb63"}
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.277596 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c56b1bf5-c307-4cfc-9d75-af27f8a34537","Type":"ContainerDied","Data":"75d89f1111b4e1f077a814640e6d67dc29602cb8b9e29c9dd06567ea80802097"}
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.277609 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c56b1bf5-c307-4cfc-9d75-af27f8a34537","Type":"ContainerDied","Data":"912bd49712e469aacf158579604cc56bee2facb9f5265068f50d782d4a146fdd"}
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.277617 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c56b1bf5-c307-4cfc-9d75-af27f8a34537","Type":"ContainerDied","Data":"4ed9b4a2f2a04df8a35cc4669fbeb81975d5ec2993943455bacadcb385e68c7b"}
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.277628 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c56b1bf5-c307-4cfc-9d75-af27f8a34537","Type":"ContainerDied","Data":"8892c3893164f7c71fe9bd87371bc7330d396d31d8b76a4f1255b8fd22f9f652"}
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.277645 4739 scope.go:117] "RemoveContainer" containerID="d2bfc4e4ee0ec794257969c61ee67faa297515e9f732290ecd836d9ba51abb63"
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.286792 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c56b1bf5-c307-4cfc-9d75-af27f8a34537-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c56b1bf5-c307-4cfc-9d75-af27f8a34537" (UID: "c56b1bf5-c307-4cfc-9d75-af27f8a34537"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.298941 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9flp\" (UniqueName: \"kubernetes.io/projected/c56b1bf5-c307-4cfc-9d75-af27f8a34537-kube-api-access-f9flp\") on node \"crc\" DevicePath \"\""
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.298966 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c56b1bf5-c307-4cfc-9d75-af27f8a34537-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.298992 4739 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c56b1bf5-c307-4cfc-9d75-af27f8a34537-scripts\") on node \"crc\" DevicePath \"\""
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.299002 4739 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c56b1bf5-c307-4cfc-9d75-af27f8a34537-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.308404 4739 scope.go:117] "RemoveContainer" containerID="75d89f1111b4e1f077a814640e6d67dc29602cb8b9e29c9dd06567ea80802097"
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.329876 4739 scope.go:117] "RemoveContainer" containerID="912bd49712e469aacf158579604cc56bee2facb9f5265068f50d782d4a146fdd"
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.332428 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c56b1bf5-c307-4cfc-9d75-af27f8a34537-config-data" (OuterVolumeSpecName: "config-data") pod "c56b1bf5-c307-4cfc-9d75-af27f8a34537" (UID: "c56b1bf5-c307-4cfc-9d75-af27f8a34537"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.366181 4739 scope.go:117] "RemoveContainer" containerID="4ed9b4a2f2a04df8a35cc4669fbeb81975d5ec2993943455bacadcb385e68c7b"
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.397605 4739 scope.go:117] "RemoveContainer" containerID="d2bfc4e4ee0ec794257969c61ee67faa297515e9f732290ecd836d9ba51abb63"
Oct 08 22:09:28 crc kubenswrapper[4739]: E1008 22:09:28.398439 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2bfc4e4ee0ec794257969c61ee67faa297515e9f732290ecd836d9ba51abb63\": container with ID starting with d2bfc4e4ee0ec794257969c61ee67faa297515e9f732290ecd836d9ba51abb63 not found: ID does not exist" containerID="d2bfc4e4ee0ec794257969c61ee67faa297515e9f732290ecd836d9ba51abb63"
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.398494 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2bfc4e4ee0ec794257969c61ee67faa297515e9f732290ecd836d9ba51abb63"} err="failed to get container status \"d2bfc4e4ee0ec794257969c61ee67faa297515e9f732290ecd836d9ba51abb63\": rpc error: code = NotFound desc = could not find container \"d2bfc4e4ee0ec794257969c61ee67faa297515e9f732290ecd836d9ba51abb63\": container with ID starting with d2bfc4e4ee0ec794257969c61ee67faa297515e9f732290ecd836d9ba51abb63 not found: ID does not exist"
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.398526 4739 scope.go:117] "RemoveContainer" containerID="75d89f1111b4e1f077a814640e6d67dc29602cb8b9e29c9dd06567ea80802097"
Oct 08 22:09:28 crc kubenswrapper[4739]: E1008 22:09:28.398983 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75d89f1111b4e1f077a814640e6d67dc29602cb8b9e29c9dd06567ea80802097\": container with ID starting with 75d89f1111b4e1f077a814640e6d67dc29602cb8b9e29c9dd06567ea80802097 not found: ID does not exist" containerID="75d89f1111b4e1f077a814640e6d67dc29602cb8b9e29c9dd06567ea80802097"
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.399020 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75d89f1111b4e1f077a814640e6d67dc29602cb8b9e29c9dd06567ea80802097"} err="failed to get container status \"75d89f1111b4e1f077a814640e6d67dc29602cb8b9e29c9dd06567ea80802097\": rpc error: code = NotFound desc = could not find container \"75d89f1111b4e1f077a814640e6d67dc29602cb8b9e29c9dd06567ea80802097\": container with ID starting with 75d89f1111b4e1f077a814640e6d67dc29602cb8b9e29c9dd06567ea80802097 not found: ID does not exist"
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.399045 4739 scope.go:117] "RemoveContainer" containerID="912bd49712e469aacf158579604cc56bee2facb9f5265068f50d782d4a146fdd"
Oct 08 22:09:28 crc kubenswrapper[4739]: E1008 22:09:28.399564 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"912bd49712e469aacf158579604cc56bee2facb9f5265068f50d782d4a146fdd\": container with ID starting with 912bd49712e469aacf158579604cc56bee2facb9f5265068f50d782d4a146fdd not found: ID does not exist" containerID="912bd49712e469aacf158579604cc56bee2facb9f5265068f50d782d4a146fdd"
Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.399627 4739 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"912bd49712e469aacf158579604cc56bee2facb9f5265068f50d782d4a146fdd"} err="failed to get container status \"912bd49712e469aacf158579604cc56bee2facb9f5265068f50d782d4a146fdd\": rpc error: code = NotFound desc = could not find container \"912bd49712e469aacf158579604cc56bee2facb9f5265068f50d782d4a146fdd\": container with ID starting with 912bd49712e469aacf158579604cc56bee2facb9f5265068f50d782d4a146fdd not found: ID does not exist" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.399662 4739 scope.go:117] "RemoveContainer" containerID="4ed9b4a2f2a04df8a35cc4669fbeb81975d5ec2993943455bacadcb385e68c7b" Oct 08 22:09:28 crc kubenswrapper[4739]: E1008 22:09:28.400235 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ed9b4a2f2a04df8a35cc4669fbeb81975d5ec2993943455bacadcb385e68c7b\": container with ID starting with 4ed9b4a2f2a04df8a35cc4669fbeb81975d5ec2993943455bacadcb385e68c7b not found: ID does not exist" containerID="4ed9b4a2f2a04df8a35cc4669fbeb81975d5ec2993943455bacadcb385e68c7b" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.400266 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ed9b4a2f2a04df8a35cc4669fbeb81975d5ec2993943455bacadcb385e68c7b"} err="failed to get container status \"4ed9b4a2f2a04df8a35cc4669fbeb81975d5ec2993943455bacadcb385e68c7b\": rpc error: code = NotFound desc = could not find container \"4ed9b4a2f2a04df8a35cc4669fbeb81975d5ec2993943455bacadcb385e68c7b\": container with ID starting with 4ed9b4a2f2a04df8a35cc4669fbeb81975d5ec2993943455bacadcb385e68c7b not found: ID does not exist" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.400283 4739 scope.go:117] "RemoveContainer" containerID="d2bfc4e4ee0ec794257969c61ee67faa297515e9f732290ecd836d9ba51abb63" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.400687 4739 reconciler_common.go:293] "Volume detached for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c56b1bf5-c307-4cfc-9d75-af27f8a34537-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.400867 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2bfc4e4ee0ec794257969c61ee67faa297515e9f732290ecd836d9ba51abb63"} err="failed to get container status \"d2bfc4e4ee0ec794257969c61ee67faa297515e9f732290ecd836d9ba51abb63\": rpc error: code = NotFound desc = could not find container \"d2bfc4e4ee0ec794257969c61ee67faa297515e9f732290ecd836d9ba51abb63\": container with ID starting with d2bfc4e4ee0ec794257969c61ee67faa297515e9f732290ecd836d9ba51abb63 not found: ID does not exist" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.400939 4739 scope.go:117] "RemoveContainer" containerID="75d89f1111b4e1f077a814640e6d67dc29602cb8b9e29c9dd06567ea80802097" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.401425 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75d89f1111b4e1f077a814640e6d67dc29602cb8b9e29c9dd06567ea80802097"} err="failed to get container status \"75d89f1111b4e1f077a814640e6d67dc29602cb8b9e29c9dd06567ea80802097\": rpc error: code = NotFound desc = could not find container \"75d89f1111b4e1f077a814640e6d67dc29602cb8b9e29c9dd06567ea80802097\": container with ID starting with 75d89f1111b4e1f077a814640e6d67dc29602cb8b9e29c9dd06567ea80802097 not found: ID does not exist" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.401458 4739 scope.go:117] "RemoveContainer" containerID="912bd49712e469aacf158579604cc56bee2facb9f5265068f50d782d4a146fdd" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.401805 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"912bd49712e469aacf158579604cc56bee2facb9f5265068f50d782d4a146fdd"} err="failed to get container status 
\"912bd49712e469aacf158579604cc56bee2facb9f5265068f50d782d4a146fdd\": rpc error: code = NotFound desc = could not find container \"912bd49712e469aacf158579604cc56bee2facb9f5265068f50d782d4a146fdd\": container with ID starting with 912bd49712e469aacf158579604cc56bee2facb9f5265068f50d782d4a146fdd not found: ID does not exist" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.401859 4739 scope.go:117] "RemoveContainer" containerID="4ed9b4a2f2a04df8a35cc4669fbeb81975d5ec2993943455bacadcb385e68c7b" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.402218 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ed9b4a2f2a04df8a35cc4669fbeb81975d5ec2993943455bacadcb385e68c7b"} err="failed to get container status \"4ed9b4a2f2a04df8a35cc4669fbeb81975d5ec2993943455bacadcb385e68c7b\": rpc error: code = NotFound desc = could not find container \"4ed9b4a2f2a04df8a35cc4669fbeb81975d5ec2993943455bacadcb385e68c7b\": container with ID starting with 4ed9b4a2f2a04df8a35cc4669fbeb81975d5ec2993943455bacadcb385e68c7b not found: ID does not exist" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.402244 4739 scope.go:117] "RemoveContainer" containerID="d2bfc4e4ee0ec794257969c61ee67faa297515e9f732290ecd836d9ba51abb63" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.402499 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2bfc4e4ee0ec794257969c61ee67faa297515e9f732290ecd836d9ba51abb63"} err="failed to get container status \"d2bfc4e4ee0ec794257969c61ee67faa297515e9f732290ecd836d9ba51abb63\": rpc error: code = NotFound desc = could not find container \"d2bfc4e4ee0ec794257969c61ee67faa297515e9f732290ecd836d9ba51abb63\": container with ID starting with d2bfc4e4ee0ec794257969c61ee67faa297515e9f732290ecd836d9ba51abb63 not found: ID does not exist" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.402524 4739 scope.go:117] "RemoveContainer" 
containerID="75d89f1111b4e1f077a814640e6d67dc29602cb8b9e29c9dd06567ea80802097" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.402773 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75d89f1111b4e1f077a814640e6d67dc29602cb8b9e29c9dd06567ea80802097"} err="failed to get container status \"75d89f1111b4e1f077a814640e6d67dc29602cb8b9e29c9dd06567ea80802097\": rpc error: code = NotFound desc = could not find container \"75d89f1111b4e1f077a814640e6d67dc29602cb8b9e29c9dd06567ea80802097\": container with ID starting with 75d89f1111b4e1f077a814640e6d67dc29602cb8b9e29c9dd06567ea80802097 not found: ID does not exist" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.402796 4739 scope.go:117] "RemoveContainer" containerID="912bd49712e469aacf158579604cc56bee2facb9f5265068f50d782d4a146fdd" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.403033 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"912bd49712e469aacf158579604cc56bee2facb9f5265068f50d782d4a146fdd"} err="failed to get container status \"912bd49712e469aacf158579604cc56bee2facb9f5265068f50d782d4a146fdd\": rpc error: code = NotFound desc = could not find container \"912bd49712e469aacf158579604cc56bee2facb9f5265068f50d782d4a146fdd\": container with ID starting with 912bd49712e469aacf158579604cc56bee2facb9f5265068f50d782d4a146fdd not found: ID does not exist" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.403057 4739 scope.go:117] "RemoveContainer" containerID="4ed9b4a2f2a04df8a35cc4669fbeb81975d5ec2993943455bacadcb385e68c7b" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.403345 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ed9b4a2f2a04df8a35cc4669fbeb81975d5ec2993943455bacadcb385e68c7b"} err="failed to get container status \"4ed9b4a2f2a04df8a35cc4669fbeb81975d5ec2993943455bacadcb385e68c7b\": rpc error: code = NotFound desc = could 
not find container \"4ed9b4a2f2a04df8a35cc4669fbeb81975d5ec2993943455bacadcb385e68c7b\": container with ID starting with 4ed9b4a2f2a04df8a35cc4669fbeb81975d5ec2993943455bacadcb385e68c7b not found: ID does not exist" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.403372 4739 scope.go:117] "RemoveContainer" containerID="d2bfc4e4ee0ec794257969c61ee67faa297515e9f732290ecd836d9ba51abb63" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.403580 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2bfc4e4ee0ec794257969c61ee67faa297515e9f732290ecd836d9ba51abb63"} err="failed to get container status \"d2bfc4e4ee0ec794257969c61ee67faa297515e9f732290ecd836d9ba51abb63\": rpc error: code = NotFound desc = could not find container \"d2bfc4e4ee0ec794257969c61ee67faa297515e9f732290ecd836d9ba51abb63\": container with ID starting with d2bfc4e4ee0ec794257969c61ee67faa297515e9f732290ecd836d9ba51abb63 not found: ID does not exist" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.403606 4739 scope.go:117] "RemoveContainer" containerID="75d89f1111b4e1f077a814640e6d67dc29602cb8b9e29c9dd06567ea80802097" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.403834 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75d89f1111b4e1f077a814640e6d67dc29602cb8b9e29c9dd06567ea80802097"} err="failed to get container status \"75d89f1111b4e1f077a814640e6d67dc29602cb8b9e29c9dd06567ea80802097\": rpc error: code = NotFound desc = could not find container \"75d89f1111b4e1f077a814640e6d67dc29602cb8b9e29c9dd06567ea80802097\": container with ID starting with 75d89f1111b4e1f077a814640e6d67dc29602cb8b9e29c9dd06567ea80802097 not found: ID does not exist" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.403858 4739 scope.go:117] "RemoveContainer" containerID="912bd49712e469aacf158579604cc56bee2facb9f5265068f50d782d4a146fdd" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 
22:09:28.404101 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"912bd49712e469aacf158579604cc56bee2facb9f5265068f50d782d4a146fdd"} err="failed to get container status \"912bd49712e469aacf158579604cc56bee2facb9f5265068f50d782d4a146fdd\": rpc error: code = NotFound desc = could not find container \"912bd49712e469aacf158579604cc56bee2facb9f5265068f50d782d4a146fdd\": container with ID starting with 912bd49712e469aacf158579604cc56bee2facb9f5265068f50d782d4a146fdd not found: ID does not exist" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.404129 4739 scope.go:117] "RemoveContainer" containerID="4ed9b4a2f2a04df8a35cc4669fbeb81975d5ec2993943455bacadcb385e68c7b" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.404387 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ed9b4a2f2a04df8a35cc4669fbeb81975d5ec2993943455bacadcb385e68c7b"} err="failed to get container status \"4ed9b4a2f2a04df8a35cc4669fbeb81975d5ec2993943455bacadcb385e68c7b\": rpc error: code = NotFound desc = could not find container \"4ed9b4a2f2a04df8a35cc4669fbeb81975d5ec2993943455bacadcb385e68c7b\": container with ID starting with 4ed9b4a2f2a04df8a35cc4669fbeb81975d5ec2993943455bacadcb385e68c7b not found: ID does not exist" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.451175 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.453069 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.703435 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.730067 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:09:28 crc 
kubenswrapper[4739]: I1008 22:09:28.739703 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:09:28 crc kubenswrapper[4739]: E1008 22:09:28.740464 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c56b1bf5-c307-4cfc-9d75-af27f8a34537" containerName="sg-core" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.740562 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="c56b1bf5-c307-4cfc-9d75-af27f8a34537" containerName="sg-core" Oct 08 22:09:28 crc kubenswrapper[4739]: E1008 22:09:28.740638 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c56b1bf5-c307-4cfc-9d75-af27f8a34537" containerName="proxy-httpd" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.740689 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="c56b1bf5-c307-4cfc-9d75-af27f8a34537" containerName="proxy-httpd" Oct 08 22:09:28 crc kubenswrapper[4739]: E1008 22:09:28.740754 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c56b1bf5-c307-4cfc-9d75-af27f8a34537" containerName="ceilometer-notification-agent" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.740809 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="c56b1bf5-c307-4cfc-9d75-af27f8a34537" containerName="ceilometer-notification-agent" Oct 08 22:09:28 crc kubenswrapper[4739]: E1008 22:09:28.740877 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c56b1bf5-c307-4cfc-9d75-af27f8a34537" containerName="ceilometer-central-agent" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.740937 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="c56b1bf5-c307-4cfc-9d75-af27f8a34537" containerName="ceilometer-central-agent" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.741198 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="c56b1bf5-c307-4cfc-9d75-af27f8a34537" containerName="sg-core" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.741278 4739 
memory_manager.go:354] "RemoveStaleState removing state" podUID="c56b1bf5-c307-4cfc-9d75-af27f8a34537" containerName="ceilometer-central-agent" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.741358 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="c56b1bf5-c307-4cfc-9d75-af27f8a34537" containerName="proxy-httpd" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.741442 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="c56b1bf5-c307-4cfc-9d75-af27f8a34537" containerName="ceilometer-notification-agent" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.743857 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.746974 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.747189 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.753791 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.809951 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-log-httpd\") pod \"ceilometer-0\" (UID: \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\") " pod="openstack/ceilometer-0" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.810019 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-run-httpd\") pod \"ceilometer-0\" (UID: \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\") " pod="openstack/ceilometer-0" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 
22:09:28.810320 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-config-data\") pod \"ceilometer-0\" (UID: \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\") " pod="openstack/ceilometer-0" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.810408 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\") " pod="openstack/ceilometer-0" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.810490 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-scripts\") pod \"ceilometer-0\" (UID: \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\") " pod="openstack/ceilometer-0" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.810508 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\") " pod="openstack/ceilometer-0" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.810591 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzsjm\" (UniqueName: \"kubernetes.io/projected/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-kube-api-access-qzsjm\") pod \"ceilometer-0\" (UID: \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\") " pod="openstack/ceilometer-0" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.913232 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-log-httpd\") pod \"ceilometer-0\" (UID: \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\") " pod="openstack/ceilometer-0" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.913307 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-run-httpd\") pod \"ceilometer-0\" (UID: \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\") " pod="openstack/ceilometer-0" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.913385 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-config-data\") pod \"ceilometer-0\" (UID: \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\") " pod="openstack/ceilometer-0" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.913419 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\") " pod="openstack/ceilometer-0" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.913447 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-scripts\") pod \"ceilometer-0\" (UID: \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\") " pod="openstack/ceilometer-0" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.913463 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\") " pod="openstack/ceilometer-0" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.913490 
4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzsjm\" (UniqueName: \"kubernetes.io/projected/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-kube-api-access-qzsjm\") pod \"ceilometer-0\" (UID: \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\") " pod="openstack/ceilometer-0" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.913887 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-log-httpd\") pod \"ceilometer-0\" (UID: \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\") " pod="openstack/ceilometer-0" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.914533 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-run-httpd\") pod \"ceilometer-0\" (UID: \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\") " pod="openstack/ceilometer-0" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.919050 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\") " pod="openstack/ceilometer-0" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.922564 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-scripts\") pod \"ceilometer-0\" (UID: \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\") " pod="openstack/ceilometer-0" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.923108 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\") " 
pod="openstack/ceilometer-0" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.926461 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-config-data\") pod \"ceilometer-0\" (UID: \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\") " pod="openstack/ceilometer-0" Oct 08 22:09:28 crc kubenswrapper[4739]: I1008 22:09:28.931112 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzsjm\" (UniqueName: \"kubernetes.io/projected/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-kube-api-access-qzsjm\") pod \"ceilometer-0\" (UID: \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\") " pod="openstack/ceilometer-0" Oct 08 22:09:29 crc kubenswrapper[4739]: I1008 22:09:29.068848 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:09:29 crc kubenswrapper[4739]: I1008 22:09:29.559389 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:09:29 crc kubenswrapper[4739]: I1008 22:09:29.836829 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c56b1bf5-c307-4cfc-9d75-af27f8a34537" path="/var/lib/kubelet/pods/c56b1bf5-c307-4cfc-9d75-af27f8a34537/volumes" Oct 08 22:09:30 crc kubenswrapper[4739]: I1008 22:09:30.311640 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a11e83d2-32ae-4ec8-9dc7-d020a00ca231","Type":"ContainerStarted","Data":"a5c00fa73855baa2d51cc1f5629a493501de162dd8469f6b689617568384620e"} Oct 08 22:09:30 crc kubenswrapper[4739]: I1008 22:09:30.671791 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-ec7f-account-create-xrvtt"] Oct 08 22:09:30 crc kubenswrapper[4739]: I1008 22:09:30.673218 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-ec7f-account-create-xrvtt" Oct 08 22:09:30 crc kubenswrapper[4739]: I1008 22:09:30.679584 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 08 22:09:30 crc kubenswrapper[4739]: I1008 22:09:30.684376 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-ec7f-account-create-xrvtt"] Oct 08 22:09:30 crc kubenswrapper[4739]: I1008 22:09:30.756861 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8zft\" (UniqueName: \"kubernetes.io/projected/050eba89-d876-415d-a4bf-fa42c414ec27-kube-api-access-v8zft\") pod \"nova-api-ec7f-account-create-xrvtt\" (UID: \"050eba89-d876-415d-a4bf-fa42c414ec27\") " pod="openstack/nova-api-ec7f-account-create-xrvtt" Oct 08 22:09:30 crc kubenswrapper[4739]: I1008 22:09:30.859349 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8zft\" (UniqueName: \"kubernetes.io/projected/050eba89-d876-415d-a4bf-fa42c414ec27-kube-api-access-v8zft\") pod \"nova-api-ec7f-account-create-xrvtt\" (UID: \"050eba89-d876-415d-a4bf-fa42c414ec27\") " pod="openstack/nova-api-ec7f-account-create-xrvtt" Oct 08 22:09:30 crc kubenswrapper[4739]: I1008 22:09:30.872669 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-24d3-account-create-t6jbj"] Oct 08 22:09:30 crc kubenswrapper[4739]: I1008 22:09:30.874091 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-24d3-account-create-t6jbj" Oct 08 22:09:30 crc kubenswrapper[4739]: I1008 22:09:30.886639 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-24d3-account-create-t6jbj"] Oct 08 22:09:30 crc kubenswrapper[4739]: I1008 22:09:30.887440 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 08 22:09:30 crc kubenswrapper[4739]: I1008 22:09:30.909510 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8zft\" (UniqueName: \"kubernetes.io/projected/050eba89-d876-415d-a4bf-fa42c414ec27-kube-api-access-v8zft\") pod \"nova-api-ec7f-account-create-xrvtt\" (UID: \"050eba89-d876-415d-a4bf-fa42c414ec27\") " pod="openstack/nova-api-ec7f-account-create-xrvtt" Oct 08 22:09:30 crc kubenswrapper[4739]: I1008 22:09:30.961511 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88gxc\" (UniqueName: \"kubernetes.io/projected/fd658aa2-8067-4957-b4c1-3d3be9f6496b-kube-api-access-88gxc\") pod \"nova-cell0-24d3-account-create-t6jbj\" (UID: \"fd658aa2-8067-4957-b4c1-3d3be9f6496b\") " pod="openstack/nova-cell0-24d3-account-create-t6jbj" Oct 08 22:09:30 crc kubenswrapper[4739]: I1008 22:09:30.997776 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-ec7f-account-create-xrvtt" Oct 08 22:09:31 crc kubenswrapper[4739]: I1008 22:09:31.063569 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88gxc\" (UniqueName: \"kubernetes.io/projected/fd658aa2-8067-4957-b4c1-3d3be9f6496b-kube-api-access-88gxc\") pod \"nova-cell0-24d3-account-create-t6jbj\" (UID: \"fd658aa2-8067-4957-b4c1-3d3be9f6496b\") " pod="openstack/nova-cell0-24d3-account-create-t6jbj" Oct 08 22:09:31 crc kubenswrapper[4739]: I1008 22:09:31.067832 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-76e4-account-create-w4hrz"] Oct 08 22:09:31 crc kubenswrapper[4739]: I1008 22:09:31.070689 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-76e4-account-create-w4hrz" Oct 08 22:09:31 crc kubenswrapper[4739]: I1008 22:09:31.073691 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 08 22:09:31 crc kubenswrapper[4739]: I1008 22:09:31.089491 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-76e4-account-create-w4hrz"] Oct 08 22:09:31 crc kubenswrapper[4739]: I1008 22:09:31.102739 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88gxc\" (UniqueName: \"kubernetes.io/projected/fd658aa2-8067-4957-b4c1-3d3be9f6496b-kube-api-access-88gxc\") pod \"nova-cell0-24d3-account-create-t6jbj\" (UID: \"fd658aa2-8067-4957-b4c1-3d3be9f6496b\") " pod="openstack/nova-cell0-24d3-account-create-t6jbj" Oct 08 22:09:31 crc kubenswrapper[4739]: I1008 22:09:31.129935 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-24d3-account-create-t6jbj" Oct 08 22:09:31 crc kubenswrapper[4739]: I1008 22:09:31.166188 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpx2s\" (UniqueName: \"kubernetes.io/projected/96a2a1ce-b4cb-4dfd-8571-cd87bb386fbf-kube-api-access-jpx2s\") pod \"nova-cell1-76e4-account-create-w4hrz\" (UID: \"96a2a1ce-b4cb-4dfd-8571-cd87bb386fbf\") " pod="openstack/nova-cell1-76e4-account-create-w4hrz" Oct 08 22:09:31 crc kubenswrapper[4739]: I1008 22:09:31.267929 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpx2s\" (UniqueName: \"kubernetes.io/projected/96a2a1ce-b4cb-4dfd-8571-cd87bb386fbf-kube-api-access-jpx2s\") pod \"nova-cell1-76e4-account-create-w4hrz\" (UID: \"96a2a1ce-b4cb-4dfd-8571-cd87bb386fbf\") " pod="openstack/nova-cell1-76e4-account-create-w4hrz" Oct 08 22:09:31 crc kubenswrapper[4739]: I1008 22:09:31.305752 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpx2s\" (UniqueName: \"kubernetes.io/projected/96a2a1ce-b4cb-4dfd-8571-cd87bb386fbf-kube-api-access-jpx2s\") pod \"nova-cell1-76e4-account-create-w4hrz\" (UID: \"96a2a1ce-b4cb-4dfd-8571-cd87bb386fbf\") " pod="openstack/nova-cell1-76e4-account-create-w4hrz" Oct 08 22:09:31 crc kubenswrapper[4739]: I1008 22:09:31.333092 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a11e83d2-32ae-4ec8-9dc7-d020a00ca231","Type":"ContainerStarted","Data":"066e49770979e14d62ea7e04e7762d3c2e60f8c215ec283f8f6fb7a244ba982a"} Oct 08 22:09:31 crc kubenswrapper[4739]: I1008 22:09:31.333171 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a11e83d2-32ae-4ec8-9dc7-d020a00ca231","Type":"ContainerStarted","Data":"e53f0e8052c9e27375caf46676a07f9ea22aad1b27f45af89fcfb27852f43fcd"} Oct 08 22:09:31 crc kubenswrapper[4739]: I1008 
22:09:31.452230 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-76e4-account-create-w4hrz" Oct 08 22:09:31 crc kubenswrapper[4739]: I1008 22:09:31.505455 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-ec7f-account-create-xrvtt"] Oct 08 22:09:31 crc kubenswrapper[4739]: W1008 22:09:31.507006 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod050eba89_d876_415d_a4bf_fa42c414ec27.slice/crio-8c061730674b795f3c4c226dba6134a2e3e9e63e2ef52b825bb5273e58ba9ad7 WatchSource:0}: Error finding container 8c061730674b795f3c4c226dba6134a2e3e9e63e2ef52b825bb5273e58ba9ad7: Status 404 returned error can't find the container with id 8c061730674b795f3c4c226dba6134a2e3e9e63e2ef52b825bb5273e58ba9ad7 Oct 08 22:09:31 crc kubenswrapper[4739]: I1008 22:09:31.629614 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-24d3-account-create-t6jbj"] Oct 08 22:09:31 crc kubenswrapper[4739]: W1008 22:09:31.632697 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd658aa2_8067_4957_b4c1_3d3be9f6496b.slice/crio-af4d489174ddce629e3481919be325c795938517f43f7e66ccb33744faa6ffb3 WatchSource:0}: Error finding container af4d489174ddce629e3481919be325c795938517f43f7e66ccb33744faa6ffb3: Status 404 returned error can't find the container with id af4d489174ddce629e3481919be325c795938517f43f7e66ccb33744faa6ffb3 Oct 08 22:09:31 crc kubenswrapper[4739]: I1008 22:09:31.973021 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-76e4-account-create-w4hrz"] Oct 08 22:09:32 crc kubenswrapper[4739]: I1008 22:09:32.374319 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-76e4-account-create-w4hrz" 
event={"ID":"96a2a1ce-b4cb-4dfd-8571-cd87bb386fbf","Type":"ContainerStarted","Data":"0872ab24d84212478d8f517f197aac3df2b001a3a2bd7f8c3dff426124a0a309"} Oct 08 22:09:32 crc kubenswrapper[4739]: I1008 22:09:32.374366 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-76e4-account-create-w4hrz" event={"ID":"96a2a1ce-b4cb-4dfd-8571-cd87bb386fbf","Type":"ContainerStarted","Data":"7000285baaaa01ef97de38fa3ea5c3f2a8e06ad371120d0a037e31b284e83c7e"} Oct 08 22:09:32 crc kubenswrapper[4739]: I1008 22:09:32.376436 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ec7f-account-create-xrvtt" event={"ID":"050eba89-d876-415d-a4bf-fa42c414ec27","Type":"ContainerStarted","Data":"3117d288a49c0224c3098e216ddd7692185e694c28f0f6824d569cadf09ce0b3"} Oct 08 22:09:32 crc kubenswrapper[4739]: I1008 22:09:32.376465 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ec7f-account-create-xrvtt" event={"ID":"050eba89-d876-415d-a4bf-fa42c414ec27","Type":"ContainerStarted","Data":"8c061730674b795f3c4c226dba6134a2e3e9e63e2ef52b825bb5273e58ba9ad7"} Oct 08 22:09:32 crc kubenswrapper[4739]: I1008 22:09:32.386978 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-24d3-account-create-t6jbj" event={"ID":"fd658aa2-8067-4957-b4c1-3d3be9f6496b","Type":"ContainerStarted","Data":"d7274f341767e00ac52c996486b9c9ef091cc12452f25d07fbe10fb0334914fc"} Oct 08 22:09:32 crc kubenswrapper[4739]: I1008 22:09:32.387166 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-24d3-account-create-t6jbj" event={"ID":"fd658aa2-8067-4957-b4c1-3d3be9f6496b","Type":"ContainerStarted","Data":"af4d489174ddce629e3481919be325c795938517f43f7e66ccb33744faa6ffb3"} Oct 08 22:09:32 crc kubenswrapper[4739]: I1008 22:09:32.428425 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-76e4-account-create-w4hrz" podStartSLOduration=1.428405772 
podStartE2EDuration="1.428405772s" podCreationTimestamp="2025-10-08 22:09:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:09:32.414418188 +0000 UTC m=+1272.239803928" watchObservedRunningTime="2025-10-08 22:09:32.428405772 +0000 UTC m=+1272.253791522" Oct 08 22:09:32 crc kubenswrapper[4739]: I1008 22:09:32.441411 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-ec7f-account-create-xrvtt" podStartSLOduration=2.441392982 podStartE2EDuration="2.441392982s" podCreationTimestamp="2025-10-08 22:09:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:09:32.434200716 +0000 UTC m=+1272.259586466" watchObservedRunningTime="2025-10-08 22:09:32.441392982 +0000 UTC m=+1272.266778732" Oct 08 22:09:32 crc kubenswrapper[4739]: I1008 22:09:32.487521 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-24d3-account-create-t6jbj" podStartSLOduration=2.487499378 podStartE2EDuration="2.487499378s" podCreationTimestamp="2025-10-08 22:09:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:09:32.478995499 +0000 UTC m=+1272.304381249" watchObservedRunningTime="2025-10-08 22:09:32.487499378 +0000 UTC m=+1272.312885128" Oct 08 22:09:34 crc kubenswrapper[4739]: I1008 22:09:34.412255 4739 generic.go:334] "Generic (PLEG): container finished" podID="fd658aa2-8067-4957-b4c1-3d3be9f6496b" containerID="d7274f341767e00ac52c996486b9c9ef091cc12452f25d07fbe10fb0334914fc" exitCode=0 Oct 08 22:09:34 crc kubenswrapper[4739]: I1008 22:09:34.412554 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-24d3-account-create-t6jbj" 
event={"ID":"fd658aa2-8067-4957-b4c1-3d3be9f6496b","Type":"ContainerDied","Data":"d7274f341767e00ac52c996486b9c9ef091cc12452f25d07fbe10fb0334914fc"} Oct 08 22:09:34 crc kubenswrapper[4739]: I1008 22:09:34.416675 4739 generic.go:334] "Generic (PLEG): container finished" podID="96a2a1ce-b4cb-4dfd-8571-cd87bb386fbf" containerID="0872ab24d84212478d8f517f197aac3df2b001a3a2bd7f8c3dff426124a0a309" exitCode=0 Oct 08 22:09:34 crc kubenswrapper[4739]: I1008 22:09:34.416745 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-76e4-account-create-w4hrz" event={"ID":"96a2a1ce-b4cb-4dfd-8571-cd87bb386fbf","Type":"ContainerDied","Data":"0872ab24d84212478d8f517f197aac3df2b001a3a2bd7f8c3dff426124a0a309"} Oct 08 22:09:34 crc kubenswrapper[4739]: I1008 22:09:34.419863 4739 generic.go:334] "Generic (PLEG): container finished" podID="050eba89-d876-415d-a4bf-fa42c414ec27" containerID="3117d288a49c0224c3098e216ddd7692185e694c28f0f6824d569cadf09ce0b3" exitCode=0 Oct 08 22:09:34 crc kubenswrapper[4739]: I1008 22:09:34.419946 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ec7f-account-create-xrvtt" event={"ID":"050eba89-d876-415d-a4bf-fa42c414ec27","Type":"ContainerDied","Data":"3117d288a49c0224c3098e216ddd7692185e694c28f0f6824d569cadf09ce0b3"} Oct 08 22:09:35 crc kubenswrapper[4739]: I1008 22:09:35.871169 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-76e4-account-create-w4hrz" Oct 08 22:09:35 crc kubenswrapper[4739]: I1008 22:09:35.976006 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpx2s\" (UniqueName: \"kubernetes.io/projected/96a2a1ce-b4cb-4dfd-8571-cd87bb386fbf-kube-api-access-jpx2s\") pod \"96a2a1ce-b4cb-4dfd-8571-cd87bb386fbf\" (UID: \"96a2a1ce-b4cb-4dfd-8571-cd87bb386fbf\") " Oct 08 22:09:35 crc kubenswrapper[4739]: I1008 22:09:35.985116 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96a2a1ce-b4cb-4dfd-8571-cd87bb386fbf-kube-api-access-jpx2s" (OuterVolumeSpecName: "kube-api-access-jpx2s") pod "96a2a1ce-b4cb-4dfd-8571-cd87bb386fbf" (UID: "96a2a1ce-b4cb-4dfd-8571-cd87bb386fbf"). InnerVolumeSpecName "kube-api-access-jpx2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:09:36 crc kubenswrapper[4739]: I1008 22:09:36.060954 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-24d3-account-create-t6jbj" Oct 08 22:09:36 crc kubenswrapper[4739]: I1008 22:09:36.063987 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-ec7f-account-create-xrvtt" Oct 08 22:09:36 crc kubenswrapper[4739]: I1008 22:09:36.079072 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpx2s\" (UniqueName: \"kubernetes.io/projected/96a2a1ce-b4cb-4dfd-8571-cd87bb386fbf-kube-api-access-jpx2s\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:36 crc kubenswrapper[4739]: I1008 22:09:36.180272 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88gxc\" (UniqueName: \"kubernetes.io/projected/fd658aa2-8067-4957-b4c1-3d3be9f6496b-kube-api-access-88gxc\") pod \"fd658aa2-8067-4957-b4c1-3d3be9f6496b\" (UID: \"fd658aa2-8067-4957-b4c1-3d3be9f6496b\") " Oct 08 22:09:36 crc kubenswrapper[4739]: I1008 22:09:36.180562 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8zft\" (UniqueName: \"kubernetes.io/projected/050eba89-d876-415d-a4bf-fa42c414ec27-kube-api-access-v8zft\") pod \"050eba89-d876-415d-a4bf-fa42c414ec27\" (UID: \"050eba89-d876-415d-a4bf-fa42c414ec27\") " Oct 08 22:09:36 crc kubenswrapper[4739]: I1008 22:09:36.184647 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/050eba89-d876-415d-a4bf-fa42c414ec27-kube-api-access-v8zft" (OuterVolumeSpecName: "kube-api-access-v8zft") pod "050eba89-d876-415d-a4bf-fa42c414ec27" (UID: "050eba89-d876-415d-a4bf-fa42c414ec27"). InnerVolumeSpecName "kube-api-access-v8zft". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:09:36 crc kubenswrapper[4739]: I1008 22:09:36.185535 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd658aa2-8067-4957-b4c1-3d3be9f6496b-kube-api-access-88gxc" (OuterVolumeSpecName: "kube-api-access-88gxc") pod "fd658aa2-8067-4957-b4c1-3d3be9f6496b" (UID: "fd658aa2-8067-4957-b4c1-3d3be9f6496b"). InnerVolumeSpecName "kube-api-access-88gxc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:09:36 crc kubenswrapper[4739]: I1008 22:09:36.284344 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88gxc\" (UniqueName: \"kubernetes.io/projected/fd658aa2-8067-4957-b4c1-3d3be9f6496b-kube-api-access-88gxc\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:36 crc kubenswrapper[4739]: I1008 22:09:36.284427 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8zft\" (UniqueName: \"kubernetes.io/projected/050eba89-d876-415d-a4bf-fa42c414ec27-kube-api-access-v8zft\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:36 crc kubenswrapper[4739]: I1008 22:09:36.456409 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-76e4-account-create-w4hrz" Oct 08 22:09:36 crc kubenswrapper[4739]: I1008 22:09:36.457140 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-76e4-account-create-w4hrz" event={"ID":"96a2a1ce-b4cb-4dfd-8571-cd87bb386fbf","Type":"ContainerDied","Data":"7000285baaaa01ef97de38fa3ea5c3f2a8e06ad371120d0a037e31b284e83c7e"} Oct 08 22:09:36 crc kubenswrapper[4739]: I1008 22:09:36.457402 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7000285baaaa01ef97de38fa3ea5c3f2a8e06ad371120d0a037e31b284e83c7e" Oct 08 22:09:36 crc kubenswrapper[4739]: I1008 22:09:36.463383 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ec7f-account-create-xrvtt" event={"ID":"050eba89-d876-415d-a4bf-fa42c414ec27","Type":"ContainerDied","Data":"8c061730674b795f3c4c226dba6134a2e3e9e63e2ef52b825bb5273e58ba9ad7"} Oct 08 22:09:36 crc kubenswrapper[4739]: I1008 22:09:36.463443 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c061730674b795f3c4c226dba6134a2e3e9e63e2ef52b825bb5273e58ba9ad7" Oct 08 22:09:36 crc kubenswrapper[4739]: I1008 22:09:36.463977 4739 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-api-ec7f-account-create-xrvtt" Oct 08 22:09:36 crc kubenswrapper[4739]: I1008 22:09:36.466803 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-24d3-account-create-t6jbj" event={"ID":"fd658aa2-8067-4957-b4c1-3d3be9f6496b","Type":"ContainerDied","Data":"af4d489174ddce629e3481919be325c795938517f43f7e66ccb33744faa6ffb3"} Oct 08 22:09:36 crc kubenswrapper[4739]: I1008 22:09:36.466857 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af4d489174ddce629e3481919be325c795938517f43f7e66ccb33744faa6ffb3" Oct 08 22:09:36 crc kubenswrapper[4739]: I1008 22:09:36.466937 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-24d3-account-create-t6jbj" Oct 08 22:09:38 crc kubenswrapper[4739]: I1008 22:09:38.496451 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a11e83d2-32ae-4ec8-9dc7-d020a00ca231","Type":"ContainerStarted","Data":"d038109cbcdd81fc544a5ed2540c6b33992270354c9264c2dc501d752d363347"} Oct 08 22:09:40 crc kubenswrapper[4739]: I1008 22:09:40.525759 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a11e83d2-32ae-4ec8-9dc7-d020a00ca231","Type":"ContainerStarted","Data":"a9af6de0c7f037ce0df7aa1190743268728cd0b53c3387daebd6a4e26df9a303"} Oct 08 22:09:40 crc kubenswrapper[4739]: I1008 22:09:40.527482 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 22:09:40 crc kubenswrapper[4739]: I1008 22:09:40.572356 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.043911305 podStartE2EDuration="12.572329771s" podCreationTimestamp="2025-10-08 22:09:28 +0000 UTC" firstStartedPulling="2025-10-08 22:09:29.567483924 +0000 UTC m=+1269.392869664" lastFinishedPulling="2025-10-08 
22:09:40.09590237 +0000 UTC m=+1279.921288130" observedRunningTime="2025-10-08 22:09:40.564118739 +0000 UTC m=+1280.389504569" watchObservedRunningTime="2025-10-08 22:09:40.572329771 +0000 UTC m=+1280.397715551" Oct 08 22:09:41 crc kubenswrapper[4739]: I1008 22:09:41.203379 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-nwcm7"] Oct 08 22:09:41 crc kubenswrapper[4739]: E1008 22:09:41.203740 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96a2a1ce-b4cb-4dfd-8571-cd87bb386fbf" containerName="mariadb-account-create" Oct 08 22:09:41 crc kubenswrapper[4739]: I1008 22:09:41.203752 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="96a2a1ce-b4cb-4dfd-8571-cd87bb386fbf" containerName="mariadb-account-create" Oct 08 22:09:41 crc kubenswrapper[4739]: E1008 22:09:41.203771 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd658aa2-8067-4957-b4c1-3d3be9f6496b" containerName="mariadb-account-create" Oct 08 22:09:41 crc kubenswrapper[4739]: I1008 22:09:41.203777 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd658aa2-8067-4957-b4c1-3d3be9f6496b" containerName="mariadb-account-create" Oct 08 22:09:41 crc kubenswrapper[4739]: E1008 22:09:41.203790 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="050eba89-d876-415d-a4bf-fa42c414ec27" containerName="mariadb-account-create" Oct 08 22:09:41 crc kubenswrapper[4739]: I1008 22:09:41.203796 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="050eba89-d876-415d-a4bf-fa42c414ec27" containerName="mariadb-account-create" Oct 08 22:09:41 crc kubenswrapper[4739]: I1008 22:09:41.203969 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="050eba89-d876-415d-a4bf-fa42c414ec27" containerName="mariadb-account-create" Oct 08 22:09:41 crc kubenswrapper[4739]: I1008 22:09:41.203993 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="96a2a1ce-b4cb-4dfd-8571-cd87bb386fbf" 
containerName="mariadb-account-create" Oct 08 22:09:41 crc kubenswrapper[4739]: I1008 22:09:41.204002 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd658aa2-8067-4957-b4c1-3d3be9f6496b" containerName="mariadb-account-create" Oct 08 22:09:41 crc kubenswrapper[4739]: I1008 22:09:41.204569 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-nwcm7" Oct 08 22:09:41 crc kubenswrapper[4739]: I1008 22:09:41.206378 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 08 22:09:41 crc kubenswrapper[4739]: I1008 22:09:41.207264 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-ntqbj" Oct 08 22:09:41 crc kubenswrapper[4739]: I1008 22:09:41.207501 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 08 22:09:41 crc kubenswrapper[4739]: I1008 22:09:41.221613 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-nwcm7"] Oct 08 22:09:41 crc kubenswrapper[4739]: I1008 22:09:41.326425 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd8cd516-b57f-4dc7-913f-fca9eac68452-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-nwcm7\" (UID: \"cd8cd516-b57f-4dc7-913f-fca9eac68452\") " pod="openstack/nova-cell0-conductor-db-sync-nwcm7" Oct 08 22:09:41 crc kubenswrapper[4739]: I1008 22:09:41.326520 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd8cd516-b57f-4dc7-913f-fca9eac68452-scripts\") pod \"nova-cell0-conductor-db-sync-nwcm7\" (UID: \"cd8cd516-b57f-4dc7-913f-fca9eac68452\") " pod="openstack/nova-cell0-conductor-db-sync-nwcm7" Oct 08 22:09:41 crc kubenswrapper[4739]: 
I1008 22:09:41.326921 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jjjp\" (UniqueName: \"kubernetes.io/projected/cd8cd516-b57f-4dc7-913f-fca9eac68452-kube-api-access-2jjjp\") pod \"nova-cell0-conductor-db-sync-nwcm7\" (UID: \"cd8cd516-b57f-4dc7-913f-fca9eac68452\") " pod="openstack/nova-cell0-conductor-db-sync-nwcm7" Oct 08 22:09:41 crc kubenswrapper[4739]: I1008 22:09:41.327364 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd8cd516-b57f-4dc7-913f-fca9eac68452-config-data\") pod \"nova-cell0-conductor-db-sync-nwcm7\" (UID: \"cd8cd516-b57f-4dc7-913f-fca9eac68452\") " pod="openstack/nova-cell0-conductor-db-sync-nwcm7" Oct 08 22:09:41 crc kubenswrapper[4739]: I1008 22:09:41.429938 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd8cd516-b57f-4dc7-913f-fca9eac68452-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-nwcm7\" (UID: \"cd8cd516-b57f-4dc7-913f-fca9eac68452\") " pod="openstack/nova-cell0-conductor-db-sync-nwcm7" Oct 08 22:09:41 crc kubenswrapper[4739]: I1008 22:09:41.430016 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd8cd516-b57f-4dc7-913f-fca9eac68452-scripts\") pod \"nova-cell0-conductor-db-sync-nwcm7\" (UID: \"cd8cd516-b57f-4dc7-913f-fca9eac68452\") " pod="openstack/nova-cell0-conductor-db-sync-nwcm7" Oct 08 22:09:41 crc kubenswrapper[4739]: I1008 22:09:41.430068 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jjjp\" (UniqueName: \"kubernetes.io/projected/cd8cd516-b57f-4dc7-913f-fca9eac68452-kube-api-access-2jjjp\") pod \"nova-cell0-conductor-db-sync-nwcm7\" (UID: \"cd8cd516-b57f-4dc7-913f-fca9eac68452\") " 
pod="openstack/nova-cell0-conductor-db-sync-nwcm7" Oct 08 22:09:41 crc kubenswrapper[4739]: I1008 22:09:41.430127 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd8cd516-b57f-4dc7-913f-fca9eac68452-config-data\") pod \"nova-cell0-conductor-db-sync-nwcm7\" (UID: \"cd8cd516-b57f-4dc7-913f-fca9eac68452\") " pod="openstack/nova-cell0-conductor-db-sync-nwcm7" Oct 08 22:09:41 crc kubenswrapper[4739]: I1008 22:09:41.436522 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd8cd516-b57f-4dc7-913f-fca9eac68452-config-data\") pod \"nova-cell0-conductor-db-sync-nwcm7\" (UID: \"cd8cd516-b57f-4dc7-913f-fca9eac68452\") " pod="openstack/nova-cell0-conductor-db-sync-nwcm7" Oct 08 22:09:41 crc kubenswrapper[4739]: I1008 22:09:41.436781 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd8cd516-b57f-4dc7-913f-fca9eac68452-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-nwcm7\" (UID: \"cd8cd516-b57f-4dc7-913f-fca9eac68452\") " pod="openstack/nova-cell0-conductor-db-sync-nwcm7" Oct 08 22:09:41 crc kubenswrapper[4739]: I1008 22:09:41.448640 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd8cd516-b57f-4dc7-913f-fca9eac68452-scripts\") pod \"nova-cell0-conductor-db-sync-nwcm7\" (UID: \"cd8cd516-b57f-4dc7-913f-fca9eac68452\") " pod="openstack/nova-cell0-conductor-db-sync-nwcm7" Oct 08 22:09:41 crc kubenswrapper[4739]: I1008 22:09:41.456678 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jjjp\" (UniqueName: \"kubernetes.io/projected/cd8cd516-b57f-4dc7-913f-fca9eac68452-kube-api-access-2jjjp\") pod \"nova-cell0-conductor-db-sync-nwcm7\" (UID: \"cd8cd516-b57f-4dc7-913f-fca9eac68452\") " 
pod="openstack/nova-cell0-conductor-db-sync-nwcm7" Oct 08 22:09:41 crc kubenswrapper[4739]: I1008 22:09:41.531532 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-nwcm7" Oct 08 22:09:42 crc kubenswrapper[4739]: I1008 22:09:42.099812 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-nwcm7"] Oct 08 22:09:42 crc kubenswrapper[4739]: W1008 22:09:42.108145 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd8cd516_b57f_4dc7_913f_fca9eac68452.slice/crio-457e0593025df2b8819ad7ebd7627497a6ad34ed1cf9ea364764d364ad4f88e0 WatchSource:0}: Error finding container 457e0593025df2b8819ad7ebd7627497a6ad34ed1cf9ea364764d364ad4f88e0: Status 404 returned error can't find the container with id 457e0593025df2b8819ad7ebd7627497a6ad34ed1cf9ea364764d364ad4f88e0 Oct 08 22:09:42 crc kubenswrapper[4739]: I1008 22:09:42.569791 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-nwcm7" event={"ID":"cd8cd516-b57f-4dc7-913f-fca9eac68452","Type":"ContainerStarted","Data":"457e0593025df2b8819ad7ebd7627497a6ad34ed1cf9ea364764d364ad4f88e0"} Oct 08 22:09:48 crc kubenswrapper[4739]: I1008 22:09:48.044203 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:09:48 crc kubenswrapper[4739]: I1008 22:09:48.045081 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a11e83d2-32ae-4ec8-9dc7-d020a00ca231" containerName="ceilometer-central-agent" containerID="cri-o://e53f0e8052c9e27375caf46676a07f9ea22aad1b27f45af89fcfb27852f43fcd" gracePeriod=30 Oct 08 22:09:48 crc kubenswrapper[4739]: I1008 22:09:48.045216 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a11e83d2-32ae-4ec8-9dc7-d020a00ca231" 
containerName="proxy-httpd" containerID="cri-o://a9af6de0c7f037ce0df7aa1190743268728cd0b53c3387daebd6a4e26df9a303" gracePeriod=30 Oct 08 22:09:48 crc kubenswrapper[4739]: I1008 22:09:48.045303 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a11e83d2-32ae-4ec8-9dc7-d020a00ca231" containerName="ceilometer-notification-agent" containerID="cri-o://066e49770979e14d62ea7e04e7762d3c2e60f8c215ec283f8f6fb7a244ba982a" gracePeriod=30 Oct 08 22:09:48 crc kubenswrapper[4739]: I1008 22:09:48.045328 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a11e83d2-32ae-4ec8-9dc7-d020a00ca231" containerName="sg-core" containerID="cri-o://d038109cbcdd81fc544a5ed2540c6b33992270354c9264c2dc501d752d363347" gracePeriod=30 Oct 08 22:09:48 crc kubenswrapper[4739]: I1008 22:09:48.637014 4739 generic.go:334] "Generic (PLEG): container finished" podID="a11e83d2-32ae-4ec8-9dc7-d020a00ca231" containerID="a9af6de0c7f037ce0df7aa1190743268728cd0b53c3387daebd6a4e26df9a303" exitCode=0 Oct 08 22:09:48 crc kubenswrapper[4739]: I1008 22:09:48.637445 4739 generic.go:334] "Generic (PLEG): container finished" podID="a11e83d2-32ae-4ec8-9dc7-d020a00ca231" containerID="d038109cbcdd81fc544a5ed2540c6b33992270354c9264c2dc501d752d363347" exitCode=2 Oct 08 22:09:48 crc kubenswrapper[4739]: I1008 22:09:48.637461 4739 generic.go:334] "Generic (PLEG): container finished" podID="a11e83d2-32ae-4ec8-9dc7-d020a00ca231" containerID="e53f0e8052c9e27375caf46676a07f9ea22aad1b27f45af89fcfb27852f43fcd" exitCode=0 Oct 08 22:09:48 crc kubenswrapper[4739]: I1008 22:09:48.637122 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a11e83d2-32ae-4ec8-9dc7-d020a00ca231","Type":"ContainerDied","Data":"a9af6de0c7f037ce0df7aa1190743268728cd0b53c3387daebd6a4e26df9a303"} Oct 08 22:09:48 crc kubenswrapper[4739]: I1008 22:09:48.637511 4739 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"a11e83d2-32ae-4ec8-9dc7-d020a00ca231","Type":"ContainerDied","Data":"d038109cbcdd81fc544a5ed2540c6b33992270354c9264c2dc501d752d363347"} Oct 08 22:09:48 crc kubenswrapper[4739]: I1008 22:09:48.637534 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a11e83d2-32ae-4ec8-9dc7-d020a00ca231","Type":"ContainerDied","Data":"e53f0e8052c9e27375caf46676a07f9ea22aad1b27f45af89fcfb27852f43fcd"} Oct 08 22:09:49 crc kubenswrapper[4739]: I1008 22:09:49.650484 4739 generic.go:334] "Generic (PLEG): container finished" podID="a11e83d2-32ae-4ec8-9dc7-d020a00ca231" containerID="066e49770979e14d62ea7e04e7762d3c2e60f8c215ec283f8f6fb7a244ba982a" exitCode=0 Oct 08 22:09:49 crc kubenswrapper[4739]: I1008 22:09:49.650529 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a11e83d2-32ae-4ec8-9dc7-d020a00ca231","Type":"ContainerDied","Data":"066e49770979e14d62ea7e04e7762d3c2e60f8c215ec283f8f6fb7a244ba982a"} Oct 08 22:09:50 crc kubenswrapper[4739]: I1008 22:09:50.795338 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:09:50 crc kubenswrapper[4739]: I1008 22:09:50.954602 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-run-httpd\") pod \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\" (UID: \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\") " Oct 08 22:09:50 crc kubenswrapper[4739]: I1008 22:09:50.954649 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-scripts\") pod \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\" (UID: \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\") " Oct 08 22:09:50 crc kubenswrapper[4739]: I1008 22:09:50.954774 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-log-httpd\") pod \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\" (UID: \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\") " Oct 08 22:09:50 crc kubenswrapper[4739]: I1008 22:09:50.954842 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-sg-core-conf-yaml\") pod \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\" (UID: \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\") " Oct 08 22:09:50 crc kubenswrapper[4739]: I1008 22:09:50.955297 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a11e83d2-32ae-4ec8-9dc7-d020a00ca231" (UID: "a11e83d2-32ae-4ec8-9dc7-d020a00ca231"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:09:50 crc kubenswrapper[4739]: I1008 22:09:50.955512 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a11e83d2-32ae-4ec8-9dc7-d020a00ca231" (UID: "a11e83d2-32ae-4ec8-9dc7-d020a00ca231"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:09:50 crc kubenswrapper[4739]: I1008 22:09:50.955550 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzsjm\" (UniqueName: \"kubernetes.io/projected/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-kube-api-access-qzsjm\") pod \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\" (UID: \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\") " Oct 08 22:09:50 crc kubenswrapper[4739]: I1008 22:09:50.955580 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-config-data\") pod \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\" (UID: \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\") " Oct 08 22:09:50 crc kubenswrapper[4739]: I1008 22:09:50.955605 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-combined-ca-bundle\") pod \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\" (UID: \"a11e83d2-32ae-4ec8-9dc7-d020a00ca231\") " Oct 08 22:09:50 crc kubenswrapper[4739]: I1008 22:09:50.956352 4739 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:50 crc kubenswrapper[4739]: I1008 22:09:50.956366 4739 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:50 crc kubenswrapper[4739]: I1008 22:09:50.960332 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-kube-api-access-qzsjm" (OuterVolumeSpecName: "kube-api-access-qzsjm") pod "a11e83d2-32ae-4ec8-9dc7-d020a00ca231" (UID: "a11e83d2-32ae-4ec8-9dc7-d020a00ca231"). InnerVolumeSpecName "kube-api-access-qzsjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:09:50 crc kubenswrapper[4739]: I1008 22:09:50.960395 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-scripts" (OuterVolumeSpecName: "scripts") pod "a11e83d2-32ae-4ec8-9dc7-d020a00ca231" (UID: "a11e83d2-32ae-4ec8-9dc7-d020a00ca231"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.005515 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a11e83d2-32ae-4ec8-9dc7-d020a00ca231" (UID: "a11e83d2-32ae-4ec8-9dc7-d020a00ca231"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.023765 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a11e83d2-32ae-4ec8-9dc7-d020a00ca231" (UID: "a11e83d2-32ae-4ec8-9dc7-d020a00ca231"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.058819 4739 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.059100 4739 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.059279 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzsjm\" (UniqueName: \"kubernetes.io/projected/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-kube-api-access-qzsjm\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.059402 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.125596 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-config-data" (OuterVolumeSpecName: "config-data") pod "a11e83d2-32ae-4ec8-9dc7-d020a00ca231" (UID: "a11e83d2-32ae-4ec8-9dc7-d020a00ca231"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.160936 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a11e83d2-32ae-4ec8-9dc7-d020a00ca231-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.671876 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-nwcm7" event={"ID":"cd8cd516-b57f-4dc7-913f-fca9eac68452","Type":"ContainerStarted","Data":"2a6a4c7fda63a61e43323201d4ec7a829e757e5bbe9d0a1e0a52359cdd793563"} Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.675003 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a11e83d2-32ae-4ec8-9dc7-d020a00ca231","Type":"ContainerDied","Data":"a5c00fa73855baa2d51cc1f5629a493501de162dd8469f6b689617568384620e"} Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.675287 4739 scope.go:117] "RemoveContainer" containerID="a9af6de0c7f037ce0df7aa1190743268728cd0b53c3387daebd6a4e26df9a303" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.675211 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.714184 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-nwcm7" podStartSLOduration=2.354600033 podStartE2EDuration="10.714112791s" podCreationTimestamp="2025-10-08 22:09:41 +0000 UTC" firstStartedPulling="2025-10-08 22:09:42.111677267 +0000 UTC m=+1281.937063027" lastFinishedPulling="2025-10-08 22:09:50.471190035 +0000 UTC m=+1290.296575785" observedRunningTime="2025-10-08 22:09:51.707023806 +0000 UTC m=+1291.532409596" watchObservedRunningTime="2025-10-08 22:09:51.714112791 +0000 UTC m=+1291.539498571" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.736823 4739 scope.go:117] "RemoveContainer" containerID="d038109cbcdd81fc544a5ed2540c6b33992270354c9264c2dc501d752d363347" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.741645 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.758306 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.779827 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:09:51 crc kubenswrapper[4739]: E1008 22:09:51.780276 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a11e83d2-32ae-4ec8-9dc7-d020a00ca231" containerName="proxy-httpd" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.780293 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="a11e83d2-32ae-4ec8-9dc7-d020a00ca231" containerName="proxy-httpd" Oct 08 22:09:51 crc kubenswrapper[4739]: E1008 22:09:51.780315 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a11e83d2-32ae-4ec8-9dc7-d020a00ca231" containerName="ceilometer-notification-agent" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.780321 4739 
state_mem.go:107] "Deleted CPUSet assignment" podUID="a11e83d2-32ae-4ec8-9dc7-d020a00ca231" containerName="ceilometer-notification-agent" Oct 08 22:09:51 crc kubenswrapper[4739]: E1008 22:09:51.780330 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a11e83d2-32ae-4ec8-9dc7-d020a00ca231" containerName="ceilometer-central-agent" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.780339 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="a11e83d2-32ae-4ec8-9dc7-d020a00ca231" containerName="ceilometer-central-agent" Oct 08 22:09:51 crc kubenswrapper[4739]: E1008 22:09:51.780350 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a11e83d2-32ae-4ec8-9dc7-d020a00ca231" containerName="sg-core" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.780356 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="a11e83d2-32ae-4ec8-9dc7-d020a00ca231" containerName="sg-core" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.780568 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="a11e83d2-32ae-4ec8-9dc7-d020a00ca231" containerName="ceilometer-central-agent" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.780592 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="a11e83d2-32ae-4ec8-9dc7-d020a00ca231" containerName="proxy-httpd" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.780601 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="a11e83d2-32ae-4ec8-9dc7-d020a00ca231" containerName="ceilometer-notification-agent" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.780632 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="a11e83d2-32ae-4ec8-9dc7-d020a00ca231" containerName="sg-core" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.782310 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.786354 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.786649 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.787925 4739 scope.go:117] "RemoveContainer" containerID="066e49770979e14d62ea7e04e7762d3c2e60f8c215ec283f8f6fb7a244ba982a" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.806646 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.840597 4739 scope.go:117] "RemoveContainer" containerID="e53f0e8052c9e27375caf46676a07f9ea22aad1b27f45af89fcfb27852f43fcd" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.840903 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a11e83d2-32ae-4ec8-9dc7-d020a00ca231" path="/var/lib/kubelet/pods/a11e83d2-32ae-4ec8-9dc7-d020a00ca231/volumes" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.976907 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-config-data\") pod \"ceilometer-0\" (UID: \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\") " pod="openstack/ceilometer-0" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.977050 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-run-httpd\") pod \"ceilometer-0\" (UID: \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\") " pod="openstack/ceilometer-0" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.977105 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\") " pod="openstack/ceilometer-0" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.977195 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\") " pod="openstack/ceilometer-0" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.977338 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-scripts\") pod \"ceilometer-0\" (UID: \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\") " pod="openstack/ceilometer-0" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.977386 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-log-httpd\") pod \"ceilometer-0\" (UID: \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\") " pod="openstack/ceilometer-0" Oct 08 22:09:51 crc kubenswrapper[4739]: I1008 22:09:51.977456 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5zv7\" (UniqueName: \"kubernetes.io/projected/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-kube-api-access-w5zv7\") pod \"ceilometer-0\" (UID: \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\") " pod="openstack/ceilometer-0" Oct 08 22:09:52 crc kubenswrapper[4739]: I1008 22:09:52.079250 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\") " pod="openstack/ceilometer-0" Oct 08 22:09:52 crc kubenswrapper[4739]: I1008 22:09:52.079418 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-scripts\") pod \"ceilometer-0\" (UID: \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\") " pod="openstack/ceilometer-0" Oct 08 22:09:52 crc kubenswrapper[4739]: I1008 22:09:52.079467 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-log-httpd\") pod \"ceilometer-0\" (UID: \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\") " pod="openstack/ceilometer-0" Oct 08 22:09:52 crc kubenswrapper[4739]: I1008 22:09:52.079518 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5zv7\" (UniqueName: \"kubernetes.io/projected/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-kube-api-access-w5zv7\") pod \"ceilometer-0\" (UID: \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\") " pod="openstack/ceilometer-0" Oct 08 22:09:52 crc kubenswrapper[4739]: I1008 22:09:52.079653 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-config-data\") pod \"ceilometer-0\" (UID: \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\") " pod="openstack/ceilometer-0" Oct 08 22:09:52 crc kubenswrapper[4739]: I1008 22:09:52.079731 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-run-httpd\") pod \"ceilometer-0\" (UID: \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\") " pod="openstack/ceilometer-0" Oct 08 22:09:52 crc kubenswrapper[4739]: I1008 22:09:52.079793 
4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\") " pod="openstack/ceilometer-0" Oct 08 22:09:52 crc kubenswrapper[4739]: I1008 22:09:52.082090 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-run-httpd\") pod \"ceilometer-0\" (UID: \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\") " pod="openstack/ceilometer-0" Oct 08 22:09:52 crc kubenswrapper[4739]: I1008 22:09:52.082115 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-log-httpd\") pod \"ceilometer-0\" (UID: \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\") " pod="openstack/ceilometer-0" Oct 08 22:09:52 crc kubenswrapper[4739]: I1008 22:09:52.087089 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-scripts\") pod \"ceilometer-0\" (UID: \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\") " pod="openstack/ceilometer-0" Oct 08 22:09:52 crc kubenswrapper[4739]: I1008 22:09:52.087894 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-config-data\") pod \"ceilometer-0\" (UID: \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\") " pod="openstack/ceilometer-0" Oct 08 22:09:52 crc kubenswrapper[4739]: I1008 22:09:52.089838 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\") " 
pod="openstack/ceilometer-0" Oct 08 22:09:52 crc kubenswrapper[4739]: I1008 22:09:52.109717 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\") " pod="openstack/ceilometer-0" Oct 08 22:09:52 crc kubenswrapper[4739]: I1008 22:09:52.116058 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5zv7\" (UniqueName: \"kubernetes.io/projected/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-kube-api-access-w5zv7\") pod \"ceilometer-0\" (UID: \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\") " pod="openstack/ceilometer-0" Oct 08 22:09:52 crc kubenswrapper[4739]: I1008 22:09:52.406894 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:09:52 crc kubenswrapper[4739]: I1008 22:09:52.920770 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:09:53 crc kubenswrapper[4739]: I1008 22:09:53.695296 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7","Type":"ContainerStarted","Data":"16fec44f9d8df509acbefa9a7512a3ee661aee2436f06d93c8f36742cf6615ea"} Oct 08 22:09:53 crc kubenswrapper[4739]: I1008 22:09:53.695353 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7","Type":"ContainerStarted","Data":"bd73912e946492560f056335d7dc263fbd89e5629ed52525b588aae710423864"} Oct 08 22:09:54 crc kubenswrapper[4739]: I1008 22:09:54.711979 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7","Type":"ContainerStarted","Data":"e7b55fb9ee6f958053a25c6f38b4cc495515ad964a27844273b01803237d3b3a"} Oct 08 22:09:55 crc kubenswrapper[4739]: 
I1008 22:09:55.753607 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7","Type":"ContainerStarted","Data":"0374476004beecf62678814abae5c7a34059fd6c4c6abbb81eeec347edad50a7"} Oct 08 22:09:57 crc kubenswrapper[4739]: I1008 22:09:57.786819 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7","Type":"ContainerStarted","Data":"df89ef400afd98d848857483d38183d7bc7bef52e5334f2d5e7f2db8a4681299"} Oct 08 22:09:57 crc kubenswrapper[4739]: I1008 22:09:57.787766 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 22:09:57 crc kubenswrapper[4739]: I1008 22:09:57.828359 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.241749572 podStartE2EDuration="6.828328489s" podCreationTimestamp="2025-10-08 22:09:51 +0000 UTC" firstStartedPulling="2025-10-08 22:09:52.923482191 +0000 UTC m=+1292.748868101" lastFinishedPulling="2025-10-08 22:09:56.510061268 +0000 UTC m=+1296.335447018" observedRunningTime="2025-10-08 22:09:57.818568309 +0000 UTC m=+1297.643954099" watchObservedRunningTime="2025-10-08 22:09:57.828328489 +0000 UTC m=+1297.653714299" Oct 08 22:10:07 crc kubenswrapper[4739]: I1008 22:10:07.916626 4739 generic.go:334] "Generic (PLEG): container finished" podID="cd8cd516-b57f-4dc7-913f-fca9eac68452" containerID="2a6a4c7fda63a61e43323201d4ec7a829e757e5bbe9d0a1e0a52359cdd793563" exitCode=0 Oct 08 22:10:07 crc kubenswrapper[4739]: I1008 22:10:07.916797 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-nwcm7" event={"ID":"cd8cd516-b57f-4dc7-913f-fca9eac68452","Type":"ContainerDied","Data":"2a6a4c7fda63a61e43323201d4ec7a829e757e5bbe9d0a1e0a52359cdd793563"} Oct 08 22:10:09 crc kubenswrapper[4739]: I1008 22:10:09.388979 4739 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-nwcm7" Oct 08 22:10:09 crc kubenswrapper[4739]: I1008 22:10:09.511753 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd8cd516-b57f-4dc7-913f-fca9eac68452-combined-ca-bundle\") pod \"cd8cd516-b57f-4dc7-913f-fca9eac68452\" (UID: \"cd8cd516-b57f-4dc7-913f-fca9eac68452\") " Oct 08 22:10:09 crc kubenswrapper[4739]: I1008 22:10:09.511838 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd8cd516-b57f-4dc7-913f-fca9eac68452-scripts\") pod \"cd8cd516-b57f-4dc7-913f-fca9eac68452\" (UID: \"cd8cd516-b57f-4dc7-913f-fca9eac68452\") " Oct 08 22:10:09 crc kubenswrapper[4739]: I1008 22:10:09.511885 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd8cd516-b57f-4dc7-913f-fca9eac68452-config-data\") pod \"cd8cd516-b57f-4dc7-913f-fca9eac68452\" (UID: \"cd8cd516-b57f-4dc7-913f-fca9eac68452\") " Oct 08 22:10:09 crc kubenswrapper[4739]: I1008 22:10:09.511949 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jjjp\" (UniqueName: \"kubernetes.io/projected/cd8cd516-b57f-4dc7-913f-fca9eac68452-kube-api-access-2jjjp\") pod \"cd8cd516-b57f-4dc7-913f-fca9eac68452\" (UID: \"cd8cd516-b57f-4dc7-913f-fca9eac68452\") " Oct 08 22:10:09 crc kubenswrapper[4739]: I1008 22:10:09.518678 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd8cd516-b57f-4dc7-913f-fca9eac68452-scripts" (OuterVolumeSpecName: "scripts") pod "cd8cd516-b57f-4dc7-913f-fca9eac68452" (UID: "cd8cd516-b57f-4dc7-913f-fca9eac68452"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:10:09 crc kubenswrapper[4739]: I1008 22:10:09.520545 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd8cd516-b57f-4dc7-913f-fca9eac68452-kube-api-access-2jjjp" (OuterVolumeSpecName: "kube-api-access-2jjjp") pod "cd8cd516-b57f-4dc7-913f-fca9eac68452" (UID: "cd8cd516-b57f-4dc7-913f-fca9eac68452"). InnerVolumeSpecName "kube-api-access-2jjjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:10:09 crc kubenswrapper[4739]: I1008 22:10:09.541102 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd8cd516-b57f-4dc7-913f-fca9eac68452-config-data" (OuterVolumeSpecName: "config-data") pod "cd8cd516-b57f-4dc7-913f-fca9eac68452" (UID: "cd8cd516-b57f-4dc7-913f-fca9eac68452"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:10:09 crc kubenswrapper[4739]: I1008 22:10:09.548512 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd8cd516-b57f-4dc7-913f-fca9eac68452-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd8cd516-b57f-4dc7-913f-fca9eac68452" (UID: "cd8cd516-b57f-4dc7-913f-fca9eac68452"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:10:09 crc kubenswrapper[4739]: I1008 22:10:09.614405 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jjjp\" (UniqueName: \"kubernetes.io/projected/cd8cd516-b57f-4dc7-913f-fca9eac68452-kube-api-access-2jjjp\") on node \"crc\" DevicePath \"\"" Oct 08 22:10:09 crc kubenswrapper[4739]: I1008 22:10:09.614453 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd8cd516-b57f-4dc7-913f-fca9eac68452-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:10:09 crc kubenswrapper[4739]: I1008 22:10:09.614464 4739 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd8cd516-b57f-4dc7-913f-fca9eac68452-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:10:09 crc kubenswrapper[4739]: I1008 22:10:09.614475 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd8cd516-b57f-4dc7-913f-fca9eac68452-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:10:09 crc kubenswrapper[4739]: I1008 22:10:09.945669 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-nwcm7" event={"ID":"cd8cd516-b57f-4dc7-913f-fca9eac68452","Type":"ContainerDied","Data":"457e0593025df2b8819ad7ebd7627497a6ad34ed1cf9ea364764d364ad4f88e0"} Oct 08 22:10:09 crc kubenswrapper[4739]: I1008 22:10:09.945727 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="457e0593025df2b8819ad7ebd7627497a6ad34ed1cf9ea364764d364ad4f88e0" Oct 08 22:10:09 crc kubenswrapper[4739]: I1008 22:10:09.945729 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-nwcm7" Oct 08 22:10:10 crc kubenswrapper[4739]: I1008 22:10:10.089178 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 22:10:10 crc kubenswrapper[4739]: E1008 22:10:10.090833 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd8cd516-b57f-4dc7-913f-fca9eac68452" containerName="nova-cell0-conductor-db-sync" Oct 08 22:10:10 crc kubenswrapper[4739]: I1008 22:10:10.090883 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd8cd516-b57f-4dc7-913f-fca9eac68452" containerName="nova-cell0-conductor-db-sync" Oct 08 22:10:10 crc kubenswrapper[4739]: I1008 22:10:10.092267 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd8cd516-b57f-4dc7-913f-fca9eac68452" containerName="nova-cell0-conductor-db-sync" Oct 08 22:10:10 crc kubenswrapper[4739]: I1008 22:10:10.094399 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 08 22:10:10 crc kubenswrapper[4739]: I1008 22:10:10.101065 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 08 22:10:10 crc kubenswrapper[4739]: I1008 22:10:10.101379 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-ntqbj" Oct 08 22:10:10 crc kubenswrapper[4739]: I1008 22:10:10.115625 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 22:10:10 crc kubenswrapper[4739]: I1008 22:10:10.227737 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cprmc\" (UniqueName: \"kubernetes.io/projected/383c8e08-0c0f-41fb-9574-cfa23aa2aad5-kube-api-access-cprmc\") pod \"nova-cell0-conductor-0\" (UID: \"383c8e08-0c0f-41fb-9574-cfa23aa2aad5\") " pod="openstack/nova-cell0-conductor-0" Oct 08 22:10:10 crc 
kubenswrapper[4739]: I1008 22:10:10.228039 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/383c8e08-0c0f-41fb-9574-cfa23aa2aad5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"383c8e08-0c0f-41fb-9574-cfa23aa2aad5\") " pod="openstack/nova-cell0-conductor-0" Oct 08 22:10:10 crc kubenswrapper[4739]: I1008 22:10:10.228219 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/383c8e08-0c0f-41fb-9574-cfa23aa2aad5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"383c8e08-0c0f-41fb-9574-cfa23aa2aad5\") " pod="openstack/nova-cell0-conductor-0" Oct 08 22:10:10 crc kubenswrapper[4739]: I1008 22:10:10.329801 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cprmc\" (UniqueName: \"kubernetes.io/projected/383c8e08-0c0f-41fb-9574-cfa23aa2aad5-kube-api-access-cprmc\") pod \"nova-cell0-conductor-0\" (UID: \"383c8e08-0c0f-41fb-9574-cfa23aa2aad5\") " pod="openstack/nova-cell0-conductor-0" Oct 08 22:10:10 crc kubenswrapper[4739]: I1008 22:10:10.329953 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/383c8e08-0c0f-41fb-9574-cfa23aa2aad5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"383c8e08-0c0f-41fb-9574-cfa23aa2aad5\") " pod="openstack/nova-cell0-conductor-0" Oct 08 22:10:10 crc kubenswrapper[4739]: I1008 22:10:10.329993 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/383c8e08-0c0f-41fb-9574-cfa23aa2aad5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"383c8e08-0c0f-41fb-9574-cfa23aa2aad5\") " pod="openstack/nova-cell0-conductor-0" Oct 08 22:10:10 crc kubenswrapper[4739]: I1008 22:10:10.333963 4739 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/383c8e08-0c0f-41fb-9574-cfa23aa2aad5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"383c8e08-0c0f-41fb-9574-cfa23aa2aad5\") " pod="openstack/nova-cell0-conductor-0" Oct 08 22:10:10 crc kubenswrapper[4739]: I1008 22:10:10.334893 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/383c8e08-0c0f-41fb-9574-cfa23aa2aad5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"383c8e08-0c0f-41fb-9574-cfa23aa2aad5\") " pod="openstack/nova-cell0-conductor-0" Oct 08 22:10:10 crc kubenswrapper[4739]: I1008 22:10:10.353045 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cprmc\" (UniqueName: \"kubernetes.io/projected/383c8e08-0c0f-41fb-9574-cfa23aa2aad5-kube-api-access-cprmc\") pod \"nova-cell0-conductor-0\" (UID: \"383c8e08-0c0f-41fb-9574-cfa23aa2aad5\") " pod="openstack/nova-cell0-conductor-0" Oct 08 22:10:10 crc kubenswrapper[4739]: I1008 22:10:10.422535 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 08 22:10:10 crc kubenswrapper[4739]: I1008 22:10:10.952680 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 22:10:11 crc kubenswrapper[4739]: I1008 22:10:11.970555 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"383c8e08-0c0f-41fb-9574-cfa23aa2aad5","Type":"ContainerStarted","Data":"ea8aae7474ee9f09d664982ecf1a43aedcd8400888da6f7314f2edada0462d6e"} Oct 08 22:10:11 crc kubenswrapper[4739]: I1008 22:10:11.971269 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"383c8e08-0c0f-41fb-9574-cfa23aa2aad5","Type":"ContainerStarted","Data":"0feb1068c66ac2bad3671ae44081e1be343474bb85537b392c1b839c23156eef"} Oct 08 22:10:11 crc kubenswrapper[4739]: I1008 22:10:11.971326 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 08 22:10:12 crc kubenswrapper[4739]: I1008 22:10:12.009893 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.009865761 podStartE2EDuration="2.009865761s" podCreationTimestamp="2025-10-08 22:10:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:10:11.985974343 +0000 UTC m=+1311.811360133" watchObservedRunningTime="2025-10-08 22:10:12.009865761 +0000 UTC m=+1311.835251551" Oct 08 22:10:20 crc kubenswrapper[4739]: I1008 22:10:20.470958 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 08 22:10:20 crc kubenswrapper[4739]: I1008 22:10:20.962671 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-8m5qj"] Oct 08 22:10:20 crc kubenswrapper[4739]: I1008 22:10:20.964803 4739 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8m5qj" Oct 08 22:10:20 crc kubenswrapper[4739]: I1008 22:10:20.971512 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 08 22:10:20 crc kubenswrapper[4739]: I1008 22:10:20.974506 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 08 22:10:20 crc kubenswrapper[4739]: I1008 22:10:20.979404 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-8m5qj"] Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.073673 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/546d5b3a-342d-44f7-a179-12724fc783d0-scripts\") pod \"nova-cell0-cell-mapping-8m5qj\" (UID: \"546d5b3a-342d-44f7-a179-12724fc783d0\") " pod="openstack/nova-cell0-cell-mapping-8m5qj" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.073896 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh9kd\" (UniqueName: \"kubernetes.io/projected/546d5b3a-342d-44f7-a179-12724fc783d0-kube-api-access-sh9kd\") pod \"nova-cell0-cell-mapping-8m5qj\" (UID: \"546d5b3a-342d-44f7-a179-12724fc783d0\") " pod="openstack/nova-cell0-cell-mapping-8m5qj" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.074503 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/546d5b3a-342d-44f7-a179-12724fc783d0-config-data\") pod \"nova-cell0-cell-mapping-8m5qj\" (UID: \"546d5b3a-342d-44f7-a179-12724fc783d0\") " pod="openstack/nova-cell0-cell-mapping-8m5qj" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.074604 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/546d5b3a-342d-44f7-a179-12724fc783d0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-8m5qj\" (UID: \"546d5b3a-342d-44f7-a179-12724fc783d0\") " pod="openstack/nova-cell0-cell-mapping-8m5qj" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.158085 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.159976 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.163515 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.176414 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/546d5b3a-342d-44f7-a179-12724fc783d0-config-data\") pod \"nova-cell0-cell-mapping-8m5qj\" (UID: \"546d5b3a-342d-44f7-a179-12724fc783d0\") " pod="openstack/nova-cell0-cell-mapping-8m5qj" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.176466 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/546d5b3a-342d-44f7-a179-12724fc783d0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-8m5qj\" (UID: \"546d5b3a-342d-44f7-a179-12724fc783d0\") " pod="openstack/nova-cell0-cell-mapping-8m5qj" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.176546 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/546d5b3a-342d-44f7-a179-12724fc783d0-scripts\") pod \"nova-cell0-cell-mapping-8m5qj\" (UID: \"546d5b3a-342d-44f7-a179-12724fc783d0\") " pod="openstack/nova-cell0-cell-mapping-8m5qj" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.176709 4739 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh9kd\" (UniqueName: \"kubernetes.io/projected/546d5b3a-342d-44f7-a179-12724fc783d0-kube-api-access-sh9kd\") pod \"nova-cell0-cell-mapping-8m5qj\" (UID: \"546d5b3a-342d-44f7-a179-12724fc783d0\") " pod="openstack/nova-cell0-cell-mapping-8m5qj" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.183872 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/546d5b3a-342d-44f7-a179-12724fc783d0-scripts\") pod \"nova-cell0-cell-mapping-8m5qj\" (UID: \"546d5b3a-342d-44f7-a179-12724fc783d0\") " pod="openstack/nova-cell0-cell-mapping-8m5qj" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.185603 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/546d5b3a-342d-44f7-a179-12724fc783d0-config-data\") pod \"nova-cell0-cell-mapping-8m5qj\" (UID: \"546d5b3a-342d-44f7-a179-12724fc783d0\") " pod="openstack/nova-cell0-cell-mapping-8m5qj" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.196839 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/546d5b3a-342d-44f7-a179-12724fc783d0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-8m5qj\" (UID: \"546d5b3a-342d-44f7-a179-12724fc783d0\") " pod="openstack/nova-cell0-cell-mapping-8m5qj" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.203785 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.206618 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh9kd\" (UniqueName: \"kubernetes.io/projected/546d5b3a-342d-44f7-a179-12724fc783d0-kube-api-access-sh9kd\") pod \"nova-cell0-cell-mapping-8m5qj\" (UID: \"546d5b3a-342d-44f7-a179-12724fc783d0\") " 
pod="openstack/nova-cell0-cell-mapping-8m5qj" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.296570 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a21704-ce31-4b11-aa96-887a4c5f5cd7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"31a21704-ce31-4b11-aa96-887a4c5f5cd7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.297069 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a21704-ce31-4b11-aa96-887a4c5f5cd7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"31a21704-ce31-4b11-aa96-887a4c5f5cd7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.297227 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2pbl\" (UniqueName: \"kubernetes.io/projected/31a21704-ce31-4b11-aa96-887a4c5f5cd7-kube-api-access-b2pbl\") pod \"nova-cell1-novncproxy-0\" (UID: \"31a21704-ce31-4b11-aa96-887a4c5f5cd7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.297085 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8m5qj" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.325938 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.345337 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.355577 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.379852 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.404245 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a21704-ce31-4b11-aa96-887a4c5f5cd7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"31a21704-ce31-4b11-aa96-887a4c5f5cd7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.404462 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a21704-ce31-4b11-aa96-887a4c5f5cd7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"31a21704-ce31-4b11-aa96-887a4c5f5cd7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.404607 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2pbl\" (UniqueName: \"kubernetes.io/projected/31a21704-ce31-4b11-aa96-887a4c5f5cd7-kube-api-access-b2pbl\") pod \"nova-cell1-novncproxy-0\" (UID: \"31a21704-ce31-4b11-aa96-887a4c5f5cd7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.415823 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a21704-ce31-4b11-aa96-887a4c5f5cd7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"31a21704-ce31-4b11-aa96-887a4c5f5cd7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.423845 4739 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a21704-ce31-4b11-aa96-887a4c5f5cd7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"31a21704-ce31-4b11-aa96-887a4c5f5cd7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.456803 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2pbl\" (UniqueName: \"kubernetes.io/projected/31a21704-ce31-4b11-aa96-887a4c5f5cd7-kube-api-access-b2pbl\") pod \"nova-cell1-novncproxy-0\" (UID: \"31a21704-ce31-4b11-aa96-887a4c5f5cd7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.474844 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.476689 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.482014 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.487615 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.508962 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.510033 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/129fe9b3-6068-4d74-88a5-0a90733cd89e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"129fe9b3-6068-4d74-88a5-0a90733cd89e\") " pod="openstack/nova-metadata-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.510127 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/129fe9b3-6068-4d74-88a5-0a90733cd89e-logs\") pod \"nova-metadata-0\" (UID: \"129fe9b3-6068-4d74-88a5-0a90733cd89e\") " pod="openstack/nova-metadata-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.510164 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/129fe9b3-6068-4d74-88a5-0a90733cd89e-config-data\") pod \"nova-metadata-0\" (UID: \"129fe9b3-6068-4d74-88a5-0a90733cd89e\") " pod="openstack/nova-metadata-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.510237 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nfg9\" (UniqueName: \"kubernetes.io/projected/129fe9b3-6068-4d74-88a5-0a90733cd89e-kube-api-access-5nfg9\") pod \"nova-metadata-0\" (UID: \"129fe9b3-6068-4d74-88a5-0a90733cd89e\") " pod="openstack/nova-metadata-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.530218 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.531532 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.561923 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.573365 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.617827 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e698574-e76d-4a75-b963-f8849155573f-logs\") pod \"nova-api-0\" (UID: \"7e698574-e76d-4a75-b963-f8849155573f\") " pod="openstack/nova-api-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.617922 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e698574-e76d-4a75-b963-f8849155573f-config-data\") pod \"nova-api-0\" (UID: \"7e698574-e76d-4a75-b963-f8849155573f\") " pod="openstack/nova-api-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.617969 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/129fe9b3-6068-4d74-88a5-0a90733cd89e-logs\") pod \"nova-metadata-0\" (UID: \"129fe9b3-6068-4d74-88a5-0a90733cd89e\") " pod="openstack/nova-metadata-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.618013 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/129fe9b3-6068-4d74-88a5-0a90733cd89e-config-data\") pod \"nova-metadata-0\" (UID: \"129fe9b3-6068-4d74-88a5-0a90733cd89e\") " pod="openstack/nova-metadata-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.618103 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ca39eccb-9037-47a5-82f7-e83f5f4fa01e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ca39eccb-9037-47a5-82f7-e83f5f4fa01e\") " pod="openstack/nova-scheduler-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.618159 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e698574-e76d-4a75-b963-f8849155573f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7e698574-e76d-4a75-b963-f8849155573f\") " pod="openstack/nova-api-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.618223 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4rkh\" (UniqueName: \"kubernetes.io/projected/7e698574-e76d-4a75-b963-f8849155573f-kube-api-access-c4rkh\") pod \"nova-api-0\" (UID: \"7e698574-e76d-4a75-b963-f8849155573f\") " pod="openstack/nova-api-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.618278 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nfg9\" (UniqueName: \"kubernetes.io/projected/129fe9b3-6068-4d74-88a5-0a90733cd89e-kube-api-access-5nfg9\") pod \"nova-metadata-0\" (UID: \"129fe9b3-6068-4d74-88a5-0a90733cd89e\") " pod="openstack/nova-metadata-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.618346 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/129fe9b3-6068-4d74-88a5-0a90733cd89e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"129fe9b3-6068-4d74-88a5-0a90733cd89e\") " pod="openstack/nova-metadata-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.618398 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca39eccb-9037-47a5-82f7-e83f5f4fa01e-config-data\") pod \"nova-scheduler-0\" (UID: 
\"ca39eccb-9037-47a5-82f7-e83f5f4fa01e\") " pod="openstack/nova-scheduler-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.618450 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4trzn\" (UniqueName: \"kubernetes.io/projected/ca39eccb-9037-47a5-82f7-e83f5f4fa01e-kube-api-access-4trzn\") pod \"nova-scheduler-0\" (UID: \"ca39eccb-9037-47a5-82f7-e83f5f4fa01e\") " pod="openstack/nova-scheduler-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.618759 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/129fe9b3-6068-4d74-88a5-0a90733cd89e-logs\") pod \"nova-metadata-0\" (UID: \"129fe9b3-6068-4d74-88a5-0a90733cd89e\") " pod="openstack/nova-metadata-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.637238 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/129fe9b3-6068-4d74-88a5-0a90733cd89e-config-data\") pod \"nova-metadata-0\" (UID: \"129fe9b3-6068-4d74-88a5-0a90733cd89e\") " pod="openstack/nova-metadata-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.667887 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/129fe9b3-6068-4d74-88a5-0a90733cd89e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"129fe9b3-6068-4d74-88a5-0a90733cd89e\") " pod="openstack/nova-metadata-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.680094 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nfg9\" (UniqueName: \"kubernetes.io/projected/129fe9b3-6068-4d74-88a5-0a90733cd89e-kube-api-access-5nfg9\") pod \"nova-metadata-0\" (UID: \"129fe9b3-6068-4d74-88a5-0a90733cd89e\") " pod="openstack/nova-metadata-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.684428 4739 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-845d6d6f59-9lc2r"] Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.693803 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-9lc2r" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.708459 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-9lc2r"] Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.720268 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-9lc2r\" (UID: \"6043ead9-7b57-4d77-a1f6-71c7450bb6e6\") " pod="openstack/dnsmasq-dns-845d6d6f59-9lc2r" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.720316 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e698574-e76d-4a75-b963-f8849155573f-logs\") pod \"nova-api-0\" (UID: \"7e698574-e76d-4a75-b963-f8849155573f\") " pod="openstack/nova-api-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.720338 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-9lc2r\" (UID: \"6043ead9-7b57-4d77-a1f6-71c7450bb6e6\") " pod="openstack/dnsmasq-dns-845d6d6f59-9lc2r" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.720374 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e698574-e76d-4a75-b963-f8849155573f-config-data\") pod \"nova-api-0\" (UID: \"7e698574-e76d-4a75-b963-f8849155573f\") " pod="openstack/nova-api-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.720434 4739 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca39eccb-9037-47a5-82f7-e83f5f4fa01e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ca39eccb-9037-47a5-82f7-e83f5f4fa01e\") " pod="openstack/nova-scheduler-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.720453 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e698574-e76d-4a75-b963-f8849155573f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7e698574-e76d-4a75-b963-f8849155573f\") " pod="openstack/nova-api-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.720473 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-9lc2r\" (UID: \"6043ead9-7b57-4d77-a1f6-71c7450bb6e6\") " pod="openstack/dnsmasq-dns-845d6d6f59-9lc2r" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.720504 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-9lc2r\" (UID: \"6043ead9-7b57-4d77-a1f6-71c7450bb6e6\") " pod="openstack/dnsmasq-dns-845d6d6f59-9lc2r" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.720524 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4rkh\" (UniqueName: \"kubernetes.io/projected/7e698574-e76d-4a75-b963-f8849155573f-kube-api-access-c4rkh\") pod \"nova-api-0\" (UID: \"7e698574-e76d-4a75-b963-f8849155573f\") " pod="openstack/nova-api-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.720558 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-config\") pod \"dnsmasq-dns-845d6d6f59-9lc2r\" (UID: \"6043ead9-7b57-4d77-a1f6-71c7450bb6e6\") " pod="openstack/dnsmasq-dns-845d6d6f59-9lc2r" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.720596 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h87lk\" (UniqueName: \"kubernetes.io/projected/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-kube-api-access-h87lk\") pod \"dnsmasq-dns-845d6d6f59-9lc2r\" (UID: \"6043ead9-7b57-4d77-a1f6-71c7450bb6e6\") " pod="openstack/dnsmasq-dns-845d6d6f59-9lc2r" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.720831 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca39eccb-9037-47a5-82f7-e83f5f4fa01e-config-data\") pod \"nova-scheduler-0\" (UID: \"ca39eccb-9037-47a5-82f7-e83f5f4fa01e\") " pod="openstack/nova-scheduler-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.720872 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4trzn\" (UniqueName: \"kubernetes.io/projected/ca39eccb-9037-47a5-82f7-e83f5f4fa01e-kube-api-access-4trzn\") pod \"nova-scheduler-0\" (UID: \"ca39eccb-9037-47a5-82f7-e83f5f4fa01e\") " pod="openstack/nova-scheduler-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.721572 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e698574-e76d-4a75-b963-f8849155573f-logs\") pod \"nova-api-0\" (UID: \"7e698574-e76d-4a75-b963-f8849155573f\") " pod="openstack/nova-api-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.742719 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca39eccb-9037-47a5-82f7-e83f5f4fa01e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"ca39eccb-9037-47a5-82f7-e83f5f4fa01e\") " pod="openstack/nova-scheduler-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.752094 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e698574-e76d-4a75-b963-f8849155573f-config-data\") pod \"nova-api-0\" (UID: \"7e698574-e76d-4a75-b963-f8849155573f\") " pod="openstack/nova-api-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.752551 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca39eccb-9037-47a5-82f7-e83f5f4fa01e-config-data\") pod \"nova-scheduler-0\" (UID: \"ca39eccb-9037-47a5-82f7-e83f5f4fa01e\") " pod="openstack/nova-scheduler-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.757781 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4trzn\" (UniqueName: \"kubernetes.io/projected/ca39eccb-9037-47a5-82f7-e83f5f4fa01e-kube-api-access-4trzn\") pod \"nova-scheduler-0\" (UID: \"ca39eccb-9037-47a5-82f7-e83f5f4fa01e\") " pod="openstack/nova-scheduler-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.761585 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e698574-e76d-4a75-b963-f8849155573f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7e698574-e76d-4a75-b963-f8849155573f\") " pod="openstack/nova-api-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.765691 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4rkh\" (UniqueName: \"kubernetes.io/projected/7e698574-e76d-4a75-b963-f8849155573f-kube-api-access-c4rkh\") pod \"nova-api-0\" (UID: \"7e698574-e76d-4a75-b963-f8849155573f\") " pod="openstack/nova-api-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.804524 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.825918 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-9lc2r\" (UID: \"6043ead9-7b57-4d77-a1f6-71c7450bb6e6\") " pod="openstack/dnsmasq-dns-845d6d6f59-9lc2r" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.827031 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-9lc2r\" (UID: \"6043ead9-7b57-4d77-a1f6-71c7450bb6e6\") " pod="openstack/dnsmasq-dns-845d6d6f59-9lc2r" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.832536 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-9lc2r\" (UID: \"6043ead9-7b57-4d77-a1f6-71c7450bb6e6\") " pod="openstack/dnsmasq-dns-845d6d6f59-9lc2r" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.832951 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-config\") pod \"dnsmasq-dns-845d6d6f59-9lc2r\" (UID: \"6043ead9-7b57-4d77-a1f6-71c7450bb6e6\") " pod="openstack/dnsmasq-dns-845d6d6f59-9lc2r" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.833068 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h87lk\" (UniqueName: \"kubernetes.io/projected/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-kube-api-access-h87lk\") pod \"dnsmasq-dns-845d6d6f59-9lc2r\" (UID: \"6043ead9-7b57-4d77-a1f6-71c7450bb6e6\") " pod="openstack/dnsmasq-dns-845d6d6f59-9lc2r" Oct 08 22:10:21 crc 
kubenswrapper[4739]: I1008 22:10:21.833216 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-9lc2r\" (UID: \"6043ead9-7b57-4d77-a1f6-71c7450bb6e6\") " pod="openstack/dnsmasq-dns-845d6d6f59-9lc2r" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.833261 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-9lc2r\" (UID: \"6043ead9-7b57-4d77-a1f6-71c7450bb6e6\") " pod="openstack/dnsmasq-dns-845d6d6f59-9lc2r" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.834378 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.838898 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-9lc2r\" (UID: \"6043ead9-7b57-4d77-a1f6-71c7450bb6e6\") " pod="openstack/dnsmasq-dns-845d6d6f59-9lc2r" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.842800 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-config\") pod \"dnsmasq-dns-845d6d6f59-9lc2r\" (UID: \"6043ead9-7b57-4d77-a1f6-71c7450bb6e6\") " pod="openstack/dnsmasq-dns-845d6d6f59-9lc2r" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.843655 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-9lc2r\" (UID: 
\"6043ead9-7b57-4d77-a1f6-71c7450bb6e6\") " pod="openstack/dnsmasq-dns-845d6d6f59-9lc2r" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.844077 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-9lc2r\" (UID: \"6043ead9-7b57-4d77-a1f6-71c7450bb6e6\") " pod="openstack/dnsmasq-dns-845d6d6f59-9lc2r" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.861829 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h87lk\" (UniqueName: \"kubernetes.io/projected/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-kube-api-access-h87lk\") pod \"dnsmasq-dns-845d6d6f59-9lc2r\" (UID: \"6043ead9-7b57-4d77-a1f6-71c7450bb6e6\") " pod="openstack/dnsmasq-dns-845d6d6f59-9lc2r" Oct 08 22:10:21 crc kubenswrapper[4739]: I1008 22:10:21.891739 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 22:10:22 crc kubenswrapper[4739]: I1008 22:10:22.018195 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-9lc2r" Oct 08 22:10:22 crc kubenswrapper[4739]: I1008 22:10:22.221269 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-8m5qj"] Oct 08 22:10:22 crc kubenswrapper[4739]: I1008 22:10:22.320240 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 22:10:22 crc kubenswrapper[4739]: I1008 22:10:22.415032 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:10:22 crc kubenswrapper[4739]: I1008 22:10:22.420433 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 08 22:10:22 crc kubenswrapper[4739]: W1008 22:10:22.456645 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod129fe9b3_6068_4d74_88a5_0a90733cd89e.slice/crio-fa871e2c4050a5595d5128fe110309d7a78a24fc2bb3380660b3747c45605b58 WatchSource:0}: Error finding container fa871e2c4050a5595d5128fe110309d7a78a24fc2bb3380660b3747c45605b58: Status 404 returned error can't find the container with id fa871e2c4050a5595d5128fe110309d7a78a24fc2bb3380660b3747c45605b58 Oct 08 22:10:22 crc kubenswrapper[4739]: I1008 22:10:22.486525 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-9lc2r"] Oct 08 22:10:22 crc kubenswrapper[4739]: I1008 22:10:22.555663 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5zvg9"] Oct 08 22:10:22 crc kubenswrapper[4739]: I1008 22:10:22.556987 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5zvg9" Oct 08 22:10:22 crc kubenswrapper[4739]: I1008 22:10:22.570593 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 08 22:10:22 crc kubenswrapper[4739]: I1008 22:10:22.570676 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 08 22:10:22 crc kubenswrapper[4739]: I1008 22:10:22.591373 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5zvg9"] Oct 08 22:10:22 crc kubenswrapper[4739]: I1008 22:10:22.604858 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 22:10:22 crc kubenswrapper[4739]: I1008 22:10:22.617581 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:10:22 crc kubenswrapper[4739]: I1008 22:10:22.661544 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4a90847-544d-45f9-b1c1-862b13309b66-scripts\") pod \"nova-cell1-conductor-db-sync-5zvg9\" (UID: \"b4a90847-544d-45f9-b1c1-862b13309b66\") " pod="openstack/nova-cell1-conductor-db-sync-5zvg9" Oct 08 22:10:22 crc kubenswrapper[4739]: I1008 22:10:22.661589 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4a90847-544d-45f9-b1c1-862b13309b66-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5zvg9\" (UID: \"b4a90847-544d-45f9-b1c1-862b13309b66\") " pod="openstack/nova-cell1-conductor-db-sync-5zvg9" Oct 08 22:10:22 crc kubenswrapper[4739]: I1008 22:10:22.661657 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4a90847-544d-45f9-b1c1-862b13309b66-config-data\") pod 
\"nova-cell1-conductor-db-sync-5zvg9\" (UID: \"b4a90847-544d-45f9-b1c1-862b13309b66\") " pod="openstack/nova-cell1-conductor-db-sync-5zvg9" Oct 08 22:10:22 crc kubenswrapper[4739]: I1008 22:10:22.661677 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckctg\" (UniqueName: \"kubernetes.io/projected/b4a90847-544d-45f9-b1c1-862b13309b66-kube-api-access-ckctg\") pod \"nova-cell1-conductor-db-sync-5zvg9\" (UID: \"b4a90847-544d-45f9-b1c1-862b13309b66\") " pod="openstack/nova-cell1-conductor-db-sync-5zvg9" Oct 08 22:10:22 crc kubenswrapper[4739]: I1008 22:10:22.762959 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4a90847-544d-45f9-b1c1-862b13309b66-config-data\") pod \"nova-cell1-conductor-db-sync-5zvg9\" (UID: \"b4a90847-544d-45f9-b1c1-862b13309b66\") " pod="openstack/nova-cell1-conductor-db-sync-5zvg9" Oct 08 22:10:22 crc kubenswrapper[4739]: I1008 22:10:22.762999 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckctg\" (UniqueName: \"kubernetes.io/projected/b4a90847-544d-45f9-b1c1-862b13309b66-kube-api-access-ckctg\") pod \"nova-cell1-conductor-db-sync-5zvg9\" (UID: \"b4a90847-544d-45f9-b1c1-862b13309b66\") " pod="openstack/nova-cell1-conductor-db-sync-5zvg9" Oct 08 22:10:22 crc kubenswrapper[4739]: I1008 22:10:22.763102 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4a90847-544d-45f9-b1c1-862b13309b66-scripts\") pod \"nova-cell1-conductor-db-sync-5zvg9\" (UID: \"b4a90847-544d-45f9-b1c1-862b13309b66\") " pod="openstack/nova-cell1-conductor-db-sync-5zvg9" Oct 08 22:10:22 crc kubenswrapper[4739]: I1008 22:10:22.763122 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b4a90847-544d-45f9-b1c1-862b13309b66-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5zvg9\" (UID: \"b4a90847-544d-45f9-b1c1-862b13309b66\") " pod="openstack/nova-cell1-conductor-db-sync-5zvg9" Oct 08 22:10:22 crc kubenswrapper[4739]: I1008 22:10:22.767037 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4a90847-544d-45f9-b1c1-862b13309b66-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5zvg9\" (UID: \"b4a90847-544d-45f9-b1c1-862b13309b66\") " pod="openstack/nova-cell1-conductor-db-sync-5zvg9" Oct 08 22:10:22 crc kubenswrapper[4739]: I1008 22:10:22.767389 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4a90847-544d-45f9-b1c1-862b13309b66-scripts\") pod \"nova-cell1-conductor-db-sync-5zvg9\" (UID: \"b4a90847-544d-45f9-b1c1-862b13309b66\") " pod="openstack/nova-cell1-conductor-db-sync-5zvg9" Oct 08 22:10:22 crc kubenswrapper[4739]: I1008 22:10:22.767865 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4a90847-544d-45f9-b1c1-862b13309b66-config-data\") pod \"nova-cell1-conductor-db-sync-5zvg9\" (UID: \"b4a90847-544d-45f9-b1c1-862b13309b66\") " pod="openstack/nova-cell1-conductor-db-sync-5zvg9" Oct 08 22:10:22 crc kubenswrapper[4739]: I1008 22:10:22.784241 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckctg\" (UniqueName: \"kubernetes.io/projected/b4a90847-544d-45f9-b1c1-862b13309b66-kube-api-access-ckctg\") pod \"nova-cell1-conductor-db-sync-5zvg9\" (UID: \"b4a90847-544d-45f9-b1c1-862b13309b66\") " pod="openstack/nova-cell1-conductor-db-sync-5zvg9" Oct 08 22:10:22 crc kubenswrapper[4739]: I1008 22:10:22.886955 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5zvg9" Oct 08 22:10:23 crc kubenswrapper[4739]: I1008 22:10:23.124211 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"129fe9b3-6068-4d74-88a5-0a90733cd89e","Type":"ContainerStarted","Data":"fa871e2c4050a5595d5128fe110309d7a78a24fc2bb3380660b3747c45605b58"} Oct 08 22:10:23 crc kubenswrapper[4739]: I1008 22:10:23.130004 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8m5qj" event={"ID":"546d5b3a-342d-44f7-a179-12724fc783d0","Type":"ContainerStarted","Data":"5086c7809928752cf03517b21200024f71f295cfb4e5f17e1ab8a62bcf86a04f"} Oct 08 22:10:23 crc kubenswrapper[4739]: I1008 22:10:23.130048 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8m5qj" event={"ID":"546d5b3a-342d-44f7-a179-12724fc783d0","Type":"ContainerStarted","Data":"4ecd064a42143e5e8a2523ee8bedc4f48b917ad722547f6bc24c8cb95579a302"} Oct 08 22:10:23 crc kubenswrapper[4739]: I1008 22:10:23.138947 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ca39eccb-9037-47a5-82f7-e83f5f4fa01e","Type":"ContainerStarted","Data":"98b12ea4171893a3c171bcce1ed56067948e9671445c553303012ca125c98b0f"} Oct 08 22:10:23 crc kubenswrapper[4739]: I1008 22:10:23.143557 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"31a21704-ce31-4b11-aa96-887a4c5f5cd7","Type":"ContainerStarted","Data":"58e69bfb722cb8bc15abd5819cb3b2d280b4a0b5815801987a7ff940e9595d72"} Oct 08 22:10:23 crc kubenswrapper[4739]: I1008 22:10:23.150862 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7e698574-e76d-4a75-b963-f8849155573f","Type":"ContainerStarted","Data":"d0e9962612efd0e313c18b8a8c50edaf5bda221460a927dd2727a497f87ee8e8"} Oct 08 22:10:23 crc kubenswrapper[4739]: I1008 22:10:23.153421 4739 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-8m5qj" podStartSLOduration=3.153402614 podStartE2EDuration="3.153402614s" podCreationTimestamp="2025-10-08 22:10:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:10:23.147387326 +0000 UTC m=+1322.972773066" watchObservedRunningTime="2025-10-08 22:10:23.153402614 +0000 UTC m=+1322.978788364" Oct 08 22:10:23 crc kubenswrapper[4739]: I1008 22:10:23.154558 4739 generic.go:334] "Generic (PLEG): container finished" podID="6043ead9-7b57-4d77-a1f6-71c7450bb6e6" containerID="b5b2ed0741c521a5f00e33b86613889e405e1446570cb2be904ad81e9361981e" exitCode=0 Oct 08 22:10:23 crc kubenswrapper[4739]: I1008 22:10:23.154605 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-9lc2r" event={"ID":"6043ead9-7b57-4d77-a1f6-71c7450bb6e6","Type":"ContainerDied","Data":"b5b2ed0741c521a5f00e33b86613889e405e1446570cb2be904ad81e9361981e"} Oct 08 22:10:23 crc kubenswrapper[4739]: I1008 22:10:23.154631 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-9lc2r" event={"ID":"6043ead9-7b57-4d77-a1f6-71c7450bb6e6","Type":"ContainerStarted","Data":"600bc4e5023de9321d93b6440b031366a8fe2138469c4b6934329af9ccbe6b0f"} Oct 08 22:10:23 crc kubenswrapper[4739]: I1008 22:10:23.343940 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5zvg9"] Oct 08 22:10:24 crc kubenswrapper[4739]: I1008 22:10:24.180312 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-9lc2r" event={"ID":"6043ead9-7b57-4d77-a1f6-71c7450bb6e6","Type":"ContainerStarted","Data":"f0666cfd80ec906f9b4419c0aceebf4dae2bd89ac6780f4fb59d69aa316ac9aa"} Oct 08 22:10:24 crc kubenswrapper[4739]: I1008 22:10:24.181064 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/dnsmasq-dns-845d6d6f59-9lc2r" Oct 08 22:10:24 crc kubenswrapper[4739]: I1008 22:10:24.183259 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5zvg9" event={"ID":"b4a90847-544d-45f9-b1c1-862b13309b66","Type":"ContainerStarted","Data":"ac36c2acfc0d70ec251af785a08a99591027cca49a1d0bbc24a22c0fdc93bdff"} Oct 08 22:10:24 crc kubenswrapper[4739]: I1008 22:10:24.183330 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5zvg9" event={"ID":"b4a90847-544d-45f9-b1c1-862b13309b66","Type":"ContainerStarted","Data":"12f23d94c64bd193bcb1f7d396ce32f9003db85dd33df9c8fd18010c18a40bf8"} Oct 08 22:10:24 crc kubenswrapper[4739]: I1008 22:10:24.201091 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-845d6d6f59-9lc2r" podStartSLOduration=3.201072382 podStartE2EDuration="3.201072382s" podCreationTimestamp="2025-10-08 22:10:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:10:24.198782365 +0000 UTC m=+1324.024168115" watchObservedRunningTime="2025-10-08 22:10:24.201072382 +0000 UTC m=+1324.026458132" Oct 08 22:10:24 crc kubenswrapper[4739]: I1008 22:10:24.226515 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-5zvg9" podStartSLOduration=2.226497477 podStartE2EDuration="2.226497477s" podCreationTimestamp="2025-10-08 22:10:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:10:24.224599521 +0000 UTC m=+1324.049985271" watchObservedRunningTime="2025-10-08 22:10:24.226497477 +0000 UTC m=+1324.051883217" Oct 08 22:10:25 crc kubenswrapper[4739]: I1008 22:10:25.299365 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 22:10:25 crc kubenswrapper[4739]: I1008 22:10:25.312445 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:10:28 crc kubenswrapper[4739]: I1008 22:10:28.023504 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 22:10:28 crc kubenswrapper[4739]: I1008 22:10:28.024086 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="868268e8-4f60-4c9b-aa4d-7239fae44090" containerName="kube-state-metrics" containerID="cri-o://576b22ec8a8136b4908a088523224d9593320cdbbc2c82c7ef5afd65a040a8fc" gracePeriod=30 Oct 08 22:10:28 crc kubenswrapper[4739]: I1008 22:10:28.236713 4739 generic.go:334] "Generic (PLEG): container finished" podID="868268e8-4f60-4c9b-aa4d-7239fae44090" containerID="576b22ec8a8136b4908a088523224d9593320cdbbc2c82c7ef5afd65a040a8fc" exitCode=2 Oct 08 22:10:28 crc kubenswrapper[4739]: I1008 22:10:28.236821 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"868268e8-4f60-4c9b-aa4d-7239fae44090","Type":"ContainerDied","Data":"576b22ec8a8136b4908a088523224d9593320cdbbc2c82c7ef5afd65a040a8fc"} Oct 08 22:10:29 crc kubenswrapper[4739]: I1008 22:10:29.823653 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 22:10:29 crc kubenswrapper[4739]: I1008 22:10:29.942106 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmwfm\" (UniqueName: \"kubernetes.io/projected/868268e8-4f60-4c9b-aa4d-7239fae44090-kube-api-access-bmwfm\") pod \"868268e8-4f60-4c9b-aa4d-7239fae44090\" (UID: \"868268e8-4f60-4c9b-aa4d-7239fae44090\") " Oct 08 22:10:29 crc kubenswrapper[4739]: I1008 22:10:29.960479 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/868268e8-4f60-4c9b-aa4d-7239fae44090-kube-api-access-bmwfm" (OuterVolumeSpecName: "kube-api-access-bmwfm") pod "868268e8-4f60-4c9b-aa4d-7239fae44090" (UID: "868268e8-4f60-4c9b-aa4d-7239fae44090"). InnerVolumeSpecName "kube-api-access-bmwfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.045716 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmwfm\" (UniqueName: \"kubernetes.io/projected/868268e8-4f60-4c9b-aa4d-7239fae44090-kube-api-access-bmwfm\") on node \"crc\" DevicePath \"\"" Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.263037 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ca39eccb-9037-47a5-82f7-e83f5f4fa01e","Type":"ContainerStarted","Data":"59fed5dee9afbd8de208186d8b87b1170bf83523ff20911ebd1b14e7f7326315"} Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.264926 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"31a21704-ce31-4b11-aa96-887a4c5f5cd7","Type":"ContainerStarted","Data":"f83b40f773ac326aeca098563ba9d11868cb884d5400a920ca82927a65173646"} Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.266712 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"7e698574-e76d-4a75-b963-f8849155573f","Type":"ContainerStarted","Data":"40d1396532872efaf35e5279d0a81fae1c7f65493bf3d17c48202dfedddb24b8"} Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.268358 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"868268e8-4f60-4c9b-aa4d-7239fae44090","Type":"ContainerDied","Data":"d6d1426ca0fe6c96a2a0dada67ba6c0b9858396587e5a243320c190e73a01306"} Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.268397 4739 scope.go:117] "RemoveContainer" containerID="576b22ec8a8136b4908a088523224d9593320cdbbc2c82c7ef5afd65a040a8fc" Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.268354 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.271710 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"129fe9b3-6068-4d74-88a5-0a90733cd89e","Type":"ContainerStarted","Data":"cc2fbe463040b83a37f00e2510e03e74720f33c3d9535bd4ab9fbde82e1632ac"} Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.348323 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.348607 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7" containerName="ceilometer-central-agent" containerID="cri-o://16fec44f9d8df509acbefa9a7512a3ee661aee2436f06d93c8f36742cf6615ea" gracePeriod=30 Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.349627 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7" containerName="proxy-httpd" containerID="cri-o://df89ef400afd98d848857483d38183d7bc7bef52e5334f2d5e7f2db8a4681299" gracePeriod=30 Oct 08 22:10:30 crc 
kubenswrapper[4739]: I1008 22:10:30.349678 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7" containerName="sg-core" containerID="cri-o://0374476004beecf62678814abae5c7a34059fd6c4c6abbb81eeec347edad50a7" gracePeriod=30 Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.349709 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7" containerName="ceilometer-notification-agent" containerID="cri-o://e7b55fb9ee6f958053a25c6f38b4cc495515ad964a27844273b01803237d3b3a" gracePeriod=30 Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.650933 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.664628 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.675372 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 22:10:30 crc kubenswrapper[4739]: E1008 22:10:30.675797 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="868268e8-4f60-4c9b-aa4d-7239fae44090" containerName="kube-state-metrics" Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.675815 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="868268e8-4f60-4c9b-aa4d-7239fae44090" containerName="kube-state-metrics" Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.676027 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="868268e8-4f60-4c9b-aa4d-7239fae44090" containerName="kube-state-metrics" Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.676776 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.681173 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.685282 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.720217 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.761855 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/61db13e3-377c-41e4-bc39-bd2314224f6e-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"61db13e3-377c-41e4-bc39-bd2314224f6e\") " pod="openstack/kube-state-metrics-0" Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.762003 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61db13e3-377c-41e4-bc39-bd2314224f6e-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"61db13e3-377c-41e4-bc39-bd2314224f6e\") " pod="openstack/kube-state-metrics-0" Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.762112 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/61db13e3-377c-41e4-bc39-bd2314224f6e-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"61db13e3-377c-41e4-bc39-bd2314224f6e\") " pod="openstack/kube-state-metrics-0" Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.762374 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd2p2\" (UniqueName: 
\"kubernetes.io/projected/61db13e3-377c-41e4-bc39-bd2314224f6e-kube-api-access-zd2p2\") pod \"kube-state-metrics-0\" (UID: \"61db13e3-377c-41e4-bc39-bd2314224f6e\") " pod="openstack/kube-state-metrics-0" Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.866118 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd2p2\" (UniqueName: \"kubernetes.io/projected/61db13e3-377c-41e4-bc39-bd2314224f6e-kube-api-access-zd2p2\") pod \"kube-state-metrics-0\" (UID: \"61db13e3-377c-41e4-bc39-bd2314224f6e\") " pod="openstack/kube-state-metrics-0" Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.866434 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/61db13e3-377c-41e4-bc39-bd2314224f6e-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"61db13e3-377c-41e4-bc39-bd2314224f6e\") " pod="openstack/kube-state-metrics-0" Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.866493 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61db13e3-377c-41e4-bc39-bd2314224f6e-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"61db13e3-377c-41e4-bc39-bd2314224f6e\") " pod="openstack/kube-state-metrics-0" Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.866583 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/61db13e3-377c-41e4-bc39-bd2314224f6e-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"61db13e3-377c-41e4-bc39-bd2314224f6e\") " pod="openstack/kube-state-metrics-0" Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.873524 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/61db13e3-377c-41e4-bc39-bd2314224f6e-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"61db13e3-377c-41e4-bc39-bd2314224f6e\") " pod="openstack/kube-state-metrics-0" Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.874321 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61db13e3-377c-41e4-bc39-bd2314224f6e-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"61db13e3-377c-41e4-bc39-bd2314224f6e\") " pod="openstack/kube-state-metrics-0" Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.878733 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/61db13e3-377c-41e4-bc39-bd2314224f6e-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"61db13e3-377c-41e4-bc39-bd2314224f6e\") " pod="openstack/kube-state-metrics-0" Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.892432 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd2p2\" (UniqueName: \"kubernetes.io/projected/61db13e3-377c-41e4-bc39-bd2314224f6e-kube-api-access-zd2p2\") pod \"kube-state-metrics-0\" (UID: \"61db13e3-377c-41e4-bc39-bd2314224f6e\") " pod="openstack/kube-state-metrics-0" Oct 08 22:10:30 crc kubenswrapper[4739]: I1008 22:10:30.996078 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 22:10:31 crc kubenswrapper[4739]: I1008 22:10:31.290223 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"129fe9b3-6068-4d74-88a5-0a90733cd89e","Type":"ContainerStarted","Data":"9ddb1f04ed8add441c444387abad491cd5e734ad670ed9aedeb992df155a8e4b"} Oct 08 22:10:31 crc kubenswrapper[4739]: I1008 22:10:31.290779 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="129fe9b3-6068-4d74-88a5-0a90733cd89e" containerName="nova-metadata-log" containerID="cri-o://cc2fbe463040b83a37f00e2510e03e74720f33c3d9535bd4ab9fbde82e1632ac" gracePeriod=30 Oct 08 22:10:31 crc kubenswrapper[4739]: I1008 22:10:31.291218 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="129fe9b3-6068-4d74-88a5-0a90733cd89e" containerName="nova-metadata-metadata" containerID="cri-o://9ddb1f04ed8add441c444387abad491cd5e734ad670ed9aedeb992df155a8e4b" gracePeriod=30 Oct 08 22:10:31 crc kubenswrapper[4739]: I1008 22:10:31.302064 4739 generic.go:334] "Generic (PLEG): container finished" podID="e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7" containerID="df89ef400afd98d848857483d38183d7bc7bef52e5334f2d5e7f2db8a4681299" exitCode=0 Oct 08 22:10:31 crc kubenswrapper[4739]: I1008 22:10:31.302097 4739 generic.go:334] "Generic (PLEG): container finished" podID="e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7" containerID="0374476004beecf62678814abae5c7a34059fd6c4c6abbb81eeec347edad50a7" exitCode=2 Oct 08 22:10:31 crc kubenswrapper[4739]: I1008 22:10:31.302106 4739 generic.go:334] "Generic (PLEG): container finished" podID="e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7" containerID="16fec44f9d8df509acbefa9a7512a3ee661aee2436f06d93c8f36742cf6615ea" exitCode=0 Oct 08 22:10:31 crc kubenswrapper[4739]: I1008 22:10:31.302183 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7","Type":"ContainerDied","Data":"df89ef400afd98d848857483d38183d7bc7bef52e5334f2d5e7f2db8a4681299"} Oct 08 22:10:31 crc kubenswrapper[4739]: I1008 22:10:31.302211 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7","Type":"ContainerDied","Data":"0374476004beecf62678814abae5c7a34059fd6c4c6abbb81eeec347edad50a7"} Oct 08 22:10:31 crc kubenswrapper[4739]: I1008 22:10:31.302244 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7","Type":"ContainerDied","Data":"16fec44f9d8df509acbefa9a7512a3ee661aee2436f06d93c8f36742cf6615ea"} Oct 08 22:10:31 crc kubenswrapper[4739]: I1008 22:10:31.304041 4739 generic.go:334] "Generic (PLEG): container finished" podID="546d5b3a-342d-44f7-a179-12724fc783d0" containerID="5086c7809928752cf03517b21200024f71f295cfb4e5f17e1ab8a62bcf86a04f" exitCode=0 Oct 08 22:10:31 crc kubenswrapper[4739]: I1008 22:10:31.304111 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8m5qj" event={"ID":"546d5b3a-342d-44f7-a179-12724fc783d0","Type":"ContainerDied","Data":"5086c7809928752cf03517b21200024f71f295cfb4e5f17e1ab8a62bcf86a04f"} Oct 08 22:10:31 crc kubenswrapper[4739]: I1008 22:10:31.309009 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7e698574-e76d-4a75-b963-f8849155573f","Type":"ContainerStarted","Data":"f22a5529ffb51f28da94010f54605e1b02374e07c52344d87a6b89726fcca374"} Oct 08 22:10:31 crc kubenswrapper[4739]: I1008 22:10:31.310771 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="31a21704-ce31-4b11-aa96-887a4c5f5cd7" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f83b40f773ac326aeca098563ba9d11868cb884d5400a920ca82927a65173646" gracePeriod=30 Oct 08 22:10:31 
crc kubenswrapper[4739]: I1008 22:10:31.319552 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.3871554440000002 podStartE2EDuration="10.319529679s" podCreationTimestamp="2025-10-08 22:10:21 +0000 UTC" firstStartedPulling="2025-10-08 22:10:22.458478862 +0000 UTC m=+1322.283864612" lastFinishedPulling="2025-10-08 22:10:29.390853097 +0000 UTC m=+1329.216238847" observedRunningTime="2025-10-08 22:10:31.318396101 +0000 UTC m=+1331.143781851" watchObservedRunningTime="2025-10-08 22:10:31.319529679 +0000 UTC m=+1331.144915429" Oct 08 22:10:31 crc kubenswrapper[4739]: I1008 22:10:31.357018 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.529587881 podStartE2EDuration="10.356995032s" podCreationTimestamp="2025-10-08 22:10:21 +0000 UTC" firstStartedPulling="2025-10-08 22:10:22.620221964 +0000 UTC m=+1322.445607714" lastFinishedPulling="2025-10-08 22:10:29.447629105 +0000 UTC m=+1329.273014865" observedRunningTime="2025-10-08 22:10:31.352720917 +0000 UTC m=+1331.178106667" watchObservedRunningTime="2025-10-08 22:10:31.356995032 +0000 UTC m=+1331.182380772" Oct 08 22:10:31 crc kubenswrapper[4739]: I1008 22:10:31.372126 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.322756538 podStartE2EDuration="10.372093264s" podCreationTimestamp="2025-10-08 22:10:21 +0000 UTC" firstStartedPulling="2025-10-08 22:10:22.341398898 +0000 UTC m=+1322.166784648" lastFinishedPulling="2025-10-08 22:10:29.390735624 +0000 UTC m=+1329.216121374" observedRunningTime="2025-10-08 22:10:31.367826079 +0000 UTC m=+1331.193211839" watchObservedRunningTime="2025-10-08 22:10:31.372093264 +0000 UTC m=+1331.197479014" Oct 08 22:10:31 crc kubenswrapper[4739]: I1008 22:10:31.387381 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-scheduler-0" podStartSLOduration=3.592004599 podStartE2EDuration="10.38735771s" podCreationTimestamp="2025-10-08 22:10:21 +0000 UTC" firstStartedPulling="2025-10-08 22:10:22.604333003 +0000 UTC m=+1322.429718753" lastFinishedPulling="2025-10-08 22:10:29.399686104 +0000 UTC m=+1329.225071864" observedRunningTime="2025-10-08 22:10:31.382642683 +0000 UTC m=+1331.208028423" watchObservedRunningTime="2025-10-08 22:10:31.38735771 +0000 UTC m=+1331.212743460" Oct 08 22:10:31 crc kubenswrapper[4739]: I1008 22:10:31.472184 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 22:10:31 crc kubenswrapper[4739]: W1008 22:10:31.474421 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61db13e3_377c_41e4_bc39_bd2314224f6e.slice/crio-7794d91ad353a46a7851ae634f1ba79730d64fced2b03ad256911bba7563a815 WatchSource:0}: Error finding container 7794d91ad353a46a7851ae634f1ba79730d64fced2b03ad256911bba7563a815: Status 404 returned error can't find the container with id 7794d91ad353a46a7851ae634f1ba79730d64fced2b03ad256911bba7563a815 Oct 08 22:10:31 crc kubenswrapper[4739]: I1008 22:10:31.488966 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:10:31 crc kubenswrapper[4739]: I1008 22:10:31.805510 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 22:10:31 crc kubenswrapper[4739]: I1008 22:10:31.805633 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 22:10:31 crc kubenswrapper[4739]: I1008 22:10:31.877554 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="868268e8-4f60-4c9b-aa4d-7239fae44090" path="/var/lib/kubelet/pods/868268e8-4f60-4c9b-aa4d-7239fae44090/volumes" Oct 08 22:10:31 crc kubenswrapper[4739]: I1008 22:10:31.895830 4739 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 08 22:10:31 crc kubenswrapper[4739]: I1008 22:10:31.895919 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 22:10:31 crc kubenswrapper[4739]: I1008 22:10:31.895941 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 22:10:31 crc kubenswrapper[4739]: I1008 22:10:31.895956 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 08 22:10:31 crc kubenswrapper[4739]: I1008 22:10:31.927292 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 08 22:10:32 crc kubenswrapper[4739]: I1008 22:10:32.020390 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-845d6d6f59-9lc2r" Oct 08 22:10:32 crc kubenswrapper[4739]: I1008 22:10:32.093087 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-tfvvh"] Oct 08 22:10:32 crc kubenswrapper[4739]: I1008 22:10:32.094083 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-tfvvh" podUID="b13865e1-4812-4d07-8b31-e488ab164399" containerName="dnsmasq-dns" containerID="cri-o://4c42d0d8942aac862acb37940099ee81c043d7186ac7a77dcde4379b187bc350" gracePeriod=10 Oct 08 22:10:32 crc kubenswrapper[4739]: I1008 22:10:32.332875 4739 generic.go:334] "Generic (PLEG): container finished" podID="129fe9b3-6068-4d74-88a5-0a90733cd89e" containerID="9ddb1f04ed8add441c444387abad491cd5e734ad670ed9aedeb992df155a8e4b" exitCode=0 Oct 08 22:10:32 crc kubenswrapper[4739]: I1008 22:10:32.332911 4739 generic.go:334] "Generic (PLEG): container finished" podID="129fe9b3-6068-4d74-88a5-0a90733cd89e" containerID="cc2fbe463040b83a37f00e2510e03e74720f33c3d9535bd4ab9fbde82e1632ac" exitCode=143 Oct 08 
22:10:32 crc kubenswrapper[4739]: I1008 22:10:32.332968 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"129fe9b3-6068-4d74-88a5-0a90733cd89e","Type":"ContainerDied","Data":"9ddb1f04ed8add441c444387abad491cd5e734ad670ed9aedeb992df155a8e4b"} Oct 08 22:10:32 crc kubenswrapper[4739]: I1008 22:10:32.333037 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"129fe9b3-6068-4d74-88a5-0a90733cd89e","Type":"ContainerDied","Data":"cc2fbe463040b83a37f00e2510e03e74720f33c3d9535bd4ab9fbde82e1632ac"} Oct 08 22:10:32 crc kubenswrapper[4739]: I1008 22:10:32.336462 4739 generic.go:334] "Generic (PLEG): container finished" podID="b13865e1-4812-4d07-8b31-e488ab164399" containerID="4c42d0d8942aac862acb37940099ee81c043d7186ac7a77dcde4379b187bc350" exitCode=0 Oct 08 22:10:32 crc kubenswrapper[4739]: I1008 22:10:32.336536 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-tfvvh" event={"ID":"b13865e1-4812-4d07-8b31-e488ab164399","Type":"ContainerDied","Data":"4c42d0d8942aac862acb37940099ee81c043d7186ac7a77dcde4379b187bc350"} Oct 08 22:10:32 crc kubenswrapper[4739]: I1008 22:10:32.341874 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"61db13e3-377c-41e4-bc39-bd2314224f6e","Type":"ContainerStarted","Data":"7794d91ad353a46a7851ae634f1ba79730d64fced2b03ad256911bba7563a815"} Oct 08 22:10:32 crc kubenswrapper[4739]: I1008 22:10:32.382912 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 08 22:10:32 crc kubenswrapper[4739]: I1008 22:10:32.917579 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7e698574-e76d-4a75-b963-f8849155573f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" 
Oct 08 22:10:32 crc kubenswrapper[4739]: I1008 22:10:32.918126 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7e698574-e76d-4a75-b963-f8849155573f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 22:10:32 crc kubenswrapper[4739]: I1008 22:10:32.947838 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8m5qj" Oct 08 22:10:32 crc kubenswrapper[4739]: I1008 22:10:32.954931 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-tfvvh" Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.071248 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b13865e1-4812-4d07-8b31-e488ab164399-dns-svc\") pod \"b13865e1-4812-4d07-8b31-e488ab164399\" (UID: \"b13865e1-4812-4d07-8b31-e488ab164399\") " Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.071412 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sh9kd\" (UniqueName: \"kubernetes.io/projected/546d5b3a-342d-44f7-a179-12724fc783d0-kube-api-access-sh9kd\") pod \"546d5b3a-342d-44f7-a179-12724fc783d0\" (UID: \"546d5b3a-342d-44f7-a179-12724fc783d0\") " Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.071473 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b13865e1-4812-4d07-8b31-e488ab164399-dns-swift-storage-0\") pod \"b13865e1-4812-4d07-8b31-e488ab164399\" (UID: \"b13865e1-4812-4d07-8b31-e488ab164399\") " Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.071505 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/546d5b3a-342d-44f7-a179-12724fc783d0-config-data\") pod \"546d5b3a-342d-44f7-a179-12724fc783d0\" (UID: \"546d5b3a-342d-44f7-a179-12724fc783d0\") " Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.071655 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b13865e1-4812-4d07-8b31-e488ab164399-config\") pod \"b13865e1-4812-4d07-8b31-e488ab164399\" (UID: \"b13865e1-4812-4d07-8b31-e488ab164399\") " Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.072560 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/546d5b3a-342d-44f7-a179-12724fc783d0-scripts\") pod \"546d5b3a-342d-44f7-a179-12724fc783d0\" (UID: \"546d5b3a-342d-44f7-a179-12724fc783d0\") " Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.072957 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b13865e1-4812-4d07-8b31-e488ab164399-ovsdbserver-nb\") pod \"b13865e1-4812-4d07-8b31-e488ab164399\" (UID: \"b13865e1-4812-4d07-8b31-e488ab164399\") " Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.072994 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kbwp\" (UniqueName: \"kubernetes.io/projected/b13865e1-4812-4d07-8b31-e488ab164399-kube-api-access-5kbwp\") pod \"b13865e1-4812-4d07-8b31-e488ab164399\" (UID: \"b13865e1-4812-4d07-8b31-e488ab164399\") " Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.073084 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b13865e1-4812-4d07-8b31-e488ab164399-ovsdbserver-sb\") pod \"b13865e1-4812-4d07-8b31-e488ab164399\" (UID: \"b13865e1-4812-4d07-8b31-e488ab164399\") " Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.073189 4739 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/546d5b3a-342d-44f7-a179-12724fc783d0-combined-ca-bundle\") pod \"546d5b3a-342d-44f7-a179-12724fc783d0\" (UID: \"546d5b3a-342d-44f7-a179-12724fc783d0\") " Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.082013 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b13865e1-4812-4d07-8b31-e488ab164399-kube-api-access-5kbwp" (OuterVolumeSpecName: "kube-api-access-5kbwp") pod "b13865e1-4812-4d07-8b31-e488ab164399" (UID: "b13865e1-4812-4d07-8b31-e488ab164399"). InnerVolumeSpecName "kube-api-access-5kbwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.086026 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/546d5b3a-342d-44f7-a179-12724fc783d0-scripts" (OuterVolumeSpecName: "scripts") pod "546d5b3a-342d-44f7-a179-12724fc783d0" (UID: "546d5b3a-342d-44f7-a179-12724fc783d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.129309 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/546d5b3a-342d-44f7-a179-12724fc783d0-config-data" (OuterVolumeSpecName: "config-data") pod "546d5b3a-342d-44f7-a179-12724fc783d0" (UID: "546d5b3a-342d-44f7-a179-12724fc783d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.152371 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b13865e1-4812-4d07-8b31-e488ab164399-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b13865e1-4812-4d07-8b31-e488ab164399" (UID: "b13865e1-4812-4d07-8b31-e488ab164399"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.153639 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b13865e1-4812-4d07-8b31-e488ab164399-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b13865e1-4812-4d07-8b31-e488ab164399" (UID: "b13865e1-4812-4d07-8b31-e488ab164399"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.158543 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/546d5b3a-342d-44f7-a179-12724fc783d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "546d5b3a-342d-44f7-a179-12724fc783d0" (UID: "546d5b3a-342d-44f7-a179-12724fc783d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.164630 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b13865e1-4812-4d07-8b31-e488ab164399-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b13865e1-4812-4d07-8b31-e488ab164399" (UID: "b13865e1-4812-4d07-8b31-e488ab164399"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.170706 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b13865e1-4812-4d07-8b31-e488ab164399-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b13865e1-4812-4d07-8b31-e488ab164399" (UID: "b13865e1-4812-4d07-8b31-e488ab164399"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.173637 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b13865e1-4812-4d07-8b31-e488ab164399-config" (OuterVolumeSpecName: "config") pod "b13865e1-4812-4d07-8b31-e488ab164399" (UID: "b13865e1-4812-4d07-8b31-e488ab164399"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.176547 4739 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b13865e1-4812-4d07-8b31-e488ab164399-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.176582 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/546d5b3a-342d-44f7-a179-12724fc783d0-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.176596 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b13865e1-4812-4d07-8b31-e488ab164399-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.176608 4739 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/546d5b3a-342d-44f7-a179-12724fc783d0-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.176618 4739 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b13865e1-4812-4d07-8b31-e488ab164399-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.176629 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kbwp\" (UniqueName: 
\"kubernetes.io/projected/b13865e1-4812-4d07-8b31-e488ab164399-kube-api-access-5kbwp\") on node \"crc\" DevicePath \"\"" Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.176641 4739 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b13865e1-4812-4d07-8b31-e488ab164399-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.176650 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/546d5b3a-342d-44f7-a179-12724fc783d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.176659 4739 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b13865e1-4812-4d07-8b31-e488ab164399-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.188297 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/546d5b3a-342d-44f7-a179-12724fc783d0-kube-api-access-sh9kd" (OuterVolumeSpecName: "kube-api-access-sh9kd") pod "546d5b3a-342d-44f7-a179-12724fc783d0" (UID: "546d5b3a-342d-44f7-a179-12724fc783d0"). InnerVolumeSpecName "kube-api-access-sh9kd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.278925 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sh9kd\" (UniqueName: \"kubernetes.io/projected/546d5b3a-342d-44f7-a179-12724fc783d0-kube-api-access-sh9kd\") on node \"crc\" DevicePath \"\"" Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.355339 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-tfvvh" event={"ID":"b13865e1-4812-4d07-8b31-e488ab164399","Type":"ContainerDied","Data":"3992a95efa53abdb0a55ea796466ff860b076db04b890e459acc53d700cd762e"} Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.355413 4739 scope.go:117] "RemoveContainer" containerID="4c42d0d8942aac862acb37940099ee81c043d7186ac7a77dcde4379b187bc350" Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.355443 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-tfvvh" Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.376157 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8m5qj" Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.376173 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8m5qj" event={"ID":"546d5b3a-342d-44f7-a179-12724fc783d0","Type":"ContainerDied","Data":"4ecd064a42143e5e8a2523ee8bedc4f48b917ad722547f6bc24c8cb95579a302"} Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.376229 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ecd064a42143e5e8a2523ee8bedc4f48b917ad722547f6bc24c8cb95579a302" Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.411797 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-tfvvh"] Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.418951 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-tfvvh"] Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.531916 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.559436 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.700448 4739 scope.go:117] "RemoveContainer" containerID="360281c6c8de55f2175d23d9e89fd27ac51a8066ce237b9fded99cf9adcfd425" Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.834102 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b13865e1-4812-4d07-8b31-e488ab164399" path="/var/lib/kubelet/pods/b13865e1-4812-4d07-8b31-e488ab164399/volumes" Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.890790 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.994573 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/129fe9b3-6068-4d74-88a5-0a90733cd89e-config-data\") pod \"129fe9b3-6068-4d74-88a5-0a90733cd89e\" (UID: \"129fe9b3-6068-4d74-88a5-0a90733cd89e\") " Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.994610 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nfg9\" (UniqueName: \"kubernetes.io/projected/129fe9b3-6068-4d74-88a5-0a90733cd89e-kube-api-access-5nfg9\") pod \"129fe9b3-6068-4d74-88a5-0a90733cd89e\" (UID: \"129fe9b3-6068-4d74-88a5-0a90733cd89e\") " Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.994698 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/129fe9b3-6068-4d74-88a5-0a90733cd89e-combined-ca-bundle\") pod \"129fe9b3-6068-4d74-88a5-0a90733cd89e\" (UID: \"129fe9b3-6068-4d74-88a5-0a90733cd89e\") " Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.994722 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/129fe9b3-6068-4d74-88a5-0a90733cd89e-logs\") pod \"129fe9b3-6068-4d74-88a5-0a90733cd89e\" (UID: \"129fe9b3-6068-4d74-88a5-0a90733cd89e\") " Oct 08 22:10:33 crc kubenswrapper[4739]: I1008 22:10:33.995517 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/129fe9b3-6068-4d74-88a5-0a90733cd89e-logs" (OuterVolumeSpecName: "logs") pod "129fe9b3-6068-4d74-88a5-0a90733cd89e" (UID: "129fe9b3-6068-4d74-88a5-0a90733cd89e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.002368 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/129fe9b3-6068-4d74-88a5-0a90733cd89e-kube-api-access-5nfg9" (OuterVolumeSpecName: "kube-api-access-5nfg9") pod "129fe9b3-6068-4d74-88a5-0a90733cd89e" (UID: "129fe9b3-6068-4d74-88a5-0a90733cd89e"). InnerVolumeSpecName "kube-api-access-5nfg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.096941 4739 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/129fe9b3-6068-4d74-88a5-0a90733cd89e-logs\") on node \"crc\" DevicePath \"\"" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.096988 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nfg9\" (UniqueName: \"kubernetes.io/projected/129fe9b3-6068-4d74-88a5-0a90733cd89e-kube-api-access-5nfg9\") on node \"crc\" DevicePath \"\"" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.130415 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/129fe9b3-6068-4d74-88a5-0a90733cd89e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "129fe9b3-6068-4d74-88a5-0a90733cd89e" (UID: "129fe9b3-6068-4d74-88a5-0a90733cd89e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.130599 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/129fe9b3-6068-4d74-88a5-0a90733cd89e-config-data" (OuterVolumeSpecName: "config-data") pod "129fe9b3-6068-4d74-88a5-0a90733cd89e" (UID: "129fe9b3-6068-4d74-88a5-0a90733cd89e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.198603 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/129fe9b3-6068-4d74-88a5-0a90733cd89e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.198642 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/129fe9b3-6068-4d74-88a5-0a90733cd89e-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.396437 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.396446 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"129fe9b3-6068-4d74-88a5-0a90733cd89e","Type":"ContainerDied","Data":"fa871e2c4050a5595d5128fe110309d7a78a24fc2bb3380660b3747c45605b58"} Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.396985 4739 scope.go:117] "RemoveContainer" containerID="9ddb1f04ed8add441c444387abad491cd5e734ad670ed9aedeb992df155a8e4b" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.396512 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7e698574-e76d-4a75-b963-f8849155573f" containerName="nova-api-api" containerID="cri-o://f22a5529ffb51f28da94010f54605e1b02374e07c52344d87a6b89726fcca374" gracePeriod=30 Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.397501 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7e698574-e76d-4a75-b963-f8849155573f" containerName="nova-api-log" containerID="cri-o://40d1396532872efaf35e5279d0a81fae1c7f65493bf3d17c48202dfedddb24b8" gracePeriod=30 Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.445510 4739 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.466312 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.476740 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:10:34 crc kubenswrapper[4739]: E1008 22:10:34.477384 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="546d5b3a-342d-44f7-a179-12724fc783d0" containerName="nova-manage" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.477418 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="546d5b3a-342d-44f7-a179-12724fc783d0" containerName="nova-manage" Oct 08 22:10:34 crc kubenswrapper[4739]: E1008 22:10:34.477448 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="129fe9b3-6068-4d74-88a5-0a90733cd89e" containerName="nova-metadata-log" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.477457 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="129fe9b3-6068-4d74-88a5-0a90733cd89e" containerName="nova-metadata-log" Oct 08 22:10:34 crc kubenswrapper[4739]: E1008 22:10:34.477492 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="129fe9b3-6068-4d74-88a5-0a90733cd89e" containerName="nova-metadata-metadata" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.477501 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="129fe9b3-6068-4d74-88a5-0a90733cd89e" containerName="nova-metadata-metadata" Oct 08 22:10:34 crc kubenswrapper[4739]: E1008 22:10:34.477510 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b13865e1-4812-4d07-8b31-e488ab164399" containerName="dnsmasq-dns" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.477518 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="b13865e1-4812-4d07-8b31-e488ab164399" containerName="dnsmasq-dns" Oct 08 22:10:34 crc 
kubenswrapper[4739]: E1008 22:10:34.477538 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b13865e1-4812-4d07-8b31-e488ab164399" containerName="init" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.477547 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="b13865e1-4812-4d07-8b31-e488ab164399" containerName="init" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.477800 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="546d5b3a-342d-44f7-a179-12724fc783d0" containerName="nova-manage" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.477828 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="129fe9b3-6068-4d74-88a5-0a90733cd89e" containerName="nova-metadata-log" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.477859 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="b13865e1-4812-4d07-8b31-e488ab164399" containerName="dnsmasq-dns" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.477870 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="129fe9b3-6068-4d74-88a5-0a90733cd89e" containerName="nova-metadata-metadata" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.479326 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.492489 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.501542 4739 scope.go:117] "RemoveContainer" containerID="cc2fbe463040b83a37f00e2510e03e74720f33c3d9535bd4ab9fbde82e1632ac" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.502280 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.511460 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.615952 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad230d2e-c78a-4e94-96d4-a44f02da6033-config-data\") pod \"nova-metadata-0\" (UID: \"ad230d2e-c78a-4e94-96d4-a44f02da6033\") " pod="openstack/nova-metadata-0" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.616184 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad230d2e-c78a-4e94-96d4-a44f02da6033-logs\") pod \"nova-metadata-0\" (UID: \"ad230d2e-c78a-4e94-96d4-a44f02da6033\") " pod="openstack/nova-metadata-0" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.616247 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txf4c\" (UniqueName: \"kubernetes.io/projected/ad230d2e-c78a-4e94-96d4-a44f02da6033-kube-api-access-txf4c\") pod \"nova-metadata-0\" (UID: \"ad230d2e-c78a-4e94-96d4-a44f02da6033\") " pod="openstack/nova-metadata-0" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.616289 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad230d2e-c78a-4e94-96d4-a44f02da6033-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ad230d2e-c78a-4e94-96d4-a44f02da6033\") " pod="openstack/nova-metadata-0" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.616373 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad230d2e-c78a-4e94-96d4-a44f02da6033-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ad230d2e-c78a-4e94-96d4-a44f02da6033\") " pod="openstack/nova-metadata-0" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.718598 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad230d2e-c78a-4e94-96d4-a44f02da6033-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ad230d2e-c78a-4e94-96d4-a44f02da6033\") " pod="openstack/nova-metadata-0" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.718686 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad230d2e-c78a-4e94-96d4-a44f02da6033-config-data\") pod \"nova-metadata-0\" (UID: \"ad230d2e-c78a-4e94-96d4-a44f02da6033\") " pod="openstack/nova-metadata-0" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.718782 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad230d2e-c78a-4e94-96d4-a44f02da6033-logs\") pod \"nova-metadata-0\" (UID: \"ad230d2e-c78a-4e94-96d4-a44f02da6033\") " pod="openstack/nova-metadata-0" Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.718824 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txf4c\" (UniqueName: 
\"kubernetes.io/projected/ad230d2e-c78a-4e94-96d4-a44f02da6033-kube-api-access-txf4c\") pod \"nova-metadata-0\" (UID: \"ad230d2e-c78a-4e94-96d4-a44f02da6033\") " pod="openstack/nova-metadata-0"
Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.718851 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad230d2e-c78a-4e94-96d4-a44f02da6033-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ad230d2e-c78a-4e94-96d4-a44f02da6033\") " pod="openstack/nova-metadata-0"
Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.719946 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad230d2e-c78a-4e94-96d4-a44f02da6033-logs\") pod \"nova-metadata-0\" (UID: \"ad230d2e-c78a-4e94-96d4-a44f02da6033\") " pod="openstack/nova-metadata-0"
Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.724465 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad230d2e-c78a-4e94-96d4-a44f02da6033-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ad230d2e-c78a-4e94-96d4-a44f02da6033\") " pod="openstack/nova-metadata-0"
Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.724921 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad230d2e-c78a-4e94-96d4-a44f02da6033-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ad230d2e-c78a-4e94-96d4-a44f02da6033\") " pod="openstack/nova-metadata-0"
Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.726013 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad230d2e-c78a-4e94-96d4-a44f02da6033-config-data\") pod \"nova-metadata-0\" (UID: \"ad230d2e-c78a-4e94-96d4-a44f02da6033\") " pod="openstack/nova-metadata-0"
Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.749075 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txf4c\" (UniqueName: \"kubernetes.io/projected/ad230d2e-c78a-4e94-96d4-a44f02da6033-kube-api-access-txf4c\") pod \"nova-metadata-0\" (UID: \"ad230d2e-c78a-4e94-96d4-a44f02da6033\") " pod="openstack/nova-metadata-0"
Oct 08 22:10:34 crc kubenswrapper[4739]: I1008 22:10:34.821544 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 08 22:10:35 crc kubenswrapper[4739]: I1008 22:10:35.295655 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 08 22:10:35 crc kubenswrapper[4739]: W1008 22:10:35.309133 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad230d2e_c78a_4e94_96d4_a44f02da6033.slice/crio-ad16e4c75a2b01918f5a49d9e6cd89f77d432dbcf68bf3de130382bc7a1f5637 WatchSource:0}: Error finding container ad16e4c75a2b01918f5a49d9e6cd89f77d432dbcf68bf3de130382bc7a1f5637: Status 404 returned error can't find the container with id ad16e4c75a2b01918f5a49d9e6cd89f77d432dbcf68bf3de130382bc7a1f5637
Oct 08 22:10:35 crc kubenswrapper[4739]: I1008 22:10:35.413650 4739 generic.go:334] "Generic (PLEG): container finished" podID="7e698574-e76d-4a75-b963-f8849155573f" containerID="40d1396532872efaf35e5279d0a81fae1c7f65493bf3d17c48202dfedddb24b8" exitCode=143
Oct 08 22:10:35 crc kubenswrapper[4739]: I1008 22:10:35.413717 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7e698574-e76d-4a75-b963-f8849155573f","Type":"ContainerDied","Data":"40d1396532872efaf35e5279d0a81fae1c7f65493bf3d17c48202dfedddb24b8"}
Oct 08 22:10:35 crc kubenswrapper[4739]: I1008 22:10:35.415549 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ad230d2e-c78a-4e94-96d4-a44f02da6033","Type":"ContainerStarted","Data":"ad16e4c75a2b01918f5a49d9e6cd89f77d432dbcf68bf3de130382bc7a1f5637"}
Oct 08 22:10:35 crc kubenswrapper[4739]: I1008 22:10:35.419065 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"61db13e3-377c-41e4-bc39-bd2314224f6e","Type":"ContainerStarted","Data":"f517a9ff2396dacbe9e5e46cf1d8c250c09bb1061ce93f82dfeaa904b0d1276c"}
Oct 08 22:10:35 crc kubenswrapper[4739]: I1008 22:10:35.419257 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ca39eccb-9037-47a5-82f7-e83f5f4fa01e" containerName="nova-scheduler-scheduler" containerID="cri-o://59fed5dee9afbd8de208186d8b87b1170bf83523ff20911ebd1b14e7f7326315" gracePeriod=30
Oct 08 22:10:35 crc kubenswrapper[4739]: I1008 22:10:35.445554 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.651281933 podStartE2EDuration="5.4455334s" podCreationTimestamp="2025-10-08 22:10:30 +0000 UTC" firstStartedPulling="2025-10-08 22:10:31.479202841 +0000 UTC m=+1331.304588591" lastFinishedPulling="2025-10-08 22:10:34.273454308 +0000 UTC m=+1334.098840058" observedRunningTime="2025-10-08 22:10:35.436167919 +0000 UTC m=+1335.261553679" watchObservedRunningTime="2025-10-08 22:10:35.4455334 +0000 UTC m=+1335.270919150"
Oct 08 22:10:35 crc kubenswrapper[4739]: I1008 22:10:35.851480 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="129fe9b3-6068-4d74-88a5-0a90733cd89e" path="/var/lib/kubelet/pods/129fe9b3-6068-4d74-88a5-0a90733cd89e/volumes"
Oct 08 22:10:36 crc kubenswrapper[4739]: I1008 22:10:36.431898 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ad230d2e-c78a-4e94-96d4-a44f02da6033","Type":"ContainerStarted","Data":"6a4220f817984c8e7cccd3d58a15d7e4884187b7ae3d2eb44430bb7d142a196d"}
Oct 08 22:10:36 crc kubenswrapper[4739]: I1008 22:10:36.432295 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ad230d2e-c78a-4e94-96d4-a44f02da6033","Type":"ContainerStarted","Data":"da07b2427aa2cc09b0011d49395fcc427dbc782ca4cfcd25a0ea77952286f3b8"}
Oct 08 22:10:36 crc kubenswrapper[4739]: I1008 22:10:36.432321 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Oct 08 22:10:36 crc kubenswrapper[4739]: I1008 22:10:36.451408 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.451391348 podStartE2EDuration="2.451391348s" podCreationTimestamp="2025-10-08 22:10:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:10:36.450720902 +0000 UTC m=+1336.276106642" watchObservedRunningTime="2025-10-08 22:10:36.451391348 +0000 UTC m=+1336.276777098"
Oct 08 22:10:36 crc kubenswrapper[4739]: E1008 22:10:36.899770 4739 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="59fed5dee9afbd8de208186d8b87b1170bf83523ff20911ebd1b14e7f7326315" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 08 22:10:36 crc kubenswrapper[4739]: E1008 22:10:36.905478 4739 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="59fed5dee9afbd8de208186d8b87b1170bf83523ff20911ebd1b14e7f7326315" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 08 22:10:36 crc kubenswrapper[4739]: E1008 22:10:36.906789 4739 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="59fed5dee9afbd8de208186d8b87b1170bf83523ff20911ebd1b14e7f7326315" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 08 22:10:36 crc kubenswrapper[4739]: E1008 22:10:36.906836 4739 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="ca39eccb-9037-47a5-82f7-e83f5f4fa01e" containerName="nova-scheduler-scheduler"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.054449 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.170946 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-combined-ca-bundle\") pod \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\" (UID: \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\") "
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.170987 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-run-httpd\") pod \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\" (UID: \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\") "
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.171044 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-sg-core-conf-yaml\") pod \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\" (UID: \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\") "
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.171134 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5zv7\" (UniqueName: \"kubernetes.io/projected/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-kube-api-access-w5zv7\") pod \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\" (UID: \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\") "
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.171193 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-config-data\") pod \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\" (UID: \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\") "
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.171268 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-log-httpd\") pod \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\" (UID: \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\") "
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.171315 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-scripts\") pod \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\" (UID: \"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7\") "
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.171684 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7" (UID: "e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.173257 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7" (UID: "e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.179611 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-scripts" (OuterVolumeSpecName: "scripts") pod "e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7" (UID: "e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.181853 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-kube-api-access-w5zv7" (OuterVolumeSpecName: "kube-api-access-w5zv7") pod "e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7" (UID: "e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7"). InnerVolumeSpecName "kube-api-access-w5zv7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.206547 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7" (UID: "e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.257367 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7" (UID: "e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.273871 4739 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.273895 4739 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-scripts\") on node \"crc\" DevicePath \"\""
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.273905 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.273915 4739 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.273923 4739 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.273932 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5zv7\" (UniqueName: \"kubernetes.io/projected/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-kube-api-access-w5zv7\") on node \"crc\" DevicePath \"\""
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.313563 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-config-data" (OuterVolumeSpecName: "config-data") pod "e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7" (UID: "e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.375468 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7-config-data\") on node \"crc\" DevicePath \"\""
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.445073 4739 generic.go:334] "Generic (PLEG): container finished" podID="e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7" containerID="e7b55fb9ee6f958053a25c6f38b4cc495515ad964a27844273b01803237d3b3a" exitCode=0
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.445181 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7","Type":"ContainerDied","Data":"e7b55fb9ee6f958053a25c6f38b4cc495515ad964a27844273b01803237d3b3a"}
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.445215 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7","Type":"ContainerDied","Data":"bd73912e946492560f056335d7dc263fbd89e5629ed52525b588aae710423864"}
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.445237 4739 scope.go:117] "RemoveContainer" containerID="df89ef400afd98d848857483d38183d7bc7bef52e5334f2d5e7f2db8a4681299"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.445410 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.452631 4739 generic.go:334] "Generic (PLEG): container finished" podID="ca39eccb-9037-47a5-82f7-e83f5f4fa01e" containerID="59fed5dee9afbd8de208186d8b87b1170bf83523ff20911ebd1b14e7f7326315" exitCode=0
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.453315 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ca39eccb-9037-47a5-82f7-e83f5f4fa01e","Type":"ContainerDied","Data":"59fed5dee9afbd8de208186d8b87b1170bf83523ff20911ebd1b14e7f7326315"}
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.473044 4739 scope.go:117] "RemoveContainer" containerID="0374476004beecf62678814abae5c7a34059fd6c4c6abbb81eeec347edad50a7"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.497215 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.505424 4739 scope.go:117] "RemoveContainer" containerID="e7b55fb9ee6f958053a25c6f38b4cc495515ad964a27844273b01803237d3b3a"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.512850 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.520074 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 08 22:10:37 crc kubenswrapper[4739]: E1008 22:10:37.520517 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7" containerName="sg-core"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.520535 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7" containerName="sg-core"
Oct 08 22:10:37 crc kubenswrapper[4739]: E1008 22:10:37.520565 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7" containerName="proxy-httpd"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.520571 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7" containerName="proxy-httpd"
Oct 08 22:10:37 crc kubenswrapper[4739]: E1008 22:10:37.520588 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7" containerName="ceilometer-central-agent"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.520595 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7" containerName="ceilometer-central-agent"
Oct 08 22:10:37 crc kubenswrapper[4739]: E1008 22:10:37.520615 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7" containerName="ceilometer-notification-agent"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.520621 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7" containerName="ceilometer-notification-agent"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.520802 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7" containerName="proxy-httpd"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.520814 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7" containerName="ceilometer-notification-agent"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.521021 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7" containerName="ceilometer-central-agent"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.521030 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7" containerName="sg-core"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.522783 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.527583 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.527977 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.528294 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.536075 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.536221 4739 scope.go:117] "RemoveContainer" containerID="16fec44f9d8df509acbefa9a7512a3ee661aee2436f06d93c8f36742cf6615ea"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.566792 4739 scope.go:117] "RemoveContainer" containerID="df89ef400afd98d848857483d38183d7bc7bef52e5334f2d5e7f2db8a4681299"
Oct 08 22:10:37 crc kubenswrapper[4739]: E1008 22:10:37.567416 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df89ef400afd98d848857483d38183d7bc7bef52e5334f2d5e7f2db8a4681299\": container with ID starting with df89ef400afd98d848857483d38183d7bc7bef52e5334f2d5e7f2db8a4681299 not found: ID does not exist" containerID="df89ef400afd98d848857483d38183d7bc7bef52e5334f2d5e7f2db8a4681299"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.567469 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df89ef400afd98d848857483d38183d7bc7bef52e5334f2d5e7f2db8a4681299"} err="failed to get container status \"df89ef400afd98d848857483d38183d7bc7bef52e5334f2d5e7f2db8a4681299\": rpc error: code = NotFound desc = could not find container \"df89ef400afd98d848857483d38183d7bc7bef52e5334f2d5e7f2db8a4681299\": container with ID starting with df89ef400afd98d848857483d38183d7bc7bef52e5334f2d5e7f2db8a4681299 not found: ID does not exist"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.567513 4739 scope.go:117] "RemoveContainer" containerID="0374476004beecf62678814abae5c7a34059fd6c4c6abbb81eeec347edad50a7"
Oct 08 22:10:37 crc kubenswrapper[4739]: E1008 22:10:37.568069 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0374476004beecf62678814abae5c7a34059fd6c4c6abbb81eeec347edad50a7\": container with ID starting with 0374476004beecf62678814abae5c7a34059fd6c4c6abbb81eeec347edad50a7 not found: ID does not exist" containerID="0374476004beecf62678814abae5c7a34059fd6c4c6abbb81eeec347edad50a7"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.568126 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0374476004beecf62678814abae5c7a34059fd6c4c6abbb81eeec347edad50a7"} err="failed to get container status \"0374476004beecf62678814abae5c7a34059fd6c4c6abbb81eeec347edad50a7\": rpc error: code = NotFound desc = could not find container \"0374476004beecf62678814abae5c7a34059fd6c4c6abbb81eeec347edad50a7\": container with ID starting with 0374476004beecf62678814abae5c7a34059fd6c4c6abbb81eeec347edad50a7 not found: ID does not exist"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.568187 4739 scope.go:117] "RemoveContainer" containerID="e7b55fb9ee6f958053a25c6f38b4cc495515ad964a27844273b01803237d3b3a"
Oct 08 22:10:37 crc kubenswrapper[4739]: E1008 22:10:37.571160 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7b55fb9ee6f958053a25c6f38b4cc495515ad964a27844273b01803237d3b3a\": container with ID starting with e7b55fb9ee6f958053a25c6f38b4cc495515ad964a27844273b01803237d3b3a not found: ID does not exist" containerID="e7b55fb9ee6f958053a25c6f38b4cc495515ad964a27844273b01803237d3b3a"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.571198 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7b55fb9ee6f958053a25c6f38b4cc495515ad964a27844273b01803237d3b3a"} err="failed to get container status \"e7b55fb9ee6f958053a25c6f38b4cc495515ad964a27844273b01803237d3b3a\": rpc error: code = NotFound desc = could not find container \"e7b55fb9ee6f958053a25c6f38b4cc495515ad964a27844273b01803237d3b3a\": container with ID starting with e7b55fb9ee6f958053a25c6f38b4cc495515ad964a27844273b01803237d3b3a not found: ID does not exist"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.571217 4739 scope.go:117] "RemoveContainer" containerID="16fec44f9d8df509acbefa9a7512a3ee661aee2436f06d93c8f36742cf6615ea"
Oct 08 22:10:37 crc kubenswrapper[4739]: E1008 22:10:37.571539 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16fec44f9d8df509acbefa9a7512a3ee661aee2436f06d93c8f36742cf6615ea\": container with ID starting with 16fec44f9d8df509acbefa9a7512a3ee661aee2436f06d93c8f36742cf6615ea not found: ID does not exist" containerID="16fec44f9d8df509acbefa9a7512a3ee661aee2436f06d93c8f36742cf6615ea"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.571559 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16fec44f9d8df509acbefa9a7512a3ee661aee2436f06d93c8f36742cf6615ea"} err="failed to get container status \"16fec44f9d8df509acbefa9a7512a3ee661aee2436f06d93c8f36742cf6615ea\": rpc error: code = NotFound desc = could not find container \"16fec44f9d8df509acbefa9a7512a3ee661aee2436f06d93c8f36742cf6615ea\": container with ID starting with 16fec44f9d8df509acbefa9a7512a3ee661aee2436f06d93c8f36742cf6615ea not found: ID does not exist"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.683376 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5849ea87-072a-4d43-8ee2-cb7fbce966bb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " pod="openstack/ceilometer-0"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.683649 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5849ea87-072a-4d43-8ee2-cb7fbce966bb-scripts\") pod \"ceilometer-0\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " pod="openstack/ceilometer-0"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.683689 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5849ea87-072a-4d43-8ee2-cb7fbce966bb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " pod="openstack/ceilometer-0"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.683741 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5849ea87-072a-4d43-8ee2-cb7fbce966bb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " pod="openstack/ceilometer-0"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.683808 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5849ea87-072a-4d43-8ee2-cb7fbce966bb-config-data\") pod \"ceilometer-0\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " pod="openstack/ceilometer-0"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.684303 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbsjd\" (UniqueName: \"kubernetes.io/projected/5849ea87-072a-4d43-8ee2-cb7fbce966bb-kube-api-access-xbsjd\") pod \"ceilometer-0\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " pod="openstack/ceilometer-0"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.684407 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5849ea87-072a-4d43-8ee2-cb7fbce966bb-run-httpd\") pod \"ceilometer-0\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " pod="openstack/ceilometer-0"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.684591 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5849ea87-072a-4d43-8ee2-cb7fbce966bb-log-httpd\") pod \"ceilometer-0\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " pod="openstack/ceilometer-0"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.786785 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5849ea87-072a-4d43-8ee2-cb7fbce966bb-scripts\") pod \"ceilometer-0\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " pod="openstack/ceilometer-0"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.786839 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5849ea87-072a-4d43-8ee2-cb7fbce966bb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " pod="openstack/ceilometer-0"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.786860 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5849ea87-072a-4d43-8ee2-cb7fbce966bb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " pod="openstack/ceilometer-0"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.786892 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5849ea87-072a-4d43-8ee2-cb7fbce966bb-config-data\") pod \"ceilometer-0\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " pod="openstack/ceilometer-0"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.786961 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbsjd\" (UniqueName: \"kubernetes.io/projected/5849ea87-072a-4d43-8ee2-cb7fbce966bb-kube-api-access-xbsjd\") pod \"ceilometer-0\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " pod="openstack/ceilometer-0"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.786987 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5849ea87-072a-4d43-8ee2-cb7fbce966bb-run-httpd\") pod \"ceilometer-0\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " pod="openstack/ceilometer-0"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.787027 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5849ea87-072a-4d43-8ee2-cb7fbce966bb-log-httpd\") pod \"ceilometer-0\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " pod="openstack/ceilometer-0"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.787057 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5849ea87-072a-4d43-8ee2-cb7fbce966bb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " pod="openstack/ceilometer-0"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.788516 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5849ea87-072a-4d43-8ee2-cb7fbce966bb-run-httpd\") pod \"ceilometer-0\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " pod="openstack/ceilometer-0"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.788680 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5849ea87-072a-4d43-8ee2-cb7fbce966bb-log-httpd\") pod \"ceilometer-0\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " pod="openstack/ceilometer-0"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.793666 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5849ea87-072a-4d43-8ee2-cb7fbce966bb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " pod="openstack/ceilometer-0"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.794254 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5849ea87-072a-4d43-8ee2-cb7fbce966bb-scripts\") pod \"ceilometer-0\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " pod="openstack/ceilometer-0"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.795595 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5849ea87-072a-4d43-8ee2-cb7fbce966bb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " pod="openstack/ceilometer-0"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.797397 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5849ea87-072a-4d43-8ee2-cb7fbce966bb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " pod="openstack/ceilometer-0"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.797724 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5849ea87-072a-4d43-8ee2-cb7fbce966bb-config-data\") pod \"ceilometer-0\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " pod="openstack/ceilometer-0"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.820632 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbsjd\" (UniqueName: \"kubernetes.io/projected/5849ea87-072a-4d43-8ee2-cb7fbce966bb-kube-api-access-xbsjd\") pod \"ceilometer-0\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " pod="openstack/ceilometer-0"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.843196 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7" path="/var/lib/kubelet/pods/e4dfccc8-0dab-4e91-ad1a-54fcc9ee19a7/volumes"
Oct 08 22:10:37 crc kubenswrapper[4739]: I1008 22:10:37.862580 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 08 22:10:38 crc kubenswrapper[4739]: I1008 22:10:38.418776 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 08 22:10:38 crc kubenswrapper[4739]: I1008 22:10:38.474472 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ca39eccb-9037-47a5-82f7-e83f5f4fa01e","Type":"ContainerDied","Data":"98b12ea4171893a3c171bcce1ed56067948e9671445c553303012ca125c98b0f"}
Oct 08 22:10:38 crc kubenswrapper[4739]: I1008 22:10:38.474539 4739 scope.go:117] "RemoveContainer" containerID="59fed5dee9afbd8de208186d8b87b1170bf83523ff20911ebd1b14e7f7326315"
Oct 08 22:10:38 crc kubenswrapper[4739]: I1008 22:10:38.474544 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 08 22:10:38 crc kubenswrapper[4739]: I1008 22:10:38.504405 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4trzn\" (UniqueName: \"kubernetes.io/projected/ca39eccb-9037-47a5-82f7-e83f5f4fa01e-kube-api-access-4trzn\") pod \"ca39eccb-9037-47a5-82f7-e83f5f4fa01e\" (UID: \"ca39eccb-9037-47a5-82f7-e83f5f4fa01e\") "
Oct 08 22:10:38 crc kubenswrapper[4739]: I1008 22:10:38.504596 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca39eccb-9037-47a5-82f7-e83f5f4fa01e-config-data\") pod \"ca39eccb-9037-47a5-82f7-e83f5f4fa01e\" (UID: \"ca39eccb-9037-47a5-82f7-e83f5f4fa01e\") "
Oct 08 22:10:38 crc kubenswrapper[4739]: I1008 22:10:38.504697 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca39eccb-9037-47a5-82f7-e83f5f4fa01e-combined-ca-bundle\") pod \"ca39eccb-9037-47a5-82f7-e83f5f4fa01e\" (UID: \"ca39eccb-9037-47a5-82f7-e83f5f4fa01e\") "
Oct 08 22:10:38 crc kubenswrapper[4739]: I1008 22:10:38.505901 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 08 22:10:38 crc kubenswrapper[4739]: I1008 22:10:38.513333 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca39eccb-9037-47a5-82f7-e83f5f4fa01e-kube-api-access-4trzn" (OuterVolumeSpecName: "kube-api-access-4trzn") pod "ca39eccb-9037-47a5-82f7-e83f5f4fa01e" (UID: "ca39eccb-9037-47a5-82f7-e83f5f4fa01e"). InnerVolumeSpecName "kube-api-access-4trzn".
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:10:38 crc kubenswrapper[4739]: I1008 22:10:38.552261 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca39eccb-9037-47a5-82f7-e83f5f4fa01e-config-data" (OuterVolumeSpecName: "config-data") pod "ca39eccb-9037-47a5-82f7-e83f5f4fa01e" (UID: "ca39eccb-9037-47a5-82f7-e83f5f4fa01e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:10:38 crc kubenswrapper[4739]: I1008 22:10:38.588636 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca39eccb-9037-47a5-82f7-e83f5f4fa01e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca39eccb-9037-47a5-82f7-e83f5f4fa01e" (UID: "ca39eccb-9037-47a5-82f7-e83f5f4fa01e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:10:38 crc kubenswrapper[4739]: I1008 22:10:38.607352 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4trzn\" (UniqueName: \"kubernetes.io/projected/ca39eccb-9037-47a5-82f7-e83f5f4fa01e-kube-api-access-4trzn\") on node \"crc\" DevicePath \"\"" Oct 08 22:10:38 crc kubenswrapper[4739]: I1008 22:10:38.607407 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca39eccb-9037-47a5-82f7-e83f5f4fa01e-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:10:38 crc kubenswrapper[4739]: I1008 22:10:38.607419 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca39eccb-9037-47a5-82f7-e83f5f4fa01e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:10:38 crc kubenswrapper[4739]: I1008 22:10:38.809618 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 22:10:38 crc kubenswrapper[4739]: I1008 22:10:38.824036 4739 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-scheduler-0"] Oct 08 22:10:38 crc kubenswrapper[4739]: I1008 22:10:38.841221 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 22:10:38 crc kubenswrapper[4739]: E1008 22:10:38.842032 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca39eccb-9037-47a5-82f7-e83f5f4fa01e" containerName="nova-scheduler-scheduler" Oct 08 22:10:38 crc kubenswrapper[4739]: I1008 22:10:38.842105 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca39eccb-9037-47a5-82f7-e83f5f4fa01e" containerName="nova-scheduler-scheduler" Oct 08 22:10:38 crc kubenswrapper[4739]: I1008 22:10:38.842443 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca39eccb-9037-47a5-82f7-e83f5f4fa01e" containerName="nova-scheduler-scheduler" Oct 08 22:10:38 crc kubenswrapper[4739]: I1008 22:10:38.843376 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 22:10:38 crc kubenswrapper[4739]: I1008 22:10:38.846211 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 08 22:10:38 crc kubenswrapper[4739]: I1008 22:10:38.863619 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.015721 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvs8g\" (UniqueName: \"kubernetes.io/projected/dd005429-201e-42f1-af67-f89e02d19b7a-kube-api-access-xvs8g\") pod \"nova-scheduler-0\" (UID: \"dd005429-201e-42f1-af67-f89e02d19b7a\") " pod="openstack/nova-scheduler-0" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.015808 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd005429-201e-42f1-af67-f89e02d19b7a-combined-ca-bundle\") 
pod \"nova-scheduler-0\" (UID: \"dd005429-201e-42f1-af67-f89e02d19b7a\") " pod="openstack/nova-scheduler-0" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.016011 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd005429-201e-42f1-af67-f89e02d19b7a-config-data\") pod \"nova-scheduler-0\" (UID: \"dd005429-201e-42f1-af67-f89e02d19b7a\") " pod="openstack/nova-scheduler-0" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.118076 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd005429-201e-42f1-af67-f89e02d19b7a-config-data\") pod \"nova-scheduler-0\" (UID: \"dd005429-201e-42f1-af67-f89e02d19b7a\") " pod="openstack/nova-scheduler-0" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.118489 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvs8g\" (UniqueName: \"kubernetes.io/projected/dd005429-201e-42f1-af67-f89e02d19b7a-kube-api-access-xvs8g\") pod \"nova-scheduler-0\" (UID: \"dd005429-201e-42f1-af67-f89e02d19b7a\") " pod="openstack/nova-scheduler-0" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.118635 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd005429-201e-42f1-af67-f89e02d19b7a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dd005429-201e-42f1-af67-f89e02d19b7a\") " pod="openstack/nova-scheduler-0" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.122501 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd005429-201e-42f1-af67-f89e02d19b7a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dd005429-201e-42f1-af67-f89e02d19b7a\") " pod="openstack/nova-scheduler-0" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 
22:10:39.138916 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd005429-201e-42f1-af67-f89e02d19b7a-config-data\") pod \"nova-scheduler-0\" (UID: \"dd005429-201e-42f1-af67-f89e02d19b7a\") " pod="openstack/nova-scheduler-0" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.144228 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvs8g\" (UniqueName: \"kubernetes.io/projected/dd005429-201e-42f1-af67-f89e02d19b7a-kube-api-access-xvs8g\") pod \"nova-scheduler-0\" (UID: \"dd005429-201e-42f1-af67-f89e02d19b7a\") " pod="openstack/nova-scheduler-0" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.162463 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.457306 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.506810 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5849ea87-072a-4d43-8ee2-cb7fbce966bb","Type":"ContainerStarted","Data":"fb68cb0d0af06f8e9a7bf48d4da51be0dbd9be51d90739c9b52c791b2b844d6c"} Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.508376 4739 generic.go:334] "Generic (PLEG): container finished" podID="7e698574-e76d-4a75-b963-f8849155573f" containerID="f22a5529ffb51f28da94010f54605e1b02374e07c52344d87a6b89726fcca374" exitCode=0 Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.508421 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7e698574-e76d-4a75-b963-f8849155573f","Type":"ContainerDied","Data":"f22a5529ffb51f28da94010f54605e1b02374e07c52344d87a6b89726fcca374"} Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.508437 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"7e698574-e76d-4a75-b963-f8849155573f","Type":"ContainerDied","Data":"d0e9962612efd0e313c18b8a8c50edaf5bda221460a927dd2727a497f87ee8e8"} Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.508454 4739 scope.go:117] "RemoveContainer" containerID="f22a5529ffb51f28da94010f54605e1b02374e07c52344d87a6b89726fcca374" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.508609 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.581098 4739 scope.go:117] "RemoveContainer" containerID="40d1396532872efaf35e5279d0a81fae1c7f65493bf3d17c48202dfedddb24b8" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.614167 4739 scope.go:117] "RemoveContainer" containerID="f22a5529ffb51f28da94010f54605e1b02374e07c52344d87a6b89726fcca374" Oct 08 22:10:39 crc kubenswrapper[4739]: E1008 22:10:39.614663 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f22a5529ffb51f28da94010f54605e1b02374e07c52344d87a6b89726fcca374\": container with ID starting with f22a5529ffb51f28da94010f54605e1b02374e07c52344d87a6b89726fcca374 not found: ID does not exist" containerID="f22a5529ffb51f28da94010f54605e1b02374e07c52344d87a6b89726fcca374" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.614698 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f22a5529ffb51f28da94010f54605e1b02374e07c52344d87a6b89726fcca374"} err="failed to get container status \"f22a5529ffb51f28da94010f54605e1b02374e07c52344d87a6b89726fcca374\": rpc error: code = NotFound desc = could not find container \"f22a5529ffb51f28da94010f54605e1b02374e07c52344d87a6b89726fcca374\": container with ID starting with f22a5529ffb51f28da94010f54605e1b02374e07c52344d87a6b89726fcca374 not found: ID does not exist" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 
22:10:39.614721 4739 scope.go:117] "RemoveContainer" containerID="40d1396532872efaf35e5279d0a81fae1c7f65493bf3d17c48202dfedddb24b8" Oct 08 22:10:39 crc kubenswrapper[4739]: E1008 22:10:39.615261 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40d1396532872efaf35e5279d0a81fae1c7f65493bf3d17c48202dfedddb24b8\": container with ID starting with 40d1396532872efaf35e5279d0a81fae1c7f65493bf3d17c48202dfedddb24b8 not found: ID does not exist" containerID="40d1396532872efaf35e5279d0a81fae1c7f65493bf3d17c48202dfedddb24b8" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.615345 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40d1396532872efaf35e5279d0a81fae1c7f65493bf3d17c48202dfedddb24b8"} err="failed to get container status \"40d1396532872efaf35e5279d0a81fae1c7f65493bf3d17c48202dfedddb24b8\": rpc error: code = NotFound desc = could not find container \"40d1396532872efaf35e5279d0a81fae1c7f65493bf3d17c48202dfedddb24b8\": container with ID starting with 40d1396532872efaf35e5279d0a81fae1c7f65493bf3d17c48202dfedddb24b8 not found: ID does not exist" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.638257 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e698574-e76d-4a75-b963-f8849155573f-combined-ca-bundle\") pod \"7e698574-e76d-4a75-b963-f8849155573f\" (UID: \"7e698574-e76d-4a75-b963-f8849155573f\") " Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.638503 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e698574-e76d-4a75-b963-f8849155573f-config-data\") pod \"7e698574-e76d-4a75-b963-f8849155573f\" (UID: \"7e698574-e76d-4a75-b963-f8849155573f\") " Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.638666 4739 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-c4rkh\" (UniqueName: \"kubernetes.io/projected/7e698574-e76d-4a75-b963-f8849155573f-kube-api-access-c4rkh\") pod \"7e698574-e76d-4a75-b963-f8849155573f\" (UID: \"7e698574-e76d-4a75-b963-f8849155573f\") " Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.638732 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e698574-e76d-4a75-b963-f8849155573f-logs\") pod \"7e698574-e76d-4a75-b963-f8849155573f\" (UID: \"7e698574-e76d-4a75-b963-f8849155573f\") " Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.640109 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e698574-e76d-4a75-b963-f8849155573f-logs" (OuterVolumeSpecName: "logs") pod "7e698574-e76d-4a75-b963-f8849155573f" (UID: "7e698574-e76d-4a75-b963-f8849155573f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.643928 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e698574-e76d-4a75-b963-f8849155573f-kube-api-access-c4rkh" (OuterVolumeSpecName: "kube-api-access-c4rkh") pod "7e698574-e76d-4a75-b963-f8849155573f" (UID: "7e698574-e76d-4a75-b963-f8849155573f"). InnerVolumeSpecName "kube-api-access-c4rkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.680405 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e698574-e76d-4a75-b963-f8849155573f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e698574-e76d-4a75-b963-f8849155573f" (UID: "7e698574-e76d-4a75-b963-f8849155573f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.682533 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e698574-e76d-4a75-b963-f8849155573f-config-data" (OuterVolumeSpecName: "config-data") pod "7e698574-e76d-4a75-b963-f8849155573f" (UID: "7e698574-e76d-4a75-b963-f8849155573f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.741580 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e698574-e76d-4a75-b963-f8849155573f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.741622 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e698574-e76d-4a75-b963-f8849155573f-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.741637 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4rkh\" (UniqueName: \"kubernetes.io/projected/7e698574-e76d-4a75-b963-f8849155573f-kube-api-access-c4rkh\") on node \"crc\" DevicePath \"\"" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.741652 4739 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e698574-e76d-4a75-b963-f8849155573f-logs\") on node \"crc\" DevicePath \"\"" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.844445 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca39eccb-9037-47a5-82f7-e83f5f4fa01e" path="/var/lib/kubelet/pods/ca39eccb-9037-47a5-82f7-e83f5f4fa01e/volumes" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.845894 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 22:10:39 crc kubenswrapper[4739]: 
I1008 22:10:39.845919 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.856584 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.861838 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.878880 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 08 22:10:39 crc kubenswrapper[4739]: E1008 22:10:39.879302 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e698574-e76d-4a75-b963-f8849155573f" containerName="nova-api-log" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.879318 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e698574-e76d-4a75-b963-f8849155573f" containerName="nova-api-log" Oct 08 22:10:39 crc kubenswrapper[4739]: E1008 22:10:39.879337 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e698574-e76d-4a75-b963-f8849155573f" containerName="nova-api-api" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.879343 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e698574-e76d-4a75-b963-f8849155573f" containerName="nova-api-api" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.879537 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e698574-e76d-4a75-b963-f8849155573f" containerName="nova-api-log" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.879563 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e698574-e76d-4a75-b963-f8849155573f" containerName="nova-api-api" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.880759 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.886728 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.895090 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:10:39 crc kubenswrapper[4739]: I1008 22:10:39.958578 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 22:10:40 crc kubenswrapper[4739]: I1008 22:10:40.047183 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a4cb638-ca1c-4557-af16-4c60dd76ec39-config-data\") pod \"nova-api-0\" (UID: \"5a4cb638-ca1c-4557-af16-4c60dd76ec39\") " pod="openstack/nova-api-0" Oct 08 22:10:40 crc kubenswrapper[4739]: I1008 22:10:40.047763 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jlgq\" (UniqueName: \"kubernetes.io/projected/5a4cb638-ca1c-4557-af16-4c60dd76ec39-kube-api-access-8jlgq\") pod \"nova-api-0\" (UID: \"5a4cb638-ca1c-4557-af16-4c60dd76ec39\") " pod="openstack/nova-api-0" Oct 08 22:10:40 crc kubenswrapper[4739]: I1008 22:10:40.047822 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a4cb638-ca1c-4557-af16-4c60dd76ec39-logs\") pod \"nova-api-0\" (UID: \"5a4cb638-ca1c-4557-af16-4c60dd76ec39\") " pod="openstack/nova-api-0" Oct 08 22:10:40 crc kubenswrapper[4739]: I1008 22:10:40.047849 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a4cb638-ca1c-4557-af16-4c60dd76ec39-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5a4cb638-ca1c-4557-af16-4c60dd76ec39\") " pod="openstack/nova-api-0" Oct 08 
22:10:40 crc kubenswrapper[4739]: I1008 22:10:40.149582 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jlgq\" (UniqueName: \"kubernetes.io/projected/5a4cb638-ca1c-4557-af16-4c60dd76ec39-kube-api-access-8jlgq\") pod \"nova-api-0\" (UID: \"5a4cb638-ca1c-4557-af16-4c60dd76ec39\") " pod="openstack/nova-api-0" Oct 08 22:10:40 crc kubenswrapper[4739]: I1008 22:10:40.149652 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a4cb638-ca1c-4557-af16-4c60dd76ec39-logs\") pod \"nova-api-0\" (UID: \"5a4cb638-ca1c-4557-af16-4c60dd76ec39\") " pod="openstack/nova-api-0" Oct 08 22:10:40 crc kubenswrapper[4739]: I1008 22:10:40.149677 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a4cb638-ca1c-4557-af16-4c60dd76ec39-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5a4cb638-ca1c-4557-af16-4c60dd76ec39\") " pod="openstack/nova-api-0" Oct 08 22:10:40 crc kubenswrapper[4739]: I1008 22:10:40.149701 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a4cb638-ca1c-4557-af16-4c60dd76ec39-config-data\") pod \"nova-api-0\" (UID: \"5a4cb638-ca1c-4557-af16-4c60dd76ec39\") " pod="openstack/nova-api-0" Oct 08 22:10:40 crc kubenswrapper[4739]: I1008 22:10:40.150218 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a4cb638-ca1c-4557-af16-4c60dd76ec39-logs\") pod \"nova-api-0\" (UID: \"5a4cb638-ca1c-4557-af16-4c60dd76ec39\") " pod="openstack/nova-api-0" Oct 08 22:10:40 crc kubenswrapper[4739]: I1008 22:10:40.157787 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a4cb638-ca1c-4557-af16-4c60dd76ec39-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"5a4cb638-ca1c-4557-af16-4c60dd76ec39\") " pod="openstack/nova-api-0" Oct 08 22:10:40 crc kubenswrapper[4739]: I1008 22:10:40.160599 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a4cb638-ca1c-4557-af16-4c60dd76ec39-config-data\") pod \"nova-api-0\" (UID: \"5a4cb638-ca1c-4557-af16-4c60dd76ec39\") " pod="openstack/nova-api-0" Oct 08 22:10:40 crc kubenswrapper[4739]: I1008 22:10:40.175217 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jlgq\" (UniqueName: \"kubernetes.io/projected/5a4cb638-ca1c-4557-af16-4c60dd76ec39-kube-api-access-8jlgq\") pod \"nova-api-0\" (UID: \"5a4cb638-ca1c-4557-af16-4c60dd76ec39\") " pod="openstack/nova-api-0" Oct 08 22:10:40 crc kubenswrapper[4739]: I1008 22:10:40.220903 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 22:10:40 crc kubenswrapper[4739]: I1008 22:10:40.523863 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5849ea87-072a-4d43-8ee2-cb7fbce966bb","Type":"ContainerStarted","Data":"13d4d8e7948db1530acf6066ebb775336736f59bdd35196210a92a3114c096ab"} Oct 08 22:10:40 crc kubenswrapper[4739]: I1008 22:10:40.526084 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dd005429-201e-42f1-af67-f89e02d19b7a","Type":"ContainerStarted","Data":"635f276e6223d0427e1659708d677a6cdf600555bb2ff42ed4f8f4297ffb2c5a"} Oct 08 22:10:40 crc kubenswrapper[4739]: I1008 22:10:40.526175 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dd005429-201e-42f1-af67-f89e02d19b7a","Type":"ContainerStarted","Data":"e3aa7f5198e866cf39a8089653b91a54ffc676682bd95804b4ac98e23ea121ac"} Oct 08 22:10:40 crc kubenswrapper[4739]: I1008 22:10:40.549924 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-scheduler-0" podStartSLOduration=2.549898252 podStartE2EDuration="2.549898252s" podCreationTimestamp="2025-10-08 22:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:10:40.54494198 +0000 UTC m=+1340.370327740" watchObservedRunningTime="2025-10-08 22:10:40.549898252 +0000 UTC m=+1340.375284002" Oct 08 22:10:40 crc kubenswrapper[4739]: I1008 22:10:40.728693 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:10:41 crc kubenswrapper[4739]: I1008 22:10:41.065318 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 08 22:10:41 crc kubenswrapper[4739]: I1008 22:10:41.561123 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a4cb638-ca1c-4557-af16-4c60dd76ec39","Type":"ContainerStarted","Data":"2a07512189b575bf66e9bee1b24cdbc66140e6ceb36a9dd121f9496cd20033f6"} Oct 08 22:10:41 crc kubenswrapper[4739]: I1008 22:10:41.561183 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a4cb638-ca1c-4557-af16-4c60dd76ec39","Type":"ContainerStarted","Data":"c0ec13227ae3cffe87add033c773462938f8983b35641a6b35765629b85eb641"} Oct 08 22:10:41 crc kubenswrapper[4739]: I1008 22:10:41.561195 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a4cb638-ca1c-4557-af16-4c60dd76ec39","Type":"ContainerStarted","Data":"51afdcf0a2e1e1e7f71537380cfa736c12e19d8e66ab259f5ef564c0a8f1f0f2"} Oct 08 22:10:41 crc kubenswrapper[4739]: I1008 22:10:41.569681 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5849ea87-072a-4d43-8ee2-cb7fbce966bb","Type":"ContainerStarted","Data":"cbe673b2d29de2927e4015c412f16370938b8b05ea18c95dd4a104c734f206f2"} Oct 08 22:10:41 crc kubenswrapper[4739]: I1008 22:10:41.591996 4739 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.591968182 podStartE2EDuration="2.591968182s" podCreationTimestamp="2025-10-08 22:10:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:10:41.58134297 +0000 UTC m=+1341.406728740" watchObservedRunningTime="2025-10-08 22:10:41.591968182 +0000 UTC m=+1341.417353932" Oct 08 22:10:41 crc kubenswrapper[4739]: I1008 22:10:41.834108 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e698574-e76d-4a75-b963-f8849155573f" path="/var/lib/kubelet/pods/7e698574-e76d-4a75-b963-f8849155573f/volumes" Oct 08 22:10:42 crc kubenswrapper[4739]: I1008 22:10:42.588750 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5849ea87-072a-4d43-8ee2-cb7fbce966bb","Type":"ContainerStarted","Data":"475c04ff831a0a230801178f8e07cefb8c7ab9c1b69ea07bc1f0634c9daae432"} Oct 08 22:10:43 crc kubenswrapper[4739]: I1008 22:10:43.600776 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5849ea87-072a-4d43-8ee2-cb7fbce966bb","Type":"ContainerStarted","Data":"68afcc1bc4d8ca2d21cc072518c06475847bfc9f7696746f57a524722064abf5"} Oct 08 22:10:43 crc kubenswrapper[4739]: I1008 22:10:43.600981 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 22:10:43 crc kubenswrapper[4739]: I1008 22:10:43.650539 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.44844765 podStartE2EDuration="6.650518133s" podCreationTimestamp="2025-10-08 22:10:37 +0000 UTC" firstStartedPulling="2025-10-08 22:10:38.525442791 +0000 UTC m=+1338.350828551" lastFinishedPulling="2025-10-08 22:10:42.727513284 +0000 UTC m=+1342.552899034" observedRunningTime="2025-10-08 22:10:43.642718491 +0000 UTC 
m=+1343.468104251" watchObservedRunningTime="2025-10-08 22:10:43.650518133 +0000 UTC m=+1343.475903883" Oct 08 22:10:44 crc kubenswrapper[4739]: I1008 22:10:44.162813 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 08 22:10:44 crc kubenswrapper[4739]: I1008 22:10:44.822464 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 08 22:10:44 crc kubenswrapper[4739]: I1008 22:10:44.822917 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 08 22:10:45 crc kubenswrapper[4739]: I1008 22:10:45.835519 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ad230d2e-c78a-4e94-96d4-a44f02da6033" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 08 22:10:45 crc kubenswrapper[4739]: I1008 22:10:45.835538 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ad230d2e-c78a-4e94-96d4-a44f02da6033" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 22:10:47 crc kubenswrapper[4739]: I1008 22:10:47.652815 4739 generic.go:334] "Generic (PLEG): container finished" podID="b4a90847-544d-45f9-b1c1-862b13309b66" containerID="ac36c2acfc0d70ec251af785a08a99591027cca49a1d0bbc24a22c0fdc93bdff" exitCode=0 Oct 08 22:10:47 crc kubenswrapper[4739]: I1008 22:10:47.652918 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5zvg9" event={"ID":"b4a90847-544d-45f9-b1c1-862b13309b66","Type":"ContainerDied","Data":"ac36c2acfc0d70ec251af785a08a99591027cca49a1d0bbc24a22c0fdc93bdff"} Oct 08 22:10:49 crc kubenswrapper[4739]: I1008 22:10:49.089792 
4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5zvg9" Oct 08 22:10:49 crc kubenswrapper[4739]: I1008 22:10:49.163503 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 08 22:10:49 crc kubenswrapper[4739]: I1008 22:10:49.174899 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4a90847-544d-45f9-b1c1-862b13309b66-combined-ca-bundle\") pod \"b4a90847-544d-45f9-b1c1-862b13309b66\" (UID: \"b4a90847-544d-45f9-b1c1-862b13309b66\") " Oct 08 22:10:49 crc kubenswrapper[4739]: I1008 22:10:49.174985 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4a90847-544d-45f9-b1c1-862b13309b66-config-data\") pod \"b4a90847-544d-45f9-b1c1-862b13309b66\" (UID: \"b4a90847-544d-45f9-b1c1-862b13309b66\") " Oct 08 22:10:49 crc kubenswrapper[4739]: I1008 22:10:49.175040 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4a90847-544d-45f9-b1c1-862b13309b66-scripts\") pod \"b4a90847-544d-45f9-b1c1-862b13309b66\" (UID: \"b4a90847-544d-45f9-b1c1-862b13309b66\") " Oct 08 22:10:49 crc kubenswrapper[4739]: I1008 22:10:49.175244 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckctg\" (UniqueName: \"kubernetes.io/projected/b4a90847-544d-45f9-b1c1-862b13309b66-kube-api-access-ckctg\") pod \"b4a90847-544d-45f9-b1c1-862b13309b66\" (UID: \"b4a90847-544d-45f9-b1c1-862b13309b66\") " Oct 08 22:10:49 crc kubenswrapper[4739]: I1008 22:10:49.185515 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4a90847-544d-45f9-b1c1-862b13309b66-kube-api-access-ckctg" (OuterVolumeSpecName: "kube-api-access-ckctg") pod 
"b4a90847-544d-45f9-b1c1-862b13309b66" (UID: "b4a90847-544d-45f9-b1c1-862b13309b66"). InnerVolumeSpecName "kube-api-access-ckctg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:10:49 crc kubenswrapper[4739]: I1008 22:10:49.188379 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4a90847-544d-45f9-b1c1-862b13309b66-scripts" (OuterVolumeSpecName: "scripts") pod "b4a90847-544d-45f9-b1c1-862b13309b66" (UID: "b4a90847-544d-45f9-b1c1-862b13309b66"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:10:49 crc kubenswrapper[4739]: I1008 22:10:49.203255 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 08 22:10:49 crc kubenswrapper[4739]: I1008 22:10:49.207339 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4a90847-544d-45f9-b1c1-862b13309b66-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4a90847-544d-45f9-b1c1-862b13309b66" (UID: "b4a90847-544d-45f9-b1c1-862b13309b66"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:10:49 crc kubenswrapper[4739]: I1008 22:10:49.209006 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4a90847-544d-45f9-b1c1-862b13309b66-config-data" (OuterVolumeSpecName: "config-data") pod "b4a90847-544d-45f9-b1c1-862b13309b66" (UID: "b4a90847-544d-45f9-b1c1-862b13309b66"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:10:49 crc kubenswrapper[4739]: I1008 22:10:49.277807 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckctg\" (UniqueName: \"kubernetes.io/projected/b4a90847-544d-45f9-b1c1-862b13309b66-kube-api-access-ckctg\") on node \"crc\" DevicePath \"\"" Oct 08 22:10:49 crc kubenswrapper[4739]: I1008 22:10:49.277872 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4a90847-544d-45f9-b1c1-862b13309b66-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:10:49 crc kubenswrapper[4739]: I1008 22:10:49.277900 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4a90847-544d-45f9-b1c1-862b13309b66-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:10:49 crc kubenswrapper[4739]: I1008 22:10:49.277925 4739 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4a90847-544d-45f9-b1c1-862b13309b66-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:10:49 crc kubenswrapper[4739]: I1008 22:10:49.680430 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5zvg9" event={"ID":"b4a90847-544d-45f9-b1c1-862b13309b66","Type":"ContainerDied","Data":"12f23d94c64bd193bcb1f7d396ce32f9003db85dd33df9c8fd18010c18a40bf8"} Oct 08 22:10:49 crc kubenswrapper[4739]: I1008 22:10:49.680479 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12f23d94c64bd193bcb1f7d396ce32f9003db85dd33df9c8fd18010c18a40bf8" Oct 08 22:10:49 crc kubenswrapper[4739]: I1008 22:10:49.680510 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5zvg9" Oct 08 22:10:49 crc kubenswrapper[4739]: I1008 22:10:49.739780 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 08 22:10:49 crc kubenswrapper[4739]: I1008 22:10:49.802651 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 22:10:49 crc kubenswrapper[4739]: E1008 22:10:49.803139 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4a90847-544d-45f9-b1c1-862b13309b66" containerName="nova-cell1-conductor-db-sync" Oct 08 22:10:49 crc kubenswrapper[4739]: I1008 22:10:49.803181 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4a90847-544d-45f9-b1c1-862b13309b66" containerName="nova-cell1-conductor-db-sync" Oct 08 22:10:49 crc kubenswrapper[4739]: I1008 22:10:49.803454 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4a90847-544d-45f9-b1c1-862b13309b66" containerName="nova-cell1-conductor-db-sync" Oct 08 22:10:49 crc kubenswrapper[4739]: I1008 22:10:49.804206 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 08 22:10:49 crc kubenswrapper[4739]: I1008 22:10:49.807341 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 08 22:10:49 crc kubenswrapper[4739]: I1008 22:10:49.891115 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ac9d79-441d-4277-bf99-a8dc4ec2213c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"48ac9d79-441d-4277-bf99-a8dc4ec2213c\") " pod="openstack/nova-cell1-conductor-0" Oct 08 22:10:49 crc kubenswrapper[4739]: I1008 22:10:49.891319 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ac9d79-441d-4277-bf99-a8dc4ec2213c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"48ac9d79-441d-4277-bf99-a8dc4ec2213c\") " pod="openstack/nova-cell1-conductor-0" Oct 08 22:10:49 crc kubenswrapper[4739]: I1008 22:10:49.891383 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4v4q\" (UniqueName: \"kubernetes.io/projected/48ac9d79-441d-4277-bf99-a8dc4ec2213c-kube-api-access-k4v4q\") pod \"nova-cell1-conductor-0\" (UID: \"48ac9d79-441d-4277-bf99-a8dc4ec2213c\") " pod="openstack/nova-cell1-conductor-0" Oct 08 22:10:49 crc kubenswrapper[4739]: I1008 22:10:49.894109 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 22:10:49 crc kubenswrapper[4739]: I1008 22:10:49.993547 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ac9d79-441d-4277-bf99-a8dc4ec2213c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"48ac9d79-441d-4277-bf99-a8dc4ec2213c\") " pod="openstack/nova-cell1-conductor-0" Oct 08 22:10:49 crc 
kubenswrapper[4739]: I1008 22:10:49.993642 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4v4q\" (UniqueName: \"kubernetes.io/projected/48ac9d79-441d-4277-bf99-a8dc4ec2213c-kube-api-access-k4v4q\") pod \"nova-cell1-conductor-0\" (UID: \"48ac9d79-441d-4277-bf99-a8dc4ec2213c\") " pod="openstack/nova-cell1-conductor-0" Oct 08 22:10:49 crc kubenswrapper[4739]: I1008 22:10:49.993681 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ac9d79-441d-4277-bf99-a8dc4ec2213c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"48ac9d79-441d-4277-bf99-a8dc4ec2213c\") " pod="openstack/nova-cell1-conductor-0" Oct 08 22:10:49 crc kubenswrapper[4739]: I1008 22:10:49.998041 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ac9d79-441d-4277-bf99-a8dc4ec2213c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"48ac9d79-441d-4277-bf99-a8dc4ec2213c\") " pod="openstack/nova-cell1-conductor-0" Oct 08 22:10:50 crc kubenswrapper[4739]: I1008 22:10:50.011412 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ac9d79-441d-4277-bf99-a8dc4ec2213c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"48ac9d79-441d-4277-bf99-a8dc4ec2213c\") " pod="openstack/nova-cell1-conductor-0" Oct 08 22:10:50 crc kubenswrapper[4739]: I1008 22:10:50.033032 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4v4q\" (UniqueName: \"kubernetes.io/projected/48ac9d79-441d-4277-bf99-a8dc4ec2213c-kube-api-access-k4v4q\") pod \"nova-cell1-conductor-0\" (UID: \"48ac9d79-441d-4277-bf99-a8dc4ec2213c\") " pod="openstack/nova-cell1-conductor-0" Oct 08 22:10:50 crc kubenswrapper[4739]: I1008 22:10:50.180306 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 08 22:10:50 crc kubenswrapper[4739]: I1008 22:10:50.222318 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 22:10:50 crc kubenswrapper[4739]: I1008 22:10:50.222378 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 22:10:50 crc kubenswrapper[4739]: I1008 22:10:50.774592 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 22:10:51 crc kubenswrapper[4739]: I1008 22:10:51.263614 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5a4cb638-ca1c-4557-af16-4c60dd76ec39" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 22:10:51 crc kubenswrapper[4739]: I1008 22:10:51.304433 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5a4cb638-ca1c-4557-af16-4c60dd76ec39" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 22:10:51 crc kubenswrapper[4739]: I1008 22:10:51.701096 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"48ac9d79-441d-4277-bf99-a8dc4ec2213c","Type":"ContainerStarted","Data":"8eb5ed253229819ce73b11872468ad71783259f27db5dfcf842dc845acabf1d2"} Oct 08 22:10:51 crc kubenswrapper[4739]: I1008 22:10:51.701222 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"48ac9d79-441d-4277-bf99-a8dc4ec2213c","Type":"ContainerStarted","Data":"fa031f37fff04f8ddd9e36ab6dac63d895b7e4960af83f5d6e4430f8919e7fb3"} Oct 08 22:10:51 crc kubenswrapper[4739]: I1008 22:10:51.701338 4739 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 08 22:10:51 crc kubenswrapper[4739]: I1008 22:10:51.737541 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.73751832 podStartE2EDuration="2.73751832s" podCreationTimestamp="2025-10-08 22:10:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:10:51.726036677 +0000 UTC m=+1351.551422447" watchObservedRunningTime="2025-10-08 22:10:51.73751832 +0000 UTC m=+1351.562904090" Oct 08 22:10:54 crc kubenswrapper[4739]: I1008 22:10:54.829208 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 08 22:10:54 crc kubenswrapper[4739]: I1008 22:10:54.837224 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 08 22:10:54 crc kubenswrapper[4739]: I1008 22:10:54.838220 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 08 22:10:55 crc kubenswrapper[4739]: I1008 22:10:55.760646 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 08 22:11:00 crc kubenswrapper[4739]: I1008 22:11:00.228016 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 08 22:11:00 crc kubenswrapper[4739]: I1008 22:11:00.228900 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 08 22:11:00 crc kubenswrapper[4739]: I1008 22:11:00.229655 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 08 22:11:00 crc kubenswrapper[4739]: I1008 22:11:00.229709 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 08 22:11:00 crc kubenswrapper[4739]: I1008 
22:11:00.235490 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 08 22:11:00 crc kubenswrapper[4739]: I1008 22:11:00.242069 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 08 22:11:00 crc kubenswrapper[4739]: I1008 22:11:00.242196 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 08 22:11:00 crc kubenswrapper[4739]: I1008 22:11:00.485513 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-bnpvs"] Oct 08 22:11:00 crc kubenswrapper[4739]: I1008 22:11:00.487240 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" Oct 08 22:11:00 crc kubenswrapper[4739]: I1008 22:11:00.504011 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-bnpvs"] Oct 08 22:11:00 crc kubenswrapper[4739]: I1008 22:11:00.551366 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-bnpvs\" (UID: \"f35b9a8c-8e6f-4170-b1d9-20c4628d478b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" Oct 08 22:11:00 crc kubenswrapper[4739]: I1008 22:11:00.551435 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-bnpvs\" (UID: \"f35b9a8c-8e6f-4170-b1d9-20c4628d478b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" Oct 08 22:11:00 crc kubenswrapper[4739]: I1008 22:11:00.551470 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-config\") pod \"dnsmasq-dns-59cf4bdb65-bnpvs\" (UID: \"f35b9a8c-8e6f-4170-b1d9-20c4628d478b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" Oct 08 22:11:00 crc kubenswrapper[4739]: I1008 22:11:00.551510 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-bnpvs\" (UID: \"f35b9a8c-8e6f-4170-b1d9-20c4628d478b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" Oct 08 22:11:00 crc kubenswrapper[4739]: I1008 22:11:00.551589 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-bnpvs\" (UID: \"f35b9a8c-8e6f-4170-b1d9-20c4628d478b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" Oct 08 22:11:00 crc kubenswrapper[4739]: I1008 22:11:00.551631 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9b8p\" (UniqueName: \"kubernetes.io/projected/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-kube-api-access-p9b8p\") pod \"dnsmasq-dns-59cf4bdb65-bnpvs\" (UID: \"f35b9a8c-8e6f-4170-b1d9-20c4628d478b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" Oct 08 22:11:00 crc kubenswrapper[4739]: I1008 22:11:00.654753 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-bnpvs\" (UID: \"f35b9a8c-8e6f-4170-b1d9-20c4628d478b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" Oct 08 22:11:00 crc kubenswrapper[4739]: I1008 22:11:00.654867 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-bnpvs\" (UID: \"f35b9a8c-8e6f-4170-b1d9-20c4628d478b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" Oct 08 22:11:00 crc kubenswrapper[4739]: I1008 22:11:00.654928 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-config\") pod \"dnsmasq-dns-59cf4bdb65-bnpvs\" (UID: \"f35b9a8c-8e6f-4170-b1d9-20c4628d478b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" Oct 08 22:11:00 crc kubenswrapper[4739]: I1008 22:11:00.654994 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-bnpvs\" (UID: \"f35b9a8c-8e6f-4170-b1d9-20c4628d478b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" Oct 08 22:11:00 crc kubenswrapper[4739]: I1008 22:11:00.655046 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-bnpvs\" (UID: \"f35b9a8c-8e6f-4170-b1d9-20c4628d478b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" Oct 08 22:11:00 crc kubenswrapper[4739]: I1008 22:11:00.655106 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9b8p\" (UniqueName: \"kubernetes.io/projected/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-kube-api-access-p9b8p\") pod \"dnsmasq-dns-59cf4bdb65-bnpvs\" (UID: \"f35b9a8c-8e6f-4170-b1d9-20c4628d478b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" Oct 08 22:11:00 crc kubenswrapper[4739]: I1008 22:11:00.655588 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-bnpvs\" (UID: \"f35b9a8c-8e6f-4170-b1d9-20c4628d478b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" Oct 08 22:11:00 crc kubenswrapper[4739]: I1008 22:11:00.656096 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-bnpvs\" (UID: \"f35b9a8c-8e6f-4170-b1d9-20c4628d478b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" Oct 08 22:11:00 crc kubenswrapper[4739]: I1008 22:11:00.656271 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-bnpvs\" (UID: \"f35b9a8c-8e6f-4170-b1d9-20c4628d478b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" Oct 08 22:11:00 crc kubenswrapper[4739]: I1008 22:11:00.657139 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-config\") pod \"dnsmasq-dns-59cf4bdb65-bnpvs\" (UID: \"f35b9a8c-8e6f-4170-b1d9-20c4628d478b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" Oct 08 22:11:00 crc kubenswrapper[4739]: I1008 22:11:00.658633 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-bnpvs\" (UID: \"f35b9a8c-8e6f-4170-b1d9-20c4628d478b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" Oct 08 22:11:00 crc kubenswrapper[4739]: I1008 22:11:00.677104 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9b8p\" (UniqueName: \"kubernetes.io/projected/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-kube-api-access-p9b8p\") pod 
\"dnsmasq-dns-59cf4bdb65-bnpvs\" (UID: \"f35b9a8c-8e6f-4170-b1d9-20c4628d478b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" Oct 08 22:11:00 crc kubenswrapper[4739]: I1008 22:11:00.811984 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" Oct 08 22:11:01 crc kubenswrapper[4739]: I1008 22:11:01.224756 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-bnpvs"] Oct 08 22:11:01 crc kubenswrapper[4739]: W1008 22:11:01.233888 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf35b9a8c_8e6f_4170_b1d9_20c4628d478b.slice/crio-324baef53a83316e6711f830ea64e8e97adcfd52c9b58cb2c19deef117214a7a WatchSource:0}: Error finding container 324baef53a83316e6711f830ea64e8e97adcfd52c9b58cb2c19deef117214a7a: Status 404 returned error can't find the container with id 324baef53a83316e6711f830ea64e8e97adcfd52c9b58cb2c19deef117214a7a Oct 08 22:11:01 crc kubenswrapper[4739]: I1008 22:11:01.845506 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" event={"ID":"f35b9a8c-8e6f-4170-b1d9-20c4628d478b","Type":"ContainerStarted","Data":"d8618bc7ce4f96433111462db9595702f176d2448ff96c5bb0e727e1a61a1e8d"} Oct 08 22:11:01 crc kubenswrapper[4739]: I1008 22:11:01.846409 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" event={"ID":"f35b9a8c-8e6f-4170-b1d9-20c4628d478b","Type":"ContainerStarted","Data":"324baef53a83316e6711f830ea64e8e97adcfd52c9b58cb2c19deef117214a7a"} Oct 08 22:11:01 crc kubenswrapper[4739]: I1008 22:11:01.848364 4739 generic.go:334] "Generic (PLEG): container finished" podID="31a21704-ce31-4b11-aa96-887a4c5f5cd7" containerID="f83b40f773ac326aeca098563ba9d11868cb884d5400a920ca82927a65173646" exitCode=137 Oct 08 22:11:01 crc kubenswrapper[4739]: I1008 22:11:01.848471 4739 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"31a21704-ce31-4b11-aa96-887a4c5f5cd7","Type":"ContainerDied","Data":"f83b40f773ac326aeca098563ba9d11868cb884d5400a920ca82927a65173646"} Oct 08 22:11:02 crc kubenswrapper[4739]: I1008 22:11:02.077636 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:11:02 crc kubenswrapper[4739]: I1008 22:11:02.238804 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2pbl\" (UniqueName: \"kubernetes.io/projected/31a21704-ce31-4b11-aa96-887a4c5f5cd7-kube-api-access-b2pbl\") pod \"31a21704-ce31-4b11-aa96-887a4c5f5cd7\" (UID: \"31a21704-ce31-4b11-aa96-887a4c5f5cd7\") " Oct 08 22:11:02 crc kubenswrapper[4739]: I1008 22:11:02.239283 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a21704-ce31-4b11-aa96-887a4c5f5cd7-config-data\") pod \"31a21704-ce31-4b11-aa96-887a4c5f5cd7\" (UID: \"31a21704-ce31-4b11-aa96-887a4c5f5cd7\") " Oct 08 22:11:02 crc kubenswrapper[4739]: I1008 22:11:02.239454 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a21704-ce31-4b11-aa96-887a4c5f5cd7-combined-ca-bundle\") pod \"31a21704-ce31-4b11-aa96-887a4c5f5cd7\" (UID: \"31a21704-ce31-4b11-aa96-887a4c5f5cd7\") " Oct 08 22:11:02 crc kubenswrapper[4739]: I1008 22:11:02.256385 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31a21704-ce31-4b11-aa96-887a4c5f5cd7-kube-api-access-b2pbl" (OuterVolumeSpecName: "kube-api-access-b2pbl") pod "31a21704-ce31-4b11-aa96-887a4c5f5cd7" (UID: "31a21704-ce31-4b11-aa96-887a4c5f5cd7"). InnerVolumeSpecName "kube-api-access-b2pbl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:11:02 crc kubenswrapper[4739]: I1008 22:11:02.275603 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31a21704-ce31-4b11-aa96-887a4c5f5cd7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31a21704-ce31-4b11-aa96-887a4c5f5cd7" (UID: "31a21704-ce31-4b11-aa96-887a4c5f5cd7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:11:02 crc kubenswrapper[4739]: I1008 22:11:02.279226 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31a21704-ce31-4b11-aa96-887a4c5f5cd7-config-data" (OuterVolumeSpecName: "config-data") pod "31a21704-ce31-4b11-aa96-887a4c5f5cd7" (UID: "31a21704-ce31-4b11-aa96-887a4c5f5cd7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:11:02 crc kubenswrapper[4739]: I1008 22:11:02.343406 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a21704-ce31-4b11-aa96-887a4c5f5cd7-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:02 crc kubenswrapper[4739]: I1008 22:11:02.343446 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a21704-ce31-4b11-aa96-887a4c5f5cd7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:02 crc kubenswrapper[4739]: I1008 22:11:02.343459 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2pbl\" (UniqueName: \"kubernetes.io/projected/31a21704-ce31-4b11-aa96-887a4c5f5cd7-kube-api-access-b2pbl\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:02 crc kubenswrapper[4739]: I1008 22:11:02.862688 4739 generic.go:334] "Generic (PLEG): container finished" podID="f35b9a8c-8e6f-4170-b1d9-20c4628d478b" containerID="d8618bc7ce4f96433111462db9595702f176d2448ff96c5bb0e727e1a61a1e8d" 
exitCode=0 Oct 08 22:11:02 crc kubenswrapper[4739]: I1008 22:11:02.862794 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" event={"ID":"f35b9a8c-8e6f-4170-b1d9-20c4628d478b","Type":"ContainerDied","Data":"d8618bc7ce4f96433111462db9595702f176d2448ff96c5bb0e727e1a61a1e8d"} Oct 08 22:11:02 crc kubenswrapper[4739]: I1008 22:11:02.864870 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"31a21704-ce31-4b11-aa96-887a4c5f5cd7","Type":"ContainerDied","Data":"58e69bfb722cb8bc15abd5819cb3b2d280b4a0b5815801987a7ff940e9595d72"} Oct 08 22:11:02 crc kubenswrapper[4739]: I1008 22:11:02.864928 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:11:02 crc kubenswrapper[4739]: I1008 22:11:02.864954 4739 scope.go:117] "RemoveContainer" containerID="f83b40f773ac326aeca098563ba9d11868cb884d5400a920ca82927a65173646" Oct 08 22:11:02 crc kubenswrapper[4739]: I1008 22:11:02.905447 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:11:02 crc kubenswrapper[4739]: I1008 22:11:02.905888 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5849ea87-072a-4d43-8ee2-cb7fbce966bb" containerName="ceilometer-central-agent" containerID="cri-o://13d4d8e7948db1530acf6066ebb775336736f59bdd35196210a92a3114c096ab" gracePeriod=30 Oct 08 22:11:02 crc kubenswrapper[4739]: I1008 22:11:02.906848 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5849ea87-072a-4d43-8ee2-cb7fbce966bb" containerName="proxy-httpd" containerID="cri-o://68afcc1bc4d8ca2d21cc072518c06475847bfc9f7696746f57a524722064abf5" gracePeriod=30 Oct 08 22:11:02 crc kubenswrapper[4739]: I1008 22:11:02.906932 4739 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="5849ea87-072a-4d43-8ee2-cb7fbce966bb" containerName="sg-core" containerID="cri-o://475c04ff831a0a230801178f8e07cefb8c7ab9c1b69ea07bc1f0634c9daae432" gracePeriod=30 Oct 08 22:11:02 crc kubenswrapper[4739]: I1008 22:11:02.906980 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5849ea87-072a-4d43-8ee2-cb7fbce966bb" containerName="ceilometer-notification-agent" containerID="cri-o://cbe673b2d29de2927e4015c412f16370938b8b05ea18c95dd4a104c734f206f2" gracePeriod=30 Oct 08 22:11:02 crc kubenswrapper[4739]: I1008 22:11:02.923741 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="5849ea87-072a-4d43-8ee2-cb7fbce966bb" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.198:3000/\": EOF" Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.037032 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.131458 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.150413 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.150737 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5a4cb638-ca1c-4557-af16-4c60dd76ec39" containerName="nova-api-log" containerID="cri-o://c0ec13227ae3cffe87add033c773462938f8983b35641a6b35765629b85eb641" gracePeriod=30 Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.151394 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5a4cb638-ca1c-4557-af16-4c60dd76ec39" containerName="nova-api-api" containerID="cri-o://2a07512189b575bf66e9bee1b24cdbc66140e6ceb36a9dd121f9496cd20033f6" 
gracePeriod=30 Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.163585 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 22:11:03 crc kubenswrapper[4739]: E1008 22:11:03.164197 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31a21704-ce31-4b11-aa96-887a4c5f5cd7" containerName="nova-cell1-novncproxy-novncproxy" Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.164220 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="31a21704-ce31-4b11-aa96-887a4c5f5cd7" containerName="nova-cell1-novncproxy-novncproxy" Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.164412 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="31a21704-ce31-4b11-aa96-887a4c5f5cd7" containerName="nova-cell1-novncproxy-novncproxy" Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.165376 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.168394 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.168651 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.168802 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.186474 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.266625 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2183def-3ac0-434f-bca8-dfd66210d7ab-vencrypt-tls-certs\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"b2183def-3ac0-434f-bca8-dfd66210d7ab\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.266724 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh9wz\" (UniqueName: \"kubernetes.io/projected/b2183def-3ac0-434f-bca8-dfd66210d7ab-kube-api-access-qh9wz\") pod \"nova-cell1-novncproxy-0\" (UID: \"b2183def-3ac0-434f-bca8-dfd66210d7ab\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.266868 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2183def-3ac0-434f-bca8-dfd66210d7ab-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b2183def-3ac0-434f-bca8-dfd66210d7ab\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.267060 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2183def-3ac0-434f-bca8-dfd66210d7ab-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b2183def-3ac0-434f-bca8-dfd66210d7ab\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.267088 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2183def-3ac0-434f-bca8-dfd66210d7ab-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b2183def-3ac0-434f-bca8-dfd66210d7ab\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.369504 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b2183def-3ac0-434f-bca8-dfd66210d7ab-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b2183def-3ac0-434f-bca8-dfd66210d7ab\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.369742 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh9wz\" (UniqueName: \"kubernetes.io/projected/b2183def-3ac0-434f-bca8-dfd66210d7ab-kube-api-access-qh9wz\") pod \"nova-cell1-novncproxy-0\" (UID: \"b2183def-3ac0-434f-bca8-dfd66210d7ab\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.369835 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2183def-3ac0-434f-bca8-dfd66210d7ab-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b2183def-3ac0-434f-bca8-dfd66210d7ab\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.369941 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2183def-3ac0-434f-bca8-dfd66210d7ab-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b2183def-3ac0-434f-bca8-dfd66210d7ab\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.370032 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2183def-3ac0-434f-bca8-dfd66210d7ab-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b2183def-3ac0-434f-bca8-dfd66210d7ab\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.374531 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b2183def-3ac0-434f-bca8-dfd66210d7ab-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b2183def-3ac0-434f-bca8-dfd66210d7ab\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.374774 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2183def-3ac0-434f-bca8-dfd66210d7ab-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b2183def-3ac0-434f-bca8-dfd66210d7ab\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.375374 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2183def-3ac0-434f-bca8-dfd66210d7ab-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b2183def-3ac0-434f-bca8-dfd66210d7ab\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.376250 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2183def-3ac0-434f-bca8-dfd66210d7ab-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b2183def-3ac0-434f-bca8-dfd66210d7ab\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.388636 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh9wz\" (UniqueName: \"kubernetes.io/projected/b2183def-3ac0-434f-bca8-dfd66210d7ab-kube-api-access-qh9wz\") pod \"nova-cell1-novncproxy-0\" (UID: \"b2183def-3ac0-434f-bca8-dfd66210d7ab\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.523517 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.835293 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31a21704-ce31-4b11-aa96-887a4c5f5cd7" path="/var/lib/kubelet/pods/31a21704-ce31-4b11-aa96-887a4c5f5cd7/volumes" Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.880029 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" event={"ID":"f35b9a8c-8e6f-4170-b1d9-20c4628d478b","Type":"ContainerStarted","Data":"04e71fff73dcd0bef56de7306621d231ecc8010bc1bd2453b4ce6a2abe0db869"} Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.880109 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.885090 4739 generic.go:334] "Generic (PLEG): container finished" podID="5a4cb638-ca1c-4557-af16-4c60dd76ec39" containerID="c0ec13227ae3cffe87add033c773462938f8983b35641a6b35765629b85eb641" exitCode=143 Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.885189 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a4cb638-ca1c-4557-af16-4c60dd76ec39","Type":"ContainerDied","Data":"c0ec13227ae3cffe87add033c773462938f8983b35641a6b35765629b85eb641"} Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.888401 4739 generic.go:334] "Generic (PLEG): container finished" podID="5849ea87-072a-4d43-8ee2-cb7fbce966bb" containerID="68afcc1bc4d8ca2d21cc072518c06475847bfc9f7696746f57a524722064abf5" exitCode=0 Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.888421 4739 generic.go:334] "Generic (PLEG): container finished" podID="5849ea87-072a-4d43-8ee2-cb7fbce966bb" containerID="475c04ff831a0a230801178f8e07cefb8c7ab9c1b69ea07bc1f0634c9daae432" exitCode=2 Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.888431 4739 generic.go:334] "Generic (PLEG): container finished" 
podID="5849ea87-072a-4d43-8ee2-cb7fbce966bb" containerID="13d4d8e7948db1530acf6066ebb775336736f59bdd35196210a92a3114c096ab" exitCode=0 Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.888448 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5849ea87-072a-4d43-8ee2-cb7fbce966bb","Type":"ContainerDied","Data":"68afcc1bc4d8ca2d21cc072518c06475847bfc9f7696746f57a524722064abf5"} Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.888465 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5849ea87-072a-4d43-8ee2-cb7fbce966bb","Type":"ContainerDied","Data":"475c04ff831a0a230801178f8e07cefb8c7ab9c1b69ea07bc1f0634c9daae432"} Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.888479 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5849ea87-072a-4d43-8ee2-cb7fbce966bb","Type":"ContainerDied","Data":"13d4d8e7948db1530acf6066ebb775336736f59bdd35196210a92a3114c096ab"} Oct 08 22:11:03 crc kubenswrapper[4739]: I1008 22:11:03.902442 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" podStartSLOduration=3.902412842 podStartE2EDuration="3.902412842s" podCreationTimestamp="2025-10-08 22:11:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:11:03.899801949 +0000 UTC m=+1363.725187709" watchObservedRunningTime="2025-10-08 22:11:03.902412842 +0000 UTC m=+1363.727798592" Oct 08 22:11:04 crc kubenswrapper[4739]: I1008 22:11:04.036518 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 22:11:04 crc kubenswrapper[4739]: I1008 22:11:04.899653 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"b2183def-3ac0-434f-bca8-dfd66210d7ab","Type":"ContainerStarted","Data":"20a7a03c8e9947e083118fb80c7abbe673dba2be587e8a47fe656929173a2f5d"} Oct 08 22:11:04 crc kubenswrapper[4739]: I1008 22:11:04.900425 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b2183def-3ac0-434f-bca8-dfd66210d7ab","Type":"ContainerStarted","Data":"ac2da5e4a1d0dc83452c0fa9ac93f5173fb27e04ed95ba8e7b3ca52c95bd57f5"} Oct 08 22:11:04 crc kubenswrapper[4739]: I1008 22:11:04.931541 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.931524874 podStartE2EDuration="1.931524874s" podCreationTimestamp="2025-10-08 22:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:11:04.928053259 +0000 UTC m=+1364.753439009" watchObservedRunningTime="2025-10-08 22:11:04.931524874 +0000 UTC m=+1364.756910624" Oct 08 22:11:06 crc kubenswrapper[4739]: I1008 22:11:06.921582 4739 generic.go:334] "Generic (PLEG): container finished" podID="5a4cb638-ca1c-4557-af16-4c60dd76ec39" containerID="2a07512189b575bf66e9bee1b24cdbc66140e6ceb36a9dd121f9496cd20033f6" exitCode=0 Oct 08 22:11:06 crc kubenswrapper[4739]: I1008 22:11:06.921690 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a4cb638-ca1c-4557-af16-4c60dd76ec39","Type":"ContainerDied","Data":"2a07512189b575bf66e9bee1b24cdbc66140e6ceb36a9dd121f9496cd20033f6"} Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.501073 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.592227 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jlgq\" (UniqueName: \"kubernetes.io/projected/5a4cb638-ca1c-4557-af16-4c60dd76ec39-kube-api-access-8jlgq\") pod \"5a4cb638-ca1c-4557-af16-4c60dd76ec39\" (UID: \"5a4cb638-ca1c-4557-af16-4c60dd76ec39\") " Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.592309 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a4cb638-ca1c-4557-af16-4c60dd76ec39-logs\") pod \"5a4cb638-ca1c-4557-af16-4c60dd76ec39\" (UID: \"5a4cb638-ca1c-4557-af16-4c60dd76ec39\") " Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.592388 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a4cb638-ca1c-4557-af16-4c60dd76ec39-config-data\") pod \"5a4cb638-ca1c-4557-af16-4c60dd76ec39\" (UID: \"5a4cb638-ca1c-4557-af16-4c60dd76ec39\") " Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.592442 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a4cb638-ca1c-4557-af16-4c60dd76ec39-combined-ca-bundle\") pod \"5a4cb638-ca1c-4557-af16-4c60dd76ec39\" (UID: \"5a4cb638-ca1c-4557-af16-4c60dd76ec39\") " Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.593023 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a4cb638-ca1c-4557-af16-4c60dd76ec39-logs" (OuterVolumeSpecName: "logs") pod "5a4cb638-ca1c-4557-af16-4c60dd76ec39" (UID: "5a4cb638-ca1c-4557-af16-4c60dd76ec39"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.626446 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a4cb638-ca1c-4557-af16-4c60dd76ec39-kube-api-access-8jlgq" (OuterVolumeSpecName: "kube-api-access-8jlgq") pod "5a4cb638-ca1c-4557-af16-4c60dd76ec39" (UID: "5a4cb638-ca1c-4557-af16-4c60dd76ec39"). InnerVolumeSpecName "kube-api-access-8jlgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.647039 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a4cb638-ca1c-4557-af16-4c60dd76ec39-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a4cb638-ca1c-4557-af16-4c60dd76ec39" (UID: "5a4cb638-ca1c-4557-af16-4c60dd76ec39"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.647308 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a4cb638-ca1c-4557-af16-4c60dd76ec39-config-data" (OuterVolumeSpecName: "config-data") pod "5a4cb638-ca1c-4557-af16-4c60dd76ec39" (UID: "5a4cb638-ca1c-4557-af16-4c60dd76ec39"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.694703 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a4cb638-ca1c-4557-af16-4c60dd76ec39-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.694736 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jlgq\" (UniqueName: \"kubernetes.io/projected/5a4cb638-ca1c-4557-af16-4c60dd76ec39-kube-api-access-8jlgq\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.694748 4739 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a4cb638-ca1c-4557-af16-4c60dd76ec39-logs\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.694756 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a4cb638-ca1c-4557-af16-4c60dd76ec39-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.737606 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.897431 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbsjd\" (UniqueName: \"kubernetes.io/projected/5849ea87-072a-4d43-8ee2-cb7fbce966bb-kube-api-access-xbsjd\") pod \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.897622 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5849ea87-072a-4d43-8ee2-cb7fbce966bb-sg-core-conf-yaml\") pod \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.897659 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5849ea87-072a-4d43-8ee2-cb7fbce966bb-ceilometer-tls-certs\") pod \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.897708 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5849ea87-072a-4d43-8ee2-cb7fbce966bb-combined-ca-bundle\") pod \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.897736 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5849ea87-072a-4d43-8ee2-cb7fbce966bb-scripts\") pod \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.897791 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/5849ea87-072a-4d43-8ee2-cb7fbce966bb-run-httpd\") pod \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.897828 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5849ea87-072a-4d43-8ee2-cb7fbce966bb-log-httpd\") pod \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.897854 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5849ea87-072a-4d43-8ee2-cb7fbce966bb-config-data\") pod \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\" (UID: \"5849ea87-072a-4d43-8ee2-cb7fbce966bb\") " Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.898522 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5849ea87-072a-4d43-8ee2-cb7fbce966bb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5849ea87-072a-4d43-8ee2-cb7fbce966bb" (UID: "5849ea87-072a-4d43-8ee2-cb7fbce966bb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.898757 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5849ea87-072a-4d43-8ee2-cb7fbce966bb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5849ea87-072a-4d43-8ee2-cb7fbce966bb" (UID: "5849ea87-072a-4d43-8ee2-cb7fbce966bb"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.902615 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5849ea87-072a-4d43-8ee2-cb7fbce966bb-scripts" (OuterVolumeSpecName: "scripts") pod "5849ea87-072a-4d43-8ee2-cb7fbce966bb" (UID: "5849ea87-072a-4d43-8ee2-cb7fbce966bb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.902685 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5849ea87-072a-4d43-8ee2-cb7fbce966bb-kube-api-access-xbsjd" (OuterVolumeSpecName: "kube-api-access-xbsjd") pod "5849ea87-072a-4d43-8ee2-cb7fbce966bb" (UID: "5849ea87-072a-4d43-8ee2-cb7fbce966bb"). InnerVolumeSpecName "kube-api-access-xbsjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.936084 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a4cb638-ca1c-4557-af16-4c60dd76ec39","Type":"ContainerDied","Data":"51afdcf0a2e1e1e7f71537380cfa736c12e19d8e66ab259f5ef564c0a8f1f0f2"} Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.936101 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.936263 4739 scope.go:117] "RemoveContainer" containerID="2a07512189b575bf66e9bee1b24cdbc66140e6ceb36a9dd121f9496cd20033f6" Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.945501 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5849ea87-072a-4d43-8ee2-cb7fbce966bb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5849ea87-072a-4d43-8ee2-cb7fbce966bb" (UID: "5849ea87-072a-4d43-8ee2-cb7fbce966bb"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.959568 4739 generic.go:334] "Generic (PLEG): container finished" podID="5849ea87-072a-4d43-8ee2-cb7fbce966bb" containerID="cbe673b2d29de2927e4015c412f16370938b8b05ea18c95dd4a104c734f206f2" exitCode=0 Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.959654 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.959699 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5849ea87-072a-4d43-8ee2-cb7fbce966bb","Type":"ContainerDied","Data":"cbe673b2d29de2927e4015c412f16370938b8b05ea18c95dd4a104c734f206f2"} Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.960276 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5849ea87-072a-4d43-8ee2-cb7fbce966bb","Type":"ContainerDied","Data":"fb68cb0d0af06f8e9a7bf48d4da51be0dbd9be51d90739c9b52c791b2b844d6c"} Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.980515 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5849ea87-072a-4d43-8ee2-cb7fbce966bb-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "5849ea87-072a-4d43-8ee2-cb7fbce966bb" (UID: "5849ea87-072a-4d43-8ee2-cb7fbce966bb"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.981518 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:11:07 crc kubenswrapper[4739]: I1008 22:11:07.982730 4739 scope.go:117] "RemoveContainer" containerID="c0ec13227ae3cffe87add033c773462938f8983b35641a6b35765629b85eb641" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:07.999472 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.000600 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbsjd\" (UniqueName: \"kubernetes.io/projected/5849ea87-072a-4d43-8ee2-cb7fbce966bb-kube-api-access-xbsjd\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.000628 4739 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5849ea87-072a-4d43-8ee2-cb7fbce966bb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.000637 4739 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5849ea87-072a-4d43-8ee2-cb7fbce966bb-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.000645 4739 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5849ea87-072a-4d43-8ee2-cb7fbce966bb-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.000653 4739 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5849ea87-072a-4d43-8ee2-cb7fbce966bb-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.000661 4739 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/5849ea87-072a-4d43-8ee2-cb7fbce966bb-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.005258 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5849ea87-072a-4d43-8ee2-cb7fbce966bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5849ea87-072a-4d43-8ee2-cb7fbce966bb" (UID: "5849ea87-072a-4d43-8ee2-cb7fbce966bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.012006 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 08 22:11:08 crc kubenswrapper[4739]: E1008 22:11:08.012429 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a4cb638-ca1c-4557-af16-4c60dd76ec39" containerName="nova-api-log" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.012441 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a4cb638-ca1c-4557-af16-4c60dd76ec39" containerName="nova-api-log" Oct 08 22:11:08 crc kubenswrapper[4739]: E1008 22:11:08.012453 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5849ea87-072a-4d43-8ee2-cb7fbce966bb" containerName="ceilometer-central-agent" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.012459 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="5849ea87-072a-4d43-8ee2-cb7fbce966bb" containerName="ceilometer-central-agent" Oct 08 22:11:08 crc kubenswrapper[4739]: E1008 22:11:08.012467 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a4cb638-ca1c-4557-af16-4c60dd76ec39" containerName="nova-api-api" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.012473 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a4cb638-ca1c-4557-af16-4c60dd76ec39" containerName="nova-api-api" Oct 08 22:11:08 crc kubenswrapper[4739]: E1008 22:11:08.012489 4739 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="5849ea87-072a-4d43-8ee2-cb7fbce966bb" containerName="proxy-httpd" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.012495 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="5849ea87-072a-4d43-8ee2-cb7fbce966bb" containerName="proxy-httpd" Oct 08 22:11:08 crc kubenswrapper[4739]: E1008 22:11:08.012509 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5849ea87-072a-4d43-8ee2-cb7fbce966bb" containerName="sg-core" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.012515 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="5849ea87-072a-4d43-8ee2-cb7fbce966bb" containerName="sg-core" Oct 08 22:11:08 crc kubenswrapper[4739]: E1008 22:11:08.012528 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5849ea87-072a-4d43-8ee2-cb7fbce966bb" containerName="ceilometer-notification-agent" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.012534 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="5849ea87-072a-4d43-8ee2-cb7fbce966bb" containerName="ceilometer-notification-agent" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.012714 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="5849ea87-072a-4d43-8ee2-cb7fbce966bb" containerName="sg-core" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.012746 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="5849ea87-072a-4d43-8ee2-cb7fbce966bb" containerName="proxy-httpd" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.012759 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="5849ea87-072a-4d43-8ee2-cb7fbce966bb" containerName="ceilometer-notification-agent" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.012772 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a4cb638-ca1c-4557-af16-4c60dd76ec39" containerName="nova-api-api" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.012787 4739 
memory_manager.go:354] "RemoveStaleState removing state" podUID="5a4cb638-ca1c-4557-af16-4c60dd76ec39" containerName="nova-api-log" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.012804 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="5849ea87-072a-4d43-8ee2-cb7fbce966bb" containerName="ceilometer-central-agent" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.013787 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.019103 4739 scope.go:117] "RemoveContainer" containerID="68afcc1bc4d8ca2d21cc072518c06475847bfc9f7696746f57a524722064abf5" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.021338 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.021520 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.021630 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.024519 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.050184 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5849ea87-072a-4d43-8ee2-cb7fbce966bb-config-data" (OuterVolumeSpecName: "config-data") pod "5849ea87-072a-4d43-8ee2-cb7fbce966bb" (UID: "5849ea87-072a-4d43-8ee2-cb7fbce966bb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.062667 4739 scope.go:117] "RemoveContainer" containerID="475c04ff831a0a230801178f8e07cefb8c7ab9c1b69ea07bc1f0634c9daae432" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.080089 4739 scope.go:117] "RemoveContainer" containerID="cbe673b2d29de2927e4015c412f16370938b8b05ea18c95dd4a104c734f206f2" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.099520 4739 scope.go:117] "RemoveContainer" containerID="13d4d8e7948db1530acf6066ebb775336736f59bdd35196210a92a3114c096ab" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.101850 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39281e0b-f808-4cc6-a03e-15b460dab672-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"39281e0b-f808-4cc6-a03e-15b460dab672\") " pod="openstack/nova-api-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.101895 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/39281e0b-f808-4cc6-a03e-15b460dab672-public-tls-certs\") pod \"nova-api-0\" (UID: \"39281e0b-f808-4cc6-a03e-15b460dab672\") " pod="openstack/nova-api-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.101982 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/39281e0b-f808-4cc6-a03e-15b460dab672-internal-tls-certs\") pod \"nova-api-0\" (UID: \"39281e0b-f808-4cc6-a03e-15b460dab672\") " pod="openstack/nova-api-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.102012 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39281e0b-f808-4cc6-a03e-15b460dab672-config-data\") pod 
\"nova-api-0\" (UID: \"39281e0b-f808-4cc6-a03e-15b460dab672\") " pod="openstack/nova-api-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.102050 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39281e0b-f808-4cc6-a03e-15b460dab672-logs\") pod \"nova-api-0\" (UID: \"39281e0b-f808-4cc6-a03e-15b460dab672\") " pod="openstack/nova-api-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.102067 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29sv6\" (UniqueName: \"kubernetes.io/projected/39281e0b-f808-4cc6-a03e-15b460dab672-kube-api-access-29sv6\") pod \"nova-api-0\" (UID: \"39281e0b-f808-4cc6-a03e-15b460dab672\") " pod="openstack/nova-api-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.102111 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5849ea87-072a-4d43-8ee2-cb7fbce966bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.102125 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5849ea87-072a-4d43-8ee2-cb7fbce966bb-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.119667 4739 scope.go:117] "RemoveContainer" containerID="68afcc1bc4d8ca2d21cc072518c06475847bfc9f7696746f57a524722064abf5" Oct 08 22:11:08 crc kubenswrapper[4739]: E1008 22:11:08.120061 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68afcc1bc4d8ca2d21cc072518c06475847bfc9f7696746f57a524722064abf5\": container with ID starting with 68afcc1bc4d8ca2d21cc072518c06475847bfc9f7696746f57a524722064abf5 not found: ID does not exist" 
containerID="68afcc1bc4d8ca2d21cc072518c06475847bfc9f7696746f57a524722064abf5" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.120099 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68afcc1bc4d8ca2d21cc072518c06475847bfc9f7696746f57a524722064abf5"} err="failed to get container status \"68afcc1bc4d8ca2d21cc072518c06475847bfc9f7696746f57a524722064abf5\": rpc error: code = NotFound desc = could not find container \"68afcc1bc4d8ca2d21cc072518c06475847bfc9f7696746f57a524722064abf5\": container with ID starting with 68afcc1bc4d8ca2d21cc072518c06475847bfc9f7696746f57a524722064abf5 not found: ID does not exist" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.120124 4739 scope.go:117] "RemoveContainer" containerID="475c04ff831a0a230801178f8e07cefb8c7ab9c1b69ea07bc1f0634c9daae432" Oct 08 22:11:08 crc kubenswrapper[4739]: E1008 22:11:08.120451 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"475c04ff831a0a230801178f8e07cefb8c7ab9c1b69ea07bc1f0634c9daae432\": container with ID starting with 475c04ff831a0a230801178f8e07cefb8c7ab9c1b69ea07bc1f0634c9daae432 not found: ID does not exist" containerID="475c04ff831a0a230801178f8e07cefb8c7ab9c1b69ea07bc1f0634c9daae432" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.120486 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"475c04ff831a0a230801178f8e07cefb8c7ab9c1b69ea07bc1f0634c9daae432"} err="failed to get container status \"475c04ff831a0a230801178f8e07cefb8c7ab9c1b69ea07bc1f0634c9daae432\": rpc error: code = NotFound desc = could not find container \"475c04ff831a0a230801178f8e07cefb8c7ab9c1b69ea07bc1f0634c9daae432\": container with ID starting with 475c04ff831a0a230801178f8e07cefb8c7ab9c1b69ea07bc1f0634c9daae432 not found: ID does not exist" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.120501 4739 scope.go:117] 
"RemoveContainer" containerID="cbe673b2d29de2927e4015c412f16370938b8b05ea18c95dd4a104c734f206f2" Oct 08 22:11:08 crc kubenswrapper[4739]: E1008 22:11:08.120837 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbe673b2d29de2927e4015c412f16370938b8b05ea18c95dd4a104c734f206f2\": container with ID starting with cbe673b2d29de2927e4015c412f16370938b8b05ea18c95dd4a104c734f206f2 not found: ID does not exist" containerID="cbe673b2d29de2927e4015c412f16370938b8b05ea18c95dd4a104c734f206f2" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.120855 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbe673b2d29de2927e4015c412f16370938b8b05ea18c95dd4a104c734f206f2"} err="failed to get container status \"cbe673b2d29de2927e4015c412f16370938b8b05ea18c95dd4a104c734f206f2\": rpc error: code = NotFound desc = could not find container \"cbe673b2d29de2927e4015c412f16370938b8b05ea18c95dd4a104c734f206f2\": container with ID starting with cbe673b2d29de2927e4015c412f16370938b8b05ea18c95dd4a104c734f206f2 not found: ID does not exist" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.120869 4739 scope.go:117] "RemoveContainer" containerID="13d4d8e7948db1530acf6066ebb775336736f59bdd35196210a92a3114c096ab" Oct 08 22:11:08 crc kubenswrapper[4739]: E1008 22:11:08.121160 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13d4d8e7948db1530acf6066ebb775336736f59bdd35196210a92a3114c096ab\": container with ID starting with 13d4d8e7948db1530acf6066ebb775336736f59bdd35196210a92a3114c096ab not found: ID does not exist" containerID="13d4d8e7948db1530acf6066ebb775336736f59bdd35196210a92a3114c096ab" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.121194 4739 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"13d4d8e7948db1530acf6066ebb775336736f59bdd35196210a92a3114c096ab"} err="failed to get container status \"13d4d8e7948db1530acf6066ebb775336736f59bdd35196210a92a3114c096ab\": rpc error: code = NotFound desc = could not find container \"13d4d8e7948db1530acf6066ebb775336736f59bdd35196210a92a3114c096ab\": container with ID starting with 13d4d8e7948db1530acf6066ebb775336736f59bdd35196210a92a3114c096ab not found: ID does not exist" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.203382 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39281e0b-f808-4cc6-a03e-15b460dab672-logs\") pod \"nova-api-0\" (UID: \"39281e0b-f808-4cc6-a03e-15b460dab672\") " pod="openstack/nova-api-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.203419 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29sv6\" (UniqueName: \"kubernetes.io/projected/39281e0b-f808-4cc6-a03e-15b460dab672-kube-api-access-29sv6\") pod \"nova-api-0\" (UID: \"39281e0b-f808-4cc6-a03e-15b460dab672\") " pod="openstack/nova-api-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.203461 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39281e0b-f808-4cc6-a03e-15b460dab672-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"39281e0b-f808-4cc6-a03e-15b460dab672\") " pod="openstack/nova-api-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.203500 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/39281e0b-f808-4cc6-a03e-15b460dab672-public-tls-certs\") pod \"nova-api-0\" (UID: \"39281e0b-f808-4cc6-a03e-15b460dab672\") " pod="openstack/nova-api-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.203634 4739 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/39281e0b-f808-4cc6-a03e-15b460dab672-internal-tls-certs\") pod \"nova-api-0\" (UID: \"39281e0b-f808-4cc6-a03e-15b460dab672\") " pod="openstack/nova-api-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.203666 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39281e0b-f808-4cc6-a03e-15b460dab672-config-data\") pod \"nova-api-0\" (UID: \"39281e0b-f808-4cc6-a03e-15b460dab672\") " pod="openstack/nova-api-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.203908 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39281e0b-f808-4cc6-a03e-15b460dab672-logs\") pod \"nova-api-0\" (UID: \"39281e0b-f808-4cc6-a03e-15b460dab672\") " pod="openstack/nova-api-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.208487 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/39281e0b-f808-4cc6-a03e-15b460dab672-internal-tls-certs\") pod \"nova-api-0\" (UID: \"39281e0b-f808-4cc6-a03e-15b460dab672\") " pod="openstack/nova-api-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.209051 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/39281e0b-f808-4cc6-a03e-15b460dab672-public-tls-certs\") pod \"nova-api-0\" (UID: \"39281e0b-f808-4cc6-a03e-15b460dab672\") " pod="openstack/nova-api-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.209247 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39281e0b-f808-4cc6-a03e-15b460dab672-config-data\") pod \"nova-api-0\" (UID: \"39281e0b-f808-4cc6-a03e-15b460dab672\") " pod="openstack/nova-api-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.210700 
4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39281e0b-f808-4cc6-a03e-15b460dab672-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"39281e0b-f808-4cc6-a03e-15b460dab672\") " pod="openstack/nova-api-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.220941 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29sv6\" (UniqueName: \"kubernetes.io/projected/39281e0b-f808-4cc6-a03e-15b460dab672-kube-api-access-29sv6\") pod \"nova-api-0\" (UID: \"39281e0b-f808-4cc6-a03e-15b460dab672\") " pod="openstack/nova-api-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.299238 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.309421 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.323549 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.326061 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.327958 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.328520 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.330444 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.339291 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.340918 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.414476 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77a1b291-9a2d-4bec-8a71-8b021a2719ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " pod="openstack/ceilometer-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.414574 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vphtz\" (UniqueName: \"kubernetes.io/projected/77a1b291-9a2d-4bec-8a71-8b021a2719ad-kube-api-access-vphtz\") pod \"ceilometer-0\" (UID: \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " pod="openstack/ceilometer-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.414608 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77a1b291-9a2d-4bec-8a71-8b021a2719ad-log-httpd\") pod \"ceilometer-0\" (UID: \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " pod="openstack/ceilometer-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.414646 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77a1b291-9a2d-4bec-8a71-8b021a2719ad-config-data\") pod \"ceilometer-0\" (UID: \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " pod="openstack/ceilometer-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.414771 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77a1b291-9a2d-4bec-8a71-8b021a2719ad-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " pod="openstack/ceilometer-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.414802 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/77a1b291-9a2d-4bec-8a71-8b021a2719ad-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " pod="openstack/ceilometer-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.414845 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77a1b291-9a2d-4bec-8a71-8b021a2719ad-run-httpd\") pod \"ceilometer-0\" (UID: \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " pod="openstack/ceilometer-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.414887 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77a1b291-9a2d-4bec-8a71-8b021a2719ad-scripts\") pod \"ceilometer-0\" (UID: \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " pod="openstack/ceilometer-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.521037 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77a1b291-9a2d-4bec-8a71-8b021a2719ad-scripts\") pod \"ceilometer-0\" (UID: \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " pod="openstack/ceilometer-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.521167 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77a1b291-9a2d-4bec-8a71-8b021a2719ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " pod="openstack/ceilometer-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.521226 4739 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vphtz\" (UniqueName: \"kubernetes.io/projected/77a1b291-9a2d-4bec-8a71-8b021a2719ad-kube-api-access-vphtz\") pod \"ceilometer-0\" (UID: \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " pod="openstack/ceilometer-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.521258 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77a1b291-9a2d-4bec-8a71-8b021a2719ad-log-httpd\") pod \"ceilometer-0\" (UID: \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " pod="openstack/ceilometer-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.521301 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77a1b291-9a2d-4bec-8a71-8b021a2719ad-config-data\") pod \"ceilometer-0\" (UID: \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " pod="openstack/ceilometer-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.521503 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77a1b291-9a2d-4bec-8a71-8b021a2719ad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " pod="openstack/ceilometer-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.521545 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/77a1b291-9a2d-4bec-8a71-8b021a2719ad-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " pod="openstack/ceilometer-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.521609 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77a1b291-9a2d-4bec-8a71-8b021a2719ad-run-httpd\") pod \"ceilometer-0\" (UID: 
\"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " pod="openstack/ceilometer-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.522330 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77a1b291-9a2d-4bec-8a71-8b021a2719ad-run-httpd\") pod \"ceilometer-0\" (UID: \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " pod="openstack/ceilometer-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.523966 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.524380 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77a1b291-9a2d-4bec-8a71-8b021a2719ad-log-httpd\") pod \"ceilometer-0\" (UID: \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " pod="openstack/ceilometer-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.525994 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77a1b291-9a2d-4bec-8a71-8b021a2719ad-config-data\") pod \"ceilometer-0\" (UID: \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " pod="openstack/ceilometer-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.526659 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77a1b291-9a2d-4bec-8a71-8b021a2719ad-scripts\") pod \"ceilometer-0\" (UID: \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " pod="openstack/ceilometer-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.526951 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77a1b291-9a2d-4bec-8a71-8b021a2719ad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " pod="openstack/ceilometer-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 
22:11:08.528533 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/77a1b291-9a2d-4bec-8a71-8b021a2719ad-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " pod="openstack/ceilometer-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.530551 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77a1b291-9a2d-4bec-8a71-8b021a2719ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " pod="openstack/ceilometer-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.548633 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vphtz\" (UniqueName: \"kubernetes.io/projected/77a1b291-9a2d-4bec-8a71-8b021a2719ad-kube-api-access-vphtz\") pod \"ceilometer-0\" (UID: \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " pod="openstack/ceilometer-0" Oct 08 22:11:08 crc kubenswrapper[4739]: I1008 22:11:08.687249 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:11:09 crc kubenswrapper[4739]: I1008 22:11:09.187060 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:11:09 crc kubenswrapper[4739]: W1008 22:11:09.189580 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39281e0b_f808_4cc6_a03e_15b460dab672.slice/crio-dba4b9e07b45705677005c0dea6c8554496775435c2c15bc69630fa224f84d8f WatchSource:0}: Error finding container dba4b9e07b45705677005c0dea6c8554496775435c2c15bc69630fa224f84d8f: Status 404 returned error can't find the container with id dba4b9e07b45705677005c0dea6c8554496775435c2c15bc69630fa224f84d8f Oct 08 22:11:09 crc kubenswrapper[4739]: I1008 22:11:09.326277 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:11:09 crc kubenswrapper[4739]: W1008 22:11:09.332358 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77a1b291_9a2d_4bec_8a71_8b021a2719ad.slice/crio-122a037400d13b224004f90e9e22ca342d89b0b167af224de649fa6fc9bc0e7f WatchSource:0}: Error finding container 122a037400d13b224004f90e9e22ca342d89b0b167af224de649fa6fc9bc0e7f: Status 404 returned error can't find the container with id 122a037400d13b224004f90e9e22ca342d89b0b167af224de649fa6fc9bc0e7f Oct 08 22:11:09 crc kubenswrapper[4739]: I1008 22:11:09.845274 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5849ea87-072a-4d43-8ee2-cb7fbce966bb" path="/var/lib/kubelet/pods/5849ea87-072a-4d43-8ee2-cb7fbce966bb/volumes" Oct 08 22:11:09 crc kubenswrapper[4739]: I1008 22:11:09.849263 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a4cb638-ca1c-4557-af16-4c60dd76ec39" path="/var/lib/kubelet/pods/5a4cb638-ca1c-4557-af16-4c60dd76ec39/volumes" Oct 08 22:11:10 crc kubenswrapper[4739]: I1008 22:11:10.004005 4739 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"39281e0b-f808-4cc6-a03e-15b460dab672","Type":"ContainerStarted","Data":"829df51a20eb99b246ca307de82f7381fc41964ee77dcce438c798ee1a7b6181"} Oct 08 22:11:10 crc kubenswrapper[4739]: I1008 22:11:10.004084 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"39281e0b-f808-4cc6-a03e-15b460dab672","Type":"ContainerStarted","Data":"f191236bf084091cc6aee19785017d226c515f57bad1673397a00337c6bb7dfb"} Oct 08 22:11:10 crc kubenswrapper[4739]: I1008 22:11:10.004100 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"39281e0b-f808-4cc6-a03e-15b460dab672","Type":"ContainerStarted","Data":"dba4b9e07b45705677005c0dea6c8554496775435c2c15bc69630fa224f84d8f"} Oct 08 22:11:10 crc kubenswrapper[4739]: I1008 22:11:10.007980 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77a1b291-9a2d-4bec-8a71-8b021a2719ad","Type":"ContainerStarted","Data":"122a037400d13b224004f90e9e22ca342d89b0b167af224de649fa6fc9bc0e7f"} Oct 08 22:11:10 crc kubenswrapper[4739]: I1008 22:11:10.029408 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.029386995 podStartE2EDuration="3.029386995s" podCreationTimestamp="2025-10-08 22:11:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:11:10.023844639 +0000 UTC m=+1369.849230389" watchObservedRunningTime="2025-10-08 22:11:10.029386995 +0000 UTC m=+1369.854772745" Oct 08 22:11:10 crc kubenswrapper[4739]: I1008 22:11:10.815449 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" Oct 08 22:11:10 crc kubenswrapper[4739]: I1008 22:11:10.906222 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-845d6d6f59-9lc2r"] Oct 08 22:11:10 crc kubenswrapper[4739]: I1008 22:11:10.906518 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-845d6d6f59-9lc2r" podUID="6043ead9-7b57-4d77-a1f6-71c7450bb6e6" containerName="dnsmasq-dns" containerID="cri-o://f0666cfd80ec906f9b4419c0aceebf4dae2bd89ac6780f4fb59d69aa316ac9aa" gracePeriod=10 Oct 08 22:11:11 crc kubenswrapper[4739]: I1008 22:11:11.021789 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77a1b291-9a2d-4bec-8a71-8b021a2719ad","Type":"ContainerStarted","Data":"7ba16b1156d000cc2d3aecd24e58a2c9f13f09bdd67c278774486624e55a32d8"} Oct 08 22:11:11 crc kubenswrapper[4739]: I1008 22:11:11.389936 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-9lc2r" Oct 08 22:11:11 crc kubenswrapper[4739]: I1008 22:11:11.516085 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-ovsdbserver-nb\") pod \"6043ead9-7b57-4d77-a1f6-71c7450bb6e6\" (UID: \"6043ead9-7b57-4d77-a1f6-71c7450bb6e6\") " Oct 08 22:11:11 crc kubenswrapper[4739]: I1008 22:11:11.516590 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-dns-swift-storage-0\") pod \"6043ead9-7b57-4d77-a1f6-71c7450bb6e6\" (UID: \"6043ead9-7b57-4d77-a1f6-71c7450bb6e6\") " Oct 08 22:11:11 crc kubenswrapper[4739]: I1008 22:11:11.516766 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h87lk\" (UniqueName: \"kubernetes.io/projected/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-kube-api-access-h87lk\") pod \"6043ead9-7b57-4d77-a1f6-71c7450bb6e6\" (UID: \"6043ead9-7b57-4d77-a1f6-71c7450bb6e6\") " Oct 08 22:11:11 
crc kubenswrapper[4739]: I1008 22:11:11.516895 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-dns-svc\") pod \"6043ead9-7b57-4d77-a1f6-71c7450bb6e6\" (UID: \"6043ead9-7b57-4d77-a1f6-71c7450bb6e6\") " Oct 08 22:11:11 crc kubenswrapper[4739]: I1008 22:11:11.516979 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-config\") pod \"6043ead9-7b57-4d77-a1f6-71c7450bb6e6\" (UID: \"6043ead9-7b57-4d77-a1f6-71c7450bb6e6\") " Oct 08 22:11:11 crc kubenswrapper[4739]: I1008 22:11:11.517059 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-ovsdbserver-sb\") pod \"6043ead9-7b57-4d77-a1f6-71c7450bb6e6\" (UID: \"6043ead9-7b57-4d77-a1f6-71c7450bb6e6\") " Oct 08 22:11:11 crc kubenswrapper[4739]: I1008 22:11:11.526459 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-kube-api-access-h87lk" (OuterVolumeSpecName: "kube-api-access-h87lk") pod "6043ead9-7b57-4d77-a1f6-71c7450bb6e6" (UID: "6043ead9-7b57-4d77-a1f6-71c7450bb6e6"). InnerVolumeSpecName "kube-api-access-h87lk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:11:11 crc kubenswrapper[4739]: I1008 22:11:11.589904 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6043ead9-7b57-4d77-a1f6-71c7450bb6e6" (UID: "6043ead9-7b57-4d77-a1f6-71c7450bb6e6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:11:11 crc kubenswrapper[4739]: I1008 22:11:11.590601 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6043ead9-7b57-4d77-a1f6-71c7450bb6e6" (UID: "6043ead9-7b57-4d77-a1f6-71c7450bb6e6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:11:11 crc kubenswrapper[4739]: I1008 22:11:11.594696 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-config" (OuterVolumeSpecName: "config") pod "6043ead9-7b57-4d77-a1f6-71c7450bb6e6" (UID: "6043ead9-7b57-4d77-a1f6-71c7450bb6e6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:11:11 crc kubenswrapper[4739]: I1008 22:11:11.611427 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6043ead9-7b57-4d77-a1f6-71c7450bb6e6" (UID: "6043ead9-7b57-4d77-a1f6-71c7450bb6e6"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:11:11 crc kubenswrapper[4739]: I1008 22:11:11.619329 4739 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:11 crc kubenswrapper[4739]: I1008 22:11:11.619424 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:11 crc kubenswrapper[4739]: I1008 22:11:11.619486 4739 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:11 crc kubenswrapper[4739]: I1008 22:11:11.619544 4739 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:11 crc kubenswrapper[4739]: I1008 22:11:11.619600 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h87lk\" (UniqueName: \"kubernetes.io/projected/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-kube-api-access-h87lk\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:11 crc kubenswrapper[4739]: I1008 22:11:11.620069 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6043ead9-7b57-4d77-a1f6-71c7450bb6e6" (UID: "6043ead9-7b57-4d77-a1f6-71c7450bb6e6"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:11:11 crc kubenswrapper[4739]: I1008 22:11:11.721574 4739 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6043ead9-7b57-4d77-a1f6-71c7450bb6e6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:12 crc kubenswrapper[4739]: I1008 22:11:12.037305 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77a1b291-9a2d-4bec-8a71-8b021a2719ad","Type":"ContainerStarted","Data":"93fc3e95e62cc2eac72d29f14ff97eb505fc2124c9c7212d0c708899723080bf"} Oct 08 22:11:12 crc kubenswrapper[4739]: I1008 22:11:12.043032 4739 generic.go:334] "Generic (PLEG): container finished" podID="6043ead9-7b57-4d77-a1f6-71c7450bb6e6" containerID="f0666cfd80ec906f9b4419c0aceebf4dae2bd89ac6780f4fb59d69aa316ac9aa" exitCode=0 Oct 08 22:11:12 crc kubenswrapper[4739]: I1008 22:11:12.043099 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-9lc2r" event={"ID":"6043ead9-7b57-4d77-a1f6-71c7450bb6e6","Type":"ContainerDied","Data":"f0666cfd80ec906f9b4419c0aceebf4dae2bd89ac6780f4fb59d69aa316ac9aa"} Oct 08 22:11:12 crc kubenswrapper[4739]: I1008 22:11:12.043131 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-9lc2r" event={"ID":"6043ead9-7b57-4d77-a1f6-71c7450bb6e6","Type":"ContainerDied","Data":"600bc4e5023de9321d93b6440b031366a8fe2138469c4b6934329af9ccbe6b0f"} Oct 08 22:11:12 crc kubenswrapper[4739]: I1008 22:11:12.043165 4739 scope.go:117] "RemoveContainer" containerID="f0666cfd80ec906f9b4419c0aceebf4dae2bd89ac6780f4fb59d69aa316ac9aa" Oct 08 22:11:12 crc kubenswrapper[4739]: I1008 22:11:12.043107 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-9lc2r" Oct 08 22:11:12 crc kubenswrapper[4739]: I1008 22:11:12.082425 4739 scope.go:117] "RemoveContainer" containerID="b5b2ed0741c521a5f00e33b86613889e405e1446570cb2be904ad81e9361981e" Oct 08 22:11:12 crc kubenswrapper[4739]: I1008 22:11:12.082616 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-9lc2r"] Oct 08 22:11:12 crc kubenswrapper[4739]: I1008 22:11:12.091094 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-9lc2r"] Oct 08 22:11:12 crc kubenswrapper[4739]: I1008 22:11:12.110534 4739 scope.go:117] "RemoveContainer" containerID="f0666cfd80ec906f9b4419c0aceebf4dae2bd89ac6780f4fb59d69aa316ac9aa" Oct 08 22:11:12 crc kubenswrapper[4739]: E1008 22:11:12.110987 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0666cfd80ec906f9b4419c0aceebf4dae2bd89ac6780f4fb59d69aa316ac9aa\": container with ID starting with f0666cfd80ec906f9b4419c0aceebf4dae2bd89ac6780f4fb59d69aa316ac9aa not found: ID does not exist" containerID="f0666cfd80ec906f9b4419c0aceebf4dae2bd89ac6780f4fb59d69aa316ac9aa" Oct 08 22:11:12 crc kubenswrapper[4739]: I1008 22:11:12.111034 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0666cfd80ec906f9b4419c0aceebf4dae2bd89ac6780f4fb59d69aa316ac9aa"} err="failed to get container status \"f0666cfd80ec906f9b4419c0aceebf4dae2bd89ac6780f4fb59d69aa316ac9aa\": rpc error: code = NotFound desc = could not find container \"f0666cfd80ec906f9b4419c0aceebf4dae2bd89ac6780f4fb59d69aa316ac9aa\": container with ID starting with f0666cfd80ec906f9b4419c0aceebf4dae2bd89ac6780f4fb59d69aa316ac9aa not found: ID does not exist" Oct 08 22:11:12 crc kubenswrapper[4739]: I1008 22:11:12.111067 4739 scope.go:117] "RemoveContainer" containerID="b5b2ed0741c521a5f00e33b86613889e405e1446570cb2be904ad81e9361981e" Oct 08 
22:11:12 crc kubenswrapper[4739]: E1008 22:11:12.111733 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5b2ed0741c521a5f00e33b86613889e405e1446570cb2be904ad81e9361981e\": container with ID starting with b5b2ed0741c521a5f00e33b86613889e405e1446570cb2be904ad81e9361981e not found: ID does not exist" containerID="b5b2ed0741c521a5f00e33b86613889e405e1446570cb2be904ad81e9361981e" Oct 08 22:11:12 crc kubenswrapper[4739]: I1008 22:11:12.111769 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5b2ed0741c521a5f00e33b86613889e405e1446570cb2be904ad81e9361981e"} err="failed to get container status \"b5b2ed0741c521a5f00e33b86613889e405e1446570cb2be904ad81e9361981e\": rpc error: code = NotFound desc = could not find container \"b5b2ed0741c521a5f00e33b86613889e405e1446570cb2be904ad81e9361981e\": container with ID starting with b5b2ed0741c521a5f00e33b86613889e405e1446570cb2be904ad81e9361981e not found: ID does not exist" Oct 08 22:11:13 crc kubenswrapper[4739]: I1008 22:11:13.056637 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77a1b291-9a2d-4bec-8a71-8b021a2719ad","Type":"ContainerStarted","Data":"4f52e5470a8177fdd316732017c286b81282bf8f4cde193f2c6c22070ff6fe0f"} Oct 08 22:11:13 crc kubenswrapper[4739]: I1008 22:11:13.524297 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:11:13 crc kubenswrapper[4739]: I1008 22:11:13.565203 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:11:13 crc kubenswrapper[4739]: I1008 22:11:13.833566 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6043ead9-7b57-4d77-a1f6-71c7450bb6e6" path="/var/lib/kubelet/pods/6043ead9-7b57-4d77-a1f6-71c7450bb6e6/volumes" Oct 08 22:11:14 crc kubenswrapper[4739]: 
I1008 22:11:14.095084 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:11:14 crc kubenswrapper[4739]: I1008 22:11:14.298642 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-fjj6g"] Oct 08 22:11:14 crc kubenswrapper[4739]: E1008 22:11:14.299060 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6043ead9-7b57-4d77-a1f6-71c7450bb6e6" containerName="init" Oct 08 22:11:14 crc kubenswrapper[4739]: I1008 22:11:14.299077 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="6043ead9-7b57-4d77-a1f6-71c7450bb6e6" containerName="init" Oct 08 22:11:14 crc kubenswrapper[4739]: E1008 22:11:14.299104 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6043ead9-7b57-4d77-a1f6-71c7450bb6e6" containerName="dnsmasq-dns" Oct 08 22:11:14 crc kubenswrapper[4739]: I1008 22:11:14.299111 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="6043ead9-7b57-4d77-a1f6-71c7450bb6e6" containerName="dnsmasq-dns" Oct 08 22:11:14 crc kubenswrapper[4739]: I1008 22:11:14.299335 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="6043ead9-7b57-4d77-a1f6-71c7450bb6e6" containerName="dnsmasq-dns" Oct 08 22:11:14 crc kubenswrapper[4739]: I1008 22:11:14.299978 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fjj6g" Oct 08 22:11:14 crc kubenswrapper[4739]: I1008 22:11:14.302756 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 08 22:11:14 crc kubenswrapper[4739]: I1008 22:11:14.304670 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 08 22:11:14 crc kubenswrapper[4739]: I1008 22:11:14.314133 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-fjj6g"] Oct 08 22:11:14 crc kubenswrapper[4739]: I1008 22:11:14.484929 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e27fb5e7-8d1e-4afb-8a82-86b4619bf330-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fjj6g\" (UID: \"e27fb5e7-8d1e-4afb-8a82-86b4619bf330\") " pod="openstack/nova-cell1-cell-mapping-fjj6g" Oct 08 22:11:14 crc kubenswrapper[4739]: I1008 22:11:14.484998 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e27fb5e7-8d1e-4afb-8a82-86b4619bf330-config-data\") pod \"nova-cell1-cell-mapping-fjj6g\" (UID: \"e27fb5e7-8d1e-4afb-8a82-86b4619bf330\") " pod="openstack/nova-cell1-cell-mapping-fjj6g" Oct 08 22:11:14 crc kubenswrapper[4739]: I1008 22:11:14.485106 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e27fb5e7-8d1e-4afb-8a82-86b4619bf330-scripts\") pod \"nova-cell1-cell-mapping-fjj6g\" (UID: \"e27fb5e7-8d1e-4afb-8a82-86b4619bf330\") " pod="openstack/nova-cell1-cell-mapping-fjj6g" Oct 08 22:11:14 crc kubenswrapper[4739]: I1008 22:11:14.485223 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dz29\" (UniqueName: 
\"kubernetes.io/projected/e27fb5e7-8d1e-4afb-8a82-86b4619bf330-kube-api-access-7dz29\") pod \"nova-cell1-cell-mapping-fjj6g\" (UID: \"e27fb5e7-8d1e-4afb-8a82-86b4619bf330\") " pod="openstack/nova-cell1-cell-mapping-fjj6g" Oct 08 22:11:14 crc kubenswrapper[4739]: I1008 22:11:14.586556 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e27fb5e7-8d1e-4afb-8a82-86b4619bf330-scripts\") pod \"nova-cell1-cell-mapping-fjj6g\" (UID: \"e27fb5e7-8d1e-4afb-8a82-86b4619bf330\") " pod="openstack/nova-cell1-cell-mapping-fjj6g" Oct 08 22:11:14 crc kubenswrapper[4739]: I1008 22:11:14.587064 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dz29\" (UniqueName: \"kubernetes.io/projected/e27fb5e7-8d1e-4afb-8a82-86b4619bf330-kube-api-access-7dz29\") pod \"nova-cell1-cell-mapping-fjj6g\" (UID: \"e27fb5e7-8d1e-4afb-8a82-86b4619bf330\") " pod="openstack/nova-cell1-cell-mapping-fjj6g" Oct 08 22:11:14 crc kubenswrapper[4739]: I1008 22:11:14.587092 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e27fb5e7-8d1e-4afb-8a82-86b4619bf330-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fjj6g\" (UID: \"e27fb5e7-8d1e-4afb-8a82-86b4619bf330\") " pod="openstack/nova-cell1-cell-mapping-fjj6g" Oct 08 22:11:14 crc kubenswrapper[4739]: I1008 22:11:14.587127 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e27fb5e7-8d1e-4afb-8a82-86b4619bf330-config-data\") pod \"nova-cell1-cell-mapping-fjj6g\" (UID: \"e27fb5e7-8d1e-4afb-8a82-86b4619bf330\") " pod="openstack/nova-cell1-cell-mapping-fjj6g" Oct 08 22:11:14 crc kubenswrapper[4739]: I1008 22:11:14.592303 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e27fb5e7-8d1e-4afb-8a82-86b4619bf330-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fjj6g\" (UID: \"e27fb5e7-8d1e-4afb-8a82-86b4619bf330\") " pod="openstack/nova-cell1-cell-mapping-fjj6g" Oct 08 22:11:14 crc kubenswrapper[4739]: I1008 22:11:14.593393 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e27fb5e7-8d1e-4afb-8a82-86b4619bf330-config-data\") pod \"nova-cell1-cell-mapping-fjj6g\" (UID: \"e27fb5e7-8d1e-4afb-8a82-86b4619bf330\") " pod="openstack/nova-cell1-cell-mapping-fjj6g" Oct 08 22:11:14 crc kubenswrapper[4739]: I1008 22:11:14.603860 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e27fb5e7-8d1e-4afb-8a82-86b4619bf330-scripts\") pod \"nova-cell1-cell-mapping-fjj6g\" (UID: \"e27fb5e7-8d1e-4afb-8a82-86b4619bf330\") " pod="openstack/nova-cell1-cell-mapping-fjj6g" Oct 08 22:11:14 crc kubenswrapper[4739]: I1008 22:11:14.607555 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dz29\" (UniqueName: \"kubernetes.io/projected/e27fb5e7-8d1e-4afb-8a82-86b4619bf330-kube-api-access-7dz29\") pod \"nova-cell1-cell-mapping-fjj6g\" (UID: \"e27fb5e7-8d1e-4afb-8a82-86b4619bf330\") " pod="openstack/nova-cell1-cell-mapping-fjj6g" Oct 08 22:11:14 crc kubenswrapper[4739]: I1008 22:11:14.634885 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fjj6g" Oct 08 22:11:15 crc kubenswrapper[4739]: I1008 22:11:15.083388 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77a1b291-9a2d-4bec-8a71-8b021a2719ad","Type":"ContainerStarted","Data":"c8eb787ce3795dc1347ef5637e8f7e5548c60e8e923599bdd8ece170807c67d2"} Oct 08 22:11:15 crc kubenswrapper[4739]: I1008 22:11:15.084005 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 22:11:15 crc kubenswrapper[4739]: I1008 22:11:15.124187 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.090737716 podStartE2EDuration="7.12412339s" podCreationTimestamp="2025-10-08 22:11:08 +0000 UTC" firstStartedPulling="2025-10-08 22:11:09.338524284 +0000 UTC m=+1369.163910054" lastFinishedPulling="2025-10-08 22:11:14.371909978 +0000 UTC m=+1374.197295728" observedRunningTime="2025-10-08 22:11:15.115494458 +0000 UTC m=+1374.940880238" watchObservedRunningTime="2025-10-08 22:11:15.12412339 +0000 UTC m=+1374.949509150" Oct 08 22:11:15 crc kubenswrapper[4739]: I1008 22:11:15.184069 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-fjj6g"] Oct 08 22:11:16 crc kubenswrapper[4739]: I1008 22:11:16.104250 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fjj6g" event={"ID":"e27fb5e7-8d1e-4afb-8a82-86b4619bf330","Type":"ContainerStarted","Data":"b10729a7a20b72d90fb78c97dba4e910de8cb486ab4fbe792405c20d9906207c"} Oct 08 22:11:16 crc kubenswrapper[4739]: I1008 22:11:16.105459 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fjj6g" event={"ID":"e27fb5e7-8d1e-4afb-8a82-86b4619bf330","Type":"ContainerStarted","Data":"92d3fbf439b11d176cb00c24500b40b3fd8f2a25a992927c5d21f37399c9d28b"} Oct 08 22:11:16 crc kubenswrapper[4739]: I1008 22:11:16.138180 
4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-fjj6g" podStartSLOduration=2.138131768 podStartE2EDuration="2.138131768s" podCreationTimestamp="2025-10-08 22:11:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:11:16.126485681 +0000 UTC m=+1375.951871461" watchObservedRunningTime="2025-10-08 22:11:16.138131768 +0000 UTC m=+1375.963517518" Oct 08 22:11:17 crc kubenswrapper[4739]: I1008 22:11:17.701315 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qjhd4"] Oct 08 22:11:17 crc kubenswrapper[4739]: I1008 22:11:17.703298 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qjhd4" Oct 08 22:11:17 crc kubenswrapper[4739]: I1008 22:11:17.708979 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qjhd4"] Oct 08 22:11:17 crc kubenswrapper[4739]: I1008 22:11:17.787020 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2117cdda-2200-4b19-8675-ea827df2f6d0-catalog-content\") pod \"redhat-operators-qjhd4\" (UID: \"2117cdda-2200-4b19-8675-ea827df2f6d0\") " pod="openshift-marketplace/redhat-operators-qjhd4" Oct 08 22:11:17 crc kubenswrapper[4739]: I1008 22:11:17.787479 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2117cdda-2200-4b19-8675-ea827df2f6d0-utilities\") pod \"redhat-operators-qjhd4\" (UID: \"2117cdda-2200-4b19-8675-ea827df2f6d0\") " pod="openshift-marketplace/redhat-operators-qjhd4" Oct 08 22:11:17 crc kubenswrapper[4739]: I1008 22:11:17.787528 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-6pslt\" (UniqueName: \"kubernetes.io/projected/2117cdda-2200-4b19-8675-ea827df2f6d0-kube-api-access-6pslt\") pod \"redhat-operators-qjhd4\" (UID: \"2117cdda-2200-4b19-8675-ea827df2f6d0\") " pod="openshift-marketplace/redhat-operators-qjhd4" Oct 08 22:11:17 crc kubenswrapper[4739]: I1008 22:11:17.890307 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2117cdda-2200-4b19-8675-ea827df2f6d0-utilities\") pod \"redhat-operators-qjhd4\" (UID: \"2117cdda-2200-4b19-8675-ea827df2f6d0\") " pod="openshift-marketplace/redhat-operators-qjhd4" Oct 08 22:11:17 crc kubenswrapper[4739]: I1008 22:11:17.890778 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2117cdda-2200-4b19-8675-ea827df2f6d0-utilities\") pod \"redhat-operators-qjhd4\" (UID: \"2117cdda-2200-4b19-8675-ea827df2f6d0\") " pod="openshift-marketplace/redhat-operators-qjhd4" Oct 08 22:11:17 crc kubenswrapper[4739]: I1008 22:11:17.890975 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pslt\" (UniqueName: \"kubernetes.io/projected/2117cdda-2200-4b19-8675-ea827df2f6d0-kube-api-access-6pslt\") pod \"redhat-operators-qjhd4\" (UID: \"2117cdda-2200-4b19-8675-ea827df2f6d0\") " pod="openshift-marketplace/redhat-operators-qjhd4" Oct 08 22:11:17 crc kubenswrapper[4739]: I1008 22:11:17.891190 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2117cdda-2200-4b19-8675-ea827df2f6d0-catalog-content\") pod \"redhat-operators-qjhd4\" (UID: \"2117cdda-2200-4b19-8675-ea827df2f6d0\") " pod="openshift-marketplace/redhat-operators-qjhd4" Oct 08 22:11:17 crc kubenswrapper[4739]: I1008 22:11:17.891692 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2117cdda-2200-4b19-8675-ea827df2f6d0-catalog-content\") pod \"redhat-operators-qjhd4\" (UID: \"2117cdda-2200-4b19-8675-ea827df2f6d0\") " pod="openshift-marketplace/redhat-operators-qjhd4" Oct 08 22:11:17 crc kubenswrapper[4739]: I1008 22:11:17.917529 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pslt\" (UniqueName: \"kubernetes.io/projected/2117cdda-2200-4b19-8675-ea827df2f6d0-kube-api-access-6pslt\") pod \"redhat-operators-qjhd4\" (UID: \"2117cdda-2200-4b19-8675-ea827df2f6d0\") " pod="openshift-marketplace/redhat-operators-qjhd4" Oct 08 22:11:18 crc kubenswrapper[4739]: I1008 22:11:18.024385 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qjhd4" Oct 08 22:11:18 crc kubenswrapper[4739]: I1008 22:11:18.340299 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 22:11:18 crc kubenswrapper[4739]: I1008 22:11:18.340722 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 22:11:18 crc kubenswrapper[4739]: W1008 22:11:18.498200 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2117cdda_2200_4b19_8675_ea827df2f6d0.slice/crio-771aa68566456cbbaea674fe4099eb54631d0dded3fb5d603d8e5a0fad593c29 WatchSource:0}: Error finding container 771aa68566456cbbaea674fe4099eb54631d0dded3fb5d603d8e5a0fad593c29: Status 404 returned error can't find the container with id 771aa68566456cbbaea674fe4099eb54631d0dded3fb5d603d8e5a0fad593c29 Oct 08 22:11:18 crc kubenswrapper[4739]: I1008 22:11:18.509612 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qjhd4"] Oct 08 22:11:19 crc kubenswrapper[4739]: I1008 22:11:19.152645 4739 generic.go:334] "Generic (PLEG): container finished" 
podID="2117cdda-2200-4b19-8675-ea827df2f6d0" containerID="73a6f7a9962a626504122745685916ff7bc308bbd30b71f0668103efc9c804dc" exitCode=0 Oct 08 22:11:19 crc kubenswrapper[4739]: I1008 22:11:19.152871 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qjhd4" event={"ID":"2117cdda-2200-4b19-8675-ea827df2f6d0","Type":"ContainerDied","Data":"73a6f7a9962a626504122745685916ff7bc308bbd30b71f0668103efc9c804dc"} Oct 08 22:11:19 crc kubenswrapper[4739]: I1008 22:11:19.153178 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qjhd4" event={"ID":"2117cdda-2200-4b19-8675-ea827df2f6d0","Type":"ContainerStarted","Data":"771aa68566456cbbaea674fe4099eb54631d0dded3fb5d603d8e5a0fad593c29"} Oct 08 22:11:19 crc kubenswrapper[4739]: I1008 22:11:19.359322 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="39281e0b-f808-4cc6-a03e-15b460dab672" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 08 22:11:19 crc kubenswrapper[4739]: I1008 22:11:19.359676 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="39281e0b-f808-4cc6-a03e-15b460dab672" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 08 22:11:21 crc kubenswrapper[4739]: I1008 22:11:21.182110 4739 generic.go:334] "Generic (PLEG): container finished" podID="e27fb5e7-8d1e-4afb-8a82-86b4619bf330" containerID="b10729a7a20b72d90fb78c97dba4e910de8cb486ab4fbe792405c20d9906207c" exitCode=0 Oct 08 22:11:21 crc kubenswrapper[4739]: I1008 22:11:21.182177 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fjj6g" 
event={"ID":"e27fb5e7-8d1e-4afb-8a82-86b4619bf330","Type":"ContainerDied","Data":"b10729a7a20b72d90fb78c97dba4e910de8cb486ab4fbe792405c20d9906207c"} Oct 08 22:11:22 crc kubenswrapper[4739]: I1008 22:11:22.206057 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qjhd4" event={"ID":"2117cdda-2200-4b19-8675-ea827df2f6d0","Type":"ContainerStarted","Data":"86d6312cd4f884f35eca199936d09d143bf11b4fe97d2a2bcc14d00a42605156"} Oct 08 22:11:22 crc kubenswrapper[4739]: I1008 22:11:22.782844 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fjj6g" Oct 08 22:11:22 crc kubenswrapper[4739]: I1008 22:11:22.920044 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e27fb5e7-8d1e-4afb-8a82-86b4619bf330-config-data\") pod \"e27fb5e7-8d1e-4afb-8a82-86b4619bf330\" (UID: \"e27fb5e7-8d1e-4afb-8a82-86b4619bf330\") " Oct 08 22:11:22 crc kubenswrapper[4739]: I1008 22:11:22.920122 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dz29\" (UniqueName: \"kubernetes.io/projected/e27fb5e7-8d1e-4afb-8a82-86b4619bf330-kube-api-access-7dz29\") pod \"e27fb5e7-8d1e-4afb-8a82-86b4619bf330\" (UID: \"e27fb5e7-8d1e-4afb-8a82-86b4619bf330\") " Oct 08 22:11:22 crc kubenswrapper[4739]: I1008 22:11:22.920169 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e27fb5e7-8d1e-4afb-8a82-86b4619bf330-scripts\") pod \"e27fb5e7-8d1e-4afb-8a82-86b4619bf330\" (UID: \"e27fb5e7-8d1e-4afb-8a82-86b4619bf330\") " Oct 08 22:11:22 crc kubenswrapper[4739]: I1008 22:11:22.920282 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e27fb5e7-8d1e-4afb-8a82-86b4619bf330-combined-ca-bundle\") pod 
\"e27fb5e7-8d1e-4afb-8a82-86b4619bf330\" (UID: \"e27fb5e7-8d1e-4afb-8a82-86b4619bf330\") " Oct 08 22:11:22 crc kubenswrapper[4739]: I1008 22:11:22.929116 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e27fb5e7-8d1e-4afb-8a82-86b4619bf330-scripts" (OuterVolumeSpecName: "scripts") pod "e27fb5e7-8d1e-4afb-8a82-86b4619bf330" (UID: "e27fb5e7-8d1e-4afb-8a82-86b4619bf330"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:11:22 crc kubenswrapper[4739]: I1008 22:11:22.943537 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e27fb5e7-8d1e-4afb-8a82-86b4619bf330-kube-api-access-7dz29" (OuterVolumeSpecName: "kube-api-access-7dz29") pod "e27fb5e7-8d1e-4afb-8a82-86b4619bf330" (UID: "e27fb5e7-8d1e-4afb-8a82-86b4619bf330"). InnerVolumeSpecName "kube-api-access-7dz29". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:11:22 crc kubenswrapper[4739]: I1008 22:11:22.953089 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e27fb5e7-8d1e-4afb-8a82-86b4619bf330-config-data" (OuterVolumeSpecName: "config-data") pod "e27fb5e7-8d1e-4afb-8a82-86b4619bf330" (UID: "e27fb5e7-8d1e-4afb-8a82-86b4619bf330"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:11:22 crc kubenswrapper[4739]: I1008 22:11:22.959475 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e27fb5e7-8d1e-4afb-8a82-86b4619bf330-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e27fb5e7-8d1e-4afb-8a82-86b4619bf330" (UID: "e27fb5e7-8d1e-4afb-8a82-86b4619bf330"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:11:23 crc kubenswrapper[4739]: I1008 22:11:23.022993 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e27fb5e7-8d1e-4afb-8a82-86b4619bf330-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:23 crc kubenswrapper[4739]: I1008 22:11:23.023029 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e27fb5e7-8d1e-4afb-8a82-86b4619bf330-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:23 crc kubenswrapper[4739]: I1008 22:11:23.023040 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dz29\" (UniqueName: \"kubernetes.io/projected/e27fb5e7-8d1e-4afb-8a82-86b4619bf330-kube-api-access-7dz29\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:23 crc kubenswrapper[4739]: I1008 22:11:23.023052 4739 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e27fb5e7-8d1e-4afb-8a82-86b4619bf330-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:23 crc kubenswrapper[4739]: I1008 22:11:23.220693 4739 generic.go:334] "Generic (PLEG): container finished" podID="2117cdda-2200-4b19-8675-ea827df2f6d0" containerID="86d6312cd4f884f35eca199936d09d143bf11b4fe97d2a2bcc14d00a42605156" exitCode=0 Oct 08 22:11:23 crc kubenswrapper[4739]: I1008 22:11:23.220822 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qjhd4" event={"ID":"2117cdda-2200-4b19-8675-ea827df2f6d0","Type":"ContainerDied","Data":"86d6312cd4f884f35eca199936d09d143bf11b4fe97d2a2bcc14d00a42605156"} Oct 08 22:11:23 crc kubenswrapper[4739]: I1008 22:11:23.227891 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fjj6g" 
event={"ID":"e27fb5e7-8d1e-4afb-8a82-86b4619bf330","Type":"ContainerDied","Data":"92d3fbf439b11d176cb00c24500b40b3fd8f2a25a992927c5d21f37399c9d28b"} Oct 08 22:11:23 crc kubenswrapper[4739]: I1008 22:11:23.227936 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fjj6g" Oct 08 22:11:23 crc kubenswrapper[4739]: I1008 22:11:23.227965 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92d3fbf439b11d176cb00c24500b40b3fd8f2a25a992927c5d21f37399c9d28b" Oct 08 22:11:23 crc kubenswrapper[4739]: I1008 22:11:23.401045 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 22:11:23 crc kubenswrapper[4739]: I1008 22:11:23.401495 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="dd005429-201e-42f1-af67-f89e02d19b7a" containerName="nova-scheduler-scheduler" containerID="cri-o://635f276e6223d0427e1659708d677a6cdf600555bb2ff42ed4f8f4297ffb2c5a" gracePeriod=30 Oct 08 22:11:23 crc kubenswrapper[4739]: I1008 22:11:23.414785 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:11:23 crc kubenswrapper[4739]: I1008 22:11:23.415084 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="39281e0b-f808-4cc6-a03e-15b460dab672" containerName="nova-api-log" containerID="cri-o://f191236bf084091cc6aee19785017d226c515f57bad1673397a00337c6bb7dfb" gracePeriod=30 Oct 08 22:11:23 crc kubenswrapper[4739]: I1008 22:11:23.415260 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="39281e0b-f808-4cc6-a03e-15b460dab672" containerName="nova-api-api" containerID="cri-o://829df51a20eb99b246ca307de82f7381fc41964ee77dcce438c798ee1a7b6181" gracePeriod=30 Oct 08 22:11:23 crc kubenswrapper[4739]: I1008 22:11:23.441895 4739 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:11:23 crc kubenswrapper[4739]: I1008 22:11:23.442216 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ad230d2e-c78a-4e94-96d4-a44f02da6033" containerName="nova-metadata-log" containerID="cri-o://da07b2427aa2cc09b0011d49395fcc427dbc782ca4cfcd25a0ea77952286f3b8" gracePeriod=30 Oct 08 22:11:23 crc kubenswrapper[4739]: I1008 22:11:23.442301 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ad230d2e-c78a-4e94-96d4-a44f02da6033" containerName="nova-metadata-metadata" containerID="cri-o://6a4220f817984c8e7cccd3d58a15d7e4884187b7ae3d2eb44430bb7d142a196d" gracePeriod=30 Oct 08 22:11:24 crc kubenswrapper[4739]: E1008 22:11:24.165537 4739 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="635f276e6223d0427e1659708d677a6cdf600555bb2ff42ed4f8f4297ffb2c5a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 08 22:11:24 crc kubenswrapper[4739]: E1008 22:11:24.168215 4739 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="635f276e6223d0427e1659708d677a6cdf600555bb2ff42ed4f8f4297ffb2c5a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 08 22:11:24 crc kubenswrapper[4739]: E1008 22:11:24.169694 4739 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="635f276e6223d0427e1659708d677a6cdf600555bb2ff42ed4f8f4297ffb2c5a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 08 22:11:24 crc kubenswrapper[4739]: E1008 
22:11:24.169781 4739 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="dd005429-201e-42f1-af67-f89e02d19b7a" containerName="nova-scheduler-scheduler" Oct 08 22:11:24 crc kubenswrapper[4739]: I1008 22:11:24.239490 4739 generic.go:334] "Generic (PLEG): container finished" podID="ad230d2e-c78a-4e94-96d4-a44f02da6033" containerID="da07b2427aa2cc09b0011d49395fcc427dbc782ca4cfcd25a0ea77952286f3b8" exitCode=143 Oct 08 22:11:24 crc kubenswrapper[4739]: I1008 22:11:24.239574 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ad230d2e-c78a-4e94-96d4-a44f02da6033","Type":"ContainerDied","Data":"da07b2427aa2cc09b0011d49395fcc427dbc782ca4cfcd25a0ea77952286f3b8"} Oct 08 22:11:24 crc kubenswrapper[4739]: I1008 22:11:24.242650 4739 generic.go:334] "Generic (PLEG): container finished" podID="39281e0b-f808-4cc6-a03e-15b460dab672" containerID="f191236bf084091cc6aee19785017d226c515f57bad1673397a00337c6bb7dfb" exitCode=143 Oct 08 22:11:24 crc kubenswrapper[4739]: I1008 22:11:24.242684 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"39281e0b-f808-4cc6-a03e-15b460dab672","Type":"ContainerDied","Data":"f191236bf084091cc6aee19785017d226c515f57bad1673397a00337c6bb7dfb"} Oct 08 22:11:24 crc kubenswrapper[4739]: I1008 22:11:24.245500 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qjhd4" event={"ID":"2117cdda-2200-4b19-8675-ea827df2f6d0","Type":"ContainerStarted","Data":"56c4853968af27f6f9eb1f1ae5187b918c48d21aabad36e5ffe80a6eced1764b"} Oct 08 22:11:24 crc kubenswrapper[4739]: I1008 22:11:24.275473 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qjhd4" podStartSLOduration=2.686100255 
podStartE2EDuration="7.275446125s" podCreationTimestamp="2025-10-08 22:11:17 +0000 UTC" firstStartedPulling="2025-10-08 22:11:19.156175076 +0000 UTC m=+1378.981560826" lastFinishedPulling="2025-10-08 22:11:23.745520946 +0000 UTC m=+1383.570906696" observedRunningTime="2025-10-08 22:11:24.264015863 +0000 UTC m=+1384.089401613" watchObservedRunningTime="2025-10-08 22:11:24.275446125 +0000 UTC m=+1384.100831875" Oct 08 22:11:26 crc kubenswrapper[4739]: I1008 22:11:26.590756 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ad230d2e-c78a-4e94-96d4-a44f02da6033" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": read tcp 10.217.0.2:35856->10.217.0.197:8775: read: connection reset by peer" Oct 08 22:11:26 crc kubenswrapper[4739]: I1008 22:11:26.590760 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ad230d2e-c78a-4e94-96d4-a44f02da6033" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": read tcp 10.217.0.2:35844->10.217.0.197:8775: read: connection reset by peer" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.081062 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.216387 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/39281e0b-f808-4cc6-a03e-15b460dab672-public-tls-certs\") pod \"39281e0b-f808-4cc6-a03e-15b460dab672\" (UID: \"39281e0b-f808-4cc6-a03e-15b460dab672\") " Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.216441 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/39281e0b-f808-4cc6-a03e-15b460dab672-internal-tls-certs\") pod \"39281e0b-f808-4cc6-a03e-15b460dab672\" (UID: \"39281e0b-f808-4cc6-a03e-15b460dab672\") " Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.216485 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39281e0b-f808-4cc6-a03e-15b460dab672-combined-ca-bundle\") pod \"39281e0b-f808-4cc6-a03e-15b460dab672\" (UID: \"39281e0b-f808-4cc6-a03e-15b460dab672\") " Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.216533 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39281e0b-f808-4cc6-a03e-15b460dab672-logs\") pod \"39281e0b-f808-4cc6-a03e-15b460dab672\" (UID: \"39281e0b-f808-4cc6-a03e-15b460dab672\") " Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.216585 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39281e0b-f808-4cc6-a03e-15b460dab672-config-data\") pod \"39281e0b-f808-4cc6-a03e-15b460dab672\" (UID: \"39281e0b-f808-4cc6-a03e-15b460dab672\") " Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.216634 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29sv6\" (UniqueName: 
\"kubernetes.io/projected/39281e0b-f808-4cc6-a03e-15b460dab672-kube-api-access-29sv6\") pod \"39281e0b-f808-4cc6-a03e-15b460dab672\" (UID: \"39281e0b-f808-4cc6-a03e-15b460dab672\") " Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.216898 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39281e0b-f808-4cc6-a03e-15b460dab672-logs" (OuterVolumeSpecName: "logs") pod "39281e0b-f808-4cc6-a03e-15b460dab672" (UID: "39281e0b-f808-4cc6-a03e-15b460dab672"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.217517 4739 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39281e0b-f808-4cc6-a03e-15b460dab672-logs\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.221903 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39281e0b-f808-4cc6-a03e-15b460dab672-kube-api-access-29sv6" (OuterVolumeSpecName: "kube-api-access-29sv6") pod "39281e0b-f808-4cc6-a03e-15b460dab672" (UID: "39281e0b-f808-4cc6-a03e-15b460dab672"). InnerVolumeSpecName "kube-api-access-29sv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.243630 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39281e0b-f808-4cc6-a03e-15b460dab672-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39281e0b-f808-4cc6-a03e-15b460dab672" (UID: "39281e0b-f808-4cc6-a03e-15b460dab672"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.243721 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39281e0b-f808-4cc6-a03e-15b460dab672-config-data" (OuterVolumeSpecName: "config-data") pod "39281e0b-f808-4cc6-a03e-15b460dab672" (UID: "39281e0b-f808-4cc6-a03e-15b460dab672"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.266567 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39281e0b-f808-4cc6-a03e-15b460dab672-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "39281e0b-f808-4cc6-a03e-15b460dab672" (UID: "39281e0b-f808-4cc6-a03e-15b460dab672"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.277995 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39281e0b-f808-4cc6-a03e-15b460dab672-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "39281e0b-f808-4cc6-a03e-15b460dab672" (UID: "39281e0b-f808-4cc6-a03e-15b460dab672"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.302686 4739 generic.go:334] "Generic (PLEG): container finished" podID="ad230d2e-c78a-4e94-96d4-a44f02da6033" containerID="6a4220f817984c8e7cccd3d58a15d7e4884187b7ae3d2eb44430bb7d142a196d" exitCode=0 Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.302740 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ad230d2e-c78a-4e94-96d4-a44f02da6033","Type":"ContainerDied","Data":"6a4220f817984c8e7cccd3d58a15d7e4884187b7ae3d2eb44430bb7d142a196d"} Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.305068 4739 generic.go:334] "Generic (PLEG): container finished" podID="39281e0b-f808-4cc6-a03e-15b460dab672" containerID="829df51a20eb99b246ca307de82f7381fc41964ee77dcce438c798ee1a7b6181" exitCode=0 Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.305123 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"39281e0b-f808-4cc6-a03e-15b460dab672","Type":"ContainerDied","Data":"829df51a20eb99b246ca307de82f7381fc41964ee77dcce438c798ee1a7b6181"} Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.305177 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"39281e0b-f808-4cc6-a03e-15b460dab672","Type":"ContainerDied","Data":"dba4b9e07b45705677005c0dea6c8554496775435c2c15bc69630fa224f84d8f"} Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.305201 4739 scope.go:117] "RemoveContainer" containerID="829df51a20eb99b246ca307de82f7381fc41964ee77dcce438c798ee1a7b6181" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.305365 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.318837 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39281e0b-f808-4cc6-a03e-15b460dab672-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.318863 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29sv6\" (UniqueName: \"kubernetes.io/projected/39281e0b-f808-4cc6-a03e-15b460dab672-kube-api-access-29sv6\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.318876 4739 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/39281e0b-f808-4cc6-a03e-15b460dab672-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.318886 4739 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/39281e0b-f808-4cc6-a03e-15b460dab672-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.318894 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39281e0b-f808-4cc6-a03e-15b460dab672-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.337390 4739 scope.go:117] "RemoveContainer" containerID="f191236bf084091cc6aee19785017d226c515f57bad1673397a00337c6bb7dfb" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.343465 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.355385 4739 scope.go:117] "RemoveContainer" containerID="829df51a20eb99b246ca307de82f7381fc41964ee77dcce438c798ee1a7b6181" Oct 08 22:11:27 crc kubenswrapper[4739]: E1008 22:11:27.355966 4739 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"829df51a20eb99b246ca307de82f7381fc41964ee77dcce438c798ee1a7b6181\": container with ID starting with 829df51a20eb99b246ca307de82f7381fc41964ee77dcce438c798ee1a7b6181 not found: ID does not exist" containerID="829df51a20eb99b246ca307de82f7381fc41964ee77dcce438c798ee1a7b6181" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.356006 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"829df51a20eb99b246ca307de82f7381fc41964ee77dcce438c798ee1a7b6181"} err="failed to get container status \"829df51a20eb99b246ca307de82f7381fc41964ee77dcce438c798ee1a7b6181\": rpc error: code = NotFound desc = could not find container \"829df51a20eb99b246ca307de82f7381fc41964ee77dcce438c798ee1a7b6181\": container with ID starting with 829df51a20eb99b246ca307de82f7381fc41964ee77dcce438c798ee1a7b6181 not found: ID does not exist" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.356033 4739 scope.go:117] "RemoveContainer" containerID="f191236bf084091cc6aee19785017d226c515f57bad1673397a00337c6bb7dfb" Oct 08 22:11:27 crc kubenswrapper[4739]: E1008 22:11:27.356472 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f191236bf084091cc6aee19785017d226c515f57bad1673397a00337c6bb7dfb\": container with ID starting with f191236bf084091cc6aee19785017d226c515f57bad1673397a00337c6bb7dfb not found: ID does not exist" containerID="f191236bf084091cc6aee19785017d226c515f57bad1673397a00337c6bb7dfb" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.356512 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f191236bf084091cc6aee19785017d226c515f57bad1673397a00337c6bb7dfb"} err="failed to get container status \"f191236bf084091cc6aee19785017d226c515f57bad1673397a00337c6bb7dfb\": rpc error: code = NotFound desc = could 
not find container \"f191236bf084091cc6aee19785017d226c515f57bad1673397a00337c6bb7dfb\": container with ID starting with f191236bf084091cc6aee19785017d226c515f57bad1673397a00337c6bb7dfb not found: ID does not exist" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.371769 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.385972 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 08 22:11:27 crc kubenswrapper[4739]: E1008 22:11:27.386658 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e27fb5e7-8d1e-4afb-8a82-86b4619bf330" containerName="nova-manage" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.386677 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="e27fb5e7-8d1e-4afb-8a82-86b4619bf330" containerName="nova-manage" Oct 08 22:11:27 crc kubenswrapper[4739]: E1008 22:11:27.386704 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39281e0b-f808-4cc6-a03e-15b460dab672" containerName="nova-api-log" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.386712 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="39281e0b-f808-4cc6-a03e-15b460dab672" containerName="nova-api-log" Oct 08 22:11:27 crc kubenswrapper[4739]: E1008 22:11:27.386727 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39281e0b-f808-4cc6-a03e-15b460dab672" containerName="nova-api-api" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.386734 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="39281e0b-f808-4cc6-a03e-15b460dab672" containerName="nova-api-api" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.387030 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="39281e0b-f808-4cc6-a03e-15b460dab672" containerName="nova-api-log" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.387060 4739 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="39281e0b-f808-4cc6-a03e-15b460dab672" containerName="nova-api-api" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.387075 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="e27fb5e7-8d1e-4afb-8a82-86b4619bf330" containerName="nova-manage" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.388502 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.391044 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.391217 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.391254 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.400823 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.420234 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16fa8c46-856e-465c-bd60-fb13b76e5079-public-tls-certs\") pod \"nova-api-0\" (UID: \"16fa8c46-856e-465c-bd60-fb13b76e5079\") " pod="openstack/nova-api-0" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.420280 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16fa8c46-856e-465c-bd60-fb13b76e5079-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"16fa8c46-856e-465c-bd60-fb13b76e5079\") " pod="openstack/nova-api-0" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.420323 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16fa8c46-856e-465c-bd60-fb13b76e5079-logs\") pod \"nova-api-0\" (UID: \"16fa8c46-856e-465c-bd60-fb13b76e5079\") " pod="openstack/nova-api-0" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.420355 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8gxn\" (UniqueName: \"kubernetes.io/projected/16fa8c46-856e-465c-bd60-fb13b76e5079-kube-api-access-h8gxn\") pod \"nova-api-0\" (UID: \"16fa8c46-856e-465c-bd60-fb13b76e5079\") " pod="openstack/nova-api-0" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.420406 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16fa8c46-856e-465c-bd60-fb13b76e5079-config-data\") pod \"nova-api-0\" (UID: \"16fa8c46-856e-465c-bd60-fb13b76e5079\") " pod="openstack/nova-api-0" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.420461 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16fa8c46-856e-465c-bd60-fb13b76e5079-internal-tls-certs\") pod \"nova-api-0\" (UID: \"16fa8c46-856e-465c-bd60-fb13b76e5079\") " pod="openstack/nova-api-0" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.522432 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16fa8c46-856e-465c-bd60-fb13b76e5079-internal-tls-certs\") pod \"nova-api-0\" (UID: \"16fa8c46-856e-465c-bd60-fb13b76e5079\") " pod="openstack/nova-api-0" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.522534 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16fa8c46-856e-465c-bd60-fb13b76e5079-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"16fa8c46-856e-465c-bd60-fb13b76e5079\") " pod="openstack/nova-api-0" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.522575 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16fa8c46-856e-465c-bd60-fb13b76e5079-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"16fa8c46-856e-465c-bd60-fb13b76e5079\") " pod="openstack/nova-api-0" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.522621 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16fa8c46-856e-465c-bd60-fb13b76e5079-logs\") pod \"nova-api-0\" (UID: \"16fa8c46-856e-465c-bd60-fb13b76e5079\") " pod="openstack/nova-api-0" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.522647 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8gxn\" (UniqueName: \"kubernetes.io/projected/16fa8c46-856e-465c-bd60-fb13b76e5079-kube-api-access-h8gxn\") pod \"nova-api-0\" (UID: \"16fa8c46-856e-465c-bd60-fb13b76e5079\") " pod="openstack/nova-api-0" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.522674 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16fa8c46-856e-465c-bd60-fb13b76e5079-config-data\") pod \"nova-api-0\" (UID: \"16fa8c46-856e-465c-bd60-fb13b76e5079\") " pod="openstack/nova-api-0" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.523313 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16fa8c46-856e-465c-bd60-fb13b76e5079-logs\") pod \"nova-api-0\" (UID: \"16fa8c46-856e-465c-bd60-fb13b76e5079\") " pod="openstack/nova-api-0" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.527855 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/16fa8c46-856e-465c-bd60-fb13b76e5079-public-tls-certs\") pod \"nova-api-0\" (UID: \"16fa8c46-856e-465c-bd60-fb13b76e5079\") " pod="openstack/nova-api-0" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.527863 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16fa8c46-856e-465c-bd60-fb13b76e5079-internal-tls-certs\") pod \"nova-api-0\" (UID: \"16fa8c46-856e-465c-bd60-fb13b76e5079\") " pod="openstack/nova-api-0" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.529628 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16fa8c46-856e-465c-bd60-fb13b76e5079-config-data\") pod \"nova-api-0\" (UID: \"16fa8c46-856e-465c-bd60-fb13b76e5079\") " pod="openstack/nova-api-0" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.529827 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16fa8c46-856e-465c-bd60-fb13b76e5079-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"16fa8c46-856e-465c-bd60-fb13b76e5079\") " pod="openstack/nova-api-0" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.541208 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8gxn\" (UniqueName: \"kubernetes.io/projected/16fa8c46-856e-465c-bd60-fb13b76e5079-kube-api-access-h8gxn\") pod \"nova-api-0\" (UID: \"16fa8c46-856e-465c-bd60-fb13b76e5079\") " pod="openstack/nova-api-0" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.716326 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 22:11:27 crc kubenswrapper[4739]: I1008 22:11:27.840084 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39281e0b-f808-4cc6-a03e-15b460dab672" path="/var/lib/kubelet/pods/39281e0b-f808-4cc6-a03e-15b460dab672/volumes" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.030612 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qjhd4" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.030886 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qjhd4" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.256063 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.319718 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ad230d2e-c78a-4e94-96d4-a44f02da6033","Type":"ContainerDied","Data":"ad16e4c75a2b01918f5a49d9e6cd89f77d432dbcf68bf3de130382bc7a1f5637"} Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.319752 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.320282 4739 scope.go:117] "RemoveContainer" containerID="6a4220f817984c8e7cccd3d58a15d7e4884187b7ae3d2eb44430bb7d142a196d" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.363950 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.382102 4739 scope.go:117] "RemoveContainer" containerID="da07b2427aa2cc09b0011d49395fcc427dbc782ca4cfcd25a0ea77952286f3b8" Oct 08 22:11:28 crc kubenswrapper[4739]: W1008 22:11:28.386512 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16fa8c46_856e_465c_bd60_fb13b76e5079.slice/crio-b07bc725bc8a6c51438554857f2740e420a219bca6d72b61d6a008c65b8f33fe WatchSource:0}: Error finding container b07bc725bc8a6c51438554857f2740e420a219bca6d72b61d6a008c65b8f33fe: Status 404 returned error can't find the container with id b07bc725bc8a6c51438554857f2740e420a219bca6d72b61d6a008c65b8f33fe Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.438708 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad230d2e-c78a-4e94-96d4-a44f02da6033-combined-ca-bundle\") pod \"ad230d2e-c78a-4e94-96d4-a44f02da6033\" (UID: \"ad230d2e-c78a-4e94-96d4-a44f02da6033\") " Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.438846 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad230d2e-c78a-4e94-96d4-a44f02da6033-nova-metadata-tls-certs\") pod \"ad230d2e-c78a-4e94-96d4-a44f02da6033\" (UID: \"ad230d2e-c78a-4e94-96d4-a44f02da6033\") " Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.438896 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ad230d2e-c78a-4e94-96d4-a44f02da6033-config-data\") pod \"ad230d2e-c78a-4e94-96d4-a44f02da6033\" (UID: \"ad230d2e-c78a-4e94-96d4-a44f02da6033\") " Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.438988 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txf4c\" (UniqueName: \"kubernetes.io/projected/ad230d2e-c78a-4e94-96d4-a44f02da6033-kube-api-access-txf4c\") pod \"ad230d2e-c78a-4e94-96d4-a44f02da6033\" (UID: \"ad230d2e-c78a-4e94-96d4-a44f02da6033\") " Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.439069 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad230d2e-c78a-4e94-96d4-a44f02da6033-logs\") pod \"ad230d2e-c78a-4e94-96d4-a44f02da6033\" (UID: \"ad230d2e-c78a-4e94-96d4-a44f02da6033\") " Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.448119 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad230d2e-c78a-4e94-96d4-a44f02da6033-logs" (OuterVolumeSpecName: "logs") pod "ad230d2e-c78a-4e94-96d4-a44f02da6033" (UID: "ad230d2e-c78a-4e94-96d4-a44f02da6033"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.453161 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad230d2e-c78a-4e94-96d4-a44f02da6033-kube-api-access-txf4c" (OuterVolumeSpecName: "kube-api-access-txf4c") pod "ad230d2e-c78a-4e94-96d4-a44f02da6033" (UID: "ad230d2e-c78a-4e94-96d4-a44f02da6033"). InnerVolumeSpecName "kube-api-access-txf4c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.473529 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad230d2e-c78a-4e94-96d4-a44f02da6033-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad230d2e-c78a-4e94-96d4-a44f02da6033" (UID: "ad230d2e-c78a-4e94-96d4-a44f02da6033"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.477754 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad230d2e-c78a-4e94-96d4-a44f02da6033-config-data" (OuterVolumeSpecName: "config-data") pod "ad230d2e-c78a-4e94-96d4-a44f02da6033" (UID: "ad230d2e-c78a-4e94-96d4-a44f02da6033"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.513399 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad230d2e-c78a-4e94-96d4-a44f02da6033-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ad230d2e-c78a-4e94-96d4-a44f02da6033" (UID: "ad230d2e-c78a-4e94-96d4-a44f02da6033"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.542413 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad230d2e-c78a-4e94-96d4-a44f02da6033-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.542481 4739 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad230d2e-c78a-4e94-96d4-a44f02da6033-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.542512 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad230d2e-c78a-4e94-96d4-a44f02da6033-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.542535 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txf4c\" (UniqueName: \"kubernetes.io/projected/ad230d2e-c78a-4e94-96d4-a44f02da6033-kube-api-access-txf4c\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.542556 4739 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad230d2e-c78a-4e94-96d4-a44f02da6033-logs\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.665533 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.690035 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.708962 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:11:28 crc kubenswrapper[4739]: E1008 22:11:28.709422 4739 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ad230d2e-c78a-4e94-96d4-a44f02da6033" containerName="nova-metadata-log" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.709442 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad230d2e-c78a-4e94-96d4-a44f02da6033" containerName="nova-metadata-log" Oct 08 22:11:28 crc kubenswrapper[4739]: E1008 22:11:28.709475 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad230d2e-c78a-4e94-96d4-a44f02da6033" containerName="nova-metadata-metadata" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.709486 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad230d2e-c78a-4e94-96d4-a44f02da6033" containerName="nova-metadata-metadata" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.709723 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad230d2e-c78a-4e94-96d4-a44f02da6033" containerName="nova-metadata-metadata" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.709745 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad230d2e-c78a-4e94-96d4-a44f02da6033" containerName="nova-metadata-log" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.710895 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.715320 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.716785 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.730728 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.746463 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/99220499-d612-49e9-a7f1-622280a12221-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"99220499-d612-49e9-a7f1-622280a12221\") " pod="openstack/nova-metadata-0" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.746651 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99220499-d612-49e9-a7f1-622280a12221-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"99220499-d612-49e9-a7f1-622280a12221\") " pod="openstack/nova-metadata-0" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.746802 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99220499-d612-49e9-a7f1-622280a12221-config-data\") pod \"nova-metadata-0\" (UID: \"99220499-d612-49e9-a7f1-622280a12221\") " pod="openstack/nova-metadata-0" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.746846 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99220499-d612-49e9-a7f1-622280a12221-logs\") pod \"nova-metadata-0\" (UID: 
\"99220499-d612-49e9-a7f1-622280a12221\") " pod="openstack/nova-metadata-0" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.747006 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zwq9\" (UniqueName: \"kubernetes.io/projected/99220499-d612-49e9-a7f1-622280a12221-kube-api-access-5zwq9\") pod \"nova-metadata-0\" (UID: \"99220499-d612-49e9-a7f1-622280a12221\") " pod="openstack/nova-metadata-0" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.850358 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/99220499-d612-49e9-a7f1-622280a12221-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"99220499-d612-49e9-a7f1-622280a12221\") " pod="openstack/nova-metadata-0" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.851099 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99220499-d612-49e9-a7f1-622280a12221-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"99220499-d612-49e9-a7f1-622280a12221\") " pod="openstack/nova-metadata-0" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.851164 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99220499-d612-49e9-a7f1-622280a12221-config-data\") pod \"nova-metadata-0\" (UID: \"99220499-d612-49e9-a7f1-622280a12221\") " pod="openstack/nova-metadata-0" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.851193 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99220499-d612-49e9-a7f1-622280a12221-logs\") pod \"nova-metadata-0\" (UID: \"99220499-d612-49e9-a7f1-622280a12221\") " pod="openstack/nova-metadata-0" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.851481 4739 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zwq9\" (UniqueName: \"kubernetes.io/projected/99220499-d612-49e9-a7f1-622280a12221-kube-api-access-5zwq9\") pod \"nova-metadata-0\" (UID: \"99220499-d612-49e9-a7f1-622280a12221\") " pod="openstack/nova-metadata-0" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.852452 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99220499-d612-49e9-a7f1-622280a12221-logs\") pod \"nova-metadata-0\" (UID: \"99220499-d612-49e9-a7f1-622280a12221\") " pod="openstack/nova-metadata-0" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.856557 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/99220499-d612-49e9-a7f1-622280a12221-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"99220499-d612-49e9-a7f1-622280a12221\") " pod="openstack/nova-metadata-0" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.856879 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99220499-d612-49e9-a7f1-622280a12221-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"99220499-d612-49e9-a7f1-622280a12221\") " pod="openstack/nova-metadata-0" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.860840 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99220499-d612-49e9-a7f1-622280a12221-config-data\") pod \"nova-metadata-0\" (UID: \"99220499-d612-49e9-a7f1-622280a12221\") " pod="openstack/nova-metadata-0" Oct 08 22:11:28 crc kubenswrapper[4739]: I1008 22:11:28.869267 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zwq9\" (UniqueName: \"kubernetes.io/projected/99220499-d612-49e9-a7f1-622280a12221-kube-api-access-5zwq9\") pod 
\"nova-metadata-0\" (UID: \"99220499-d612-49e9-a7f1-622280a12221\") " pod="openstack/nova-metadata-0" Oct 08 22:11:29 crc kubenswrapper[4739]: I1008 22:11:29.036518 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 22:11:29 crc kubenswrapper[4739]: I1008 22:11:29.118423 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qjhd4" podUID="2117cdda-2200-4b19-8675-ea827df2f6d0" containerName="registry-server" probeResult="failure" output=< Oct 08 22:11:29 crc kubenswrapper[4739]: timeout: failed to connect service ":50051" within 1s Oct 08 22:11:29 crc kubenswrapper[4739]: > Oct 08 22:11:29 crc kubenswrapper[4739]: E1008 22:11:29.166487 4739 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="635f276e6223d0427e1659708d677a6cdf600555bb2ff42ed4f8f4297ffb2c5a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 08 22:11:29 crc kubenswrapper[4739]: E1008 22:11:29.168849 4739 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="635f276e6223d0427e1659708d677a6cdf600555bb2ff42ed4f8f4297ffb2c5a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 08 22:11:29 crc kubenswrapper[4739]: E1008 22:11:29.170292 4739 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="635f276e6223d0427e1659708d677a6cdf600555bb2ff42ed4f8f4297ffb2c5a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 08 22:11:29 crc kubenswrapper[4739]: E1008 22:11:29.170345 4739 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc 
= command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="dd005429-201e-42f1-af67-f89e02d19b7a" containerName="nova-scheduler-scheduler" Oct 08 22:11:29 crc kubenswrapper[4739]: I1008 22:11:29.332192 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"16fa8c46-856e-465c-bd60-fb13b76e5079","Type":"ContainerStarted","Data":"8ab63d27569cc32a7067f6f23cfd8138771cf09a283bc7b29fee5d5952a76720"} Oct 08 22:11:29 crc kubenswrapper[4739]: I1008 22:11:29.332298 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"16fa8c46-856e-465c-bd60-fb13b76e5079","Type":"ContainerStarted","Data":"b07bc725bc8a6c51438554857f2740e420a219bca6d72b61d6a008c65b8f33fe"} Oct 08 22:11:29 crc kubenswrapper[4739]: I1008 22:11:29.468099 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:11:29 crc kubenswrapper[4739]: I1008 22:11:29.832522 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad230d2e-c78a-4e94-96d4-a44f02da6033" path="/var/lib/kubelet/pods/ad230d2e-c78a-4e94-96d4-a44f02da6033/volumes" Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.263728 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.280910 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd005429-201e-42f1-af67-f89e02d19b7a-config-data\") pod \"dd005429-201e-42f1-af67-f89e02d19b7a\" (UID: \"dd005429-201e-42f1-af67-f89e02d19b7a\") " Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.281064 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd005429-201e-42f1-af67-f89e02d19b7a-combined-ca-bundle\") pod \"dd005429-201e-42f1-af67-f89e02d19b7a\" (UID: \"dd005429-201e-42f1-af67-f89e02d19b7a\") " Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.281131 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvs8g\" (UniqueName: \"kubernetes.io/projected/dd005429-201e-42f1-af67-f89e02d19b7a-kube-api-access-xvs8g\") pod \"dd005429-201e-42f1-af67-f89e02d19b7a\" (UID: \"dd005429-201e-42f1-af67-f89e02d19b7a\") " Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.295063 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd005429-201e-42f1-af67-f89e02d19b7a-kube-api-access-xvs8g" (OuterVolumeSpecName: "kube-api-access-xvs8g") pod "dd005429-201e-42f1-af67-f89e02d19b7a" (UID: "dd005429-201e-42f1-af67-f89e02d19b7a"). InnerVolumeSpecName "kube-api-access-xvs8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.310957 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd005429-201e-42f1-af67-f89e02d19b7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd005429-201e-42f1-af67-f89e02d19b7a" (UID: "dd005429-201e-42f1-af67-f89e02d19b7a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.312747 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd005429-201e-42f1-af67-f89e02d19b7a-config-data" (OuterVolumeSpecName: "config-data") pod "dd005429-201e-42f1-af67-f89e02d19b7a" (UID: "dd005429-201e-42f1-af67-f89e02d19b7a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.344191 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"16fa8c46-856e-465c-bd60-fb13b76e5079","Type":"ContainerStarted","Data":"dd9989a36d53230653d967f80a986a20511f6face1155c9de42fd9ce79d64bdb"} Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.348791 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"99220499-d612-49e9-a7f1-622280a12221","Type":"ContainerStarted","Data":"4021f4de6bf872bfa52268c48a457bb0d63426632df6a7145170af03976a52db"} Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.348840 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"99220499-d612-49e9-a7f1-622280a12221","Type":"ContainerStarted","Data":"021dc1a9c8d247bcb674c2cd9c774efc0714b705ee8ffc6cbda3bbce72169a1a"} Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.348849 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"99220499-d612-49e9-a7f1-622280a12221","Type":"ContainerStarted","Data":"29c35c677666b989ec8ccd733df20dfe176a2af6969aaba97231ea362054dff0"} Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.350417 4739 generic.go:334] "Generic (PLEG): container finished" podID="dd005429-201e-42f1-af67-f89e02d19b7a" containerID="635f276e6223d0427e1659708d677a6cdf600555bb2ff42ed4f8f4297ffb2c5a" exitCode=0 Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 
22:11:30.350462 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dd005429-201e-42f1-af67-f89e02d19b7a","Type":"ContainerDied","Data":"635f276e6223d0427e1659708d677a6cdf600555bb2ff42ed4f8f4297ffb2c5a"} Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.350484 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dd005429-201e-42f1-af67-f89e02d19b7a","Type":"ContainerDied","Data":"e3aa7f5198e866cf39a8089653b91a54ffc676682bd95804b4ac98e23ea121ac"} Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.350510 4739 scope.go:117] "RemoveContainer" containerID="635f276e6223d0427e1659708d677a6cdf600555bb2ff42ed4f8f4297ffb2c5a" Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.350594 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.384094 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd005429-201e-42f1-af67-f89e02d19b7a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.384187 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvs8g\" (UniqueName: \"kubernetes.io/projected/dd005429-201e-42f1-af67-f89e02d19b7a-kube-api-access-xvs8g\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.384202 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd005429-201e-42f1-af67-f89e02d19b7a-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.388521 4739 scope.go:117] "RemoveContainer" containerID="635f276e6223d0427e1659708d677a6cdf600555bb2ff42ed4f8f4297ffb2c5a" Oct 08 22:11:30 crc kubenswrapper[4739]: E1008 22:11:30.389118 4739 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"635f276e6223d0427e1659708d677a6cdf600555bb2ff42ed4f8f4297ffb2c5a\": container with ID starting with 635f276e6223d0427e1659708d677a6cdf600555bb2ff42ed4f8f4297ffb2c5a not found: ID does not exist" containerID="635f276e6223d0427e1659708d677a6cdf600555bb2ff42ed4f8f4297ffb2c5a" Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.389172 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"635f276e6223d0427e1659708d677a6cdf600555bb2ff42ed4f8f4297ffb2c5a"} err="failed to get container status \"635f276e6223d0427e1659708d677a6cdf600555bb2ff42ed4f8f4297ffb2c5a\": rpc error: code = NotFound desc = could not find container \"635f276e6223d0427e1659708d677a6cdf600555bb2ff42ed4f8f4297ffb2c5a\": container with ID starting with 635f276e6223d0427e1659708d677a6cdf600555bb2ff42ed4f8f4297ffb2c5a not found: ID does not exist" Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.391846 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.391826906 podStartE2EDuration="3.391826906s" podCreationTimestamp="2025-10-08 22:11:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:11:30.368681187 +0000 UTC m=+1390.194066937" watchObservedRunningTime="2025-10-08 22:11:30.391826906 +0000 UTC m=+1390.217212646" Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.393889 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.399502 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.423520 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 22:11:30 crc 
kubenswrapper[4739]: E1008 22:11:30.424238 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd005429-201e-42f1-af67-f89e02d19b7a" containerName="nova-scheduler-scheduler" Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.424309 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd005429-201e-42f1-af67-f89e02d19b7a" containerName="nova-scheduler-scheduler" Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.424572 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd005429-201e-42f1-af67-f89e02d19b7a" containerName="nova-scheduler-scheduler" Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.425471 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.427780 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.427757802 podStartE2EDuration="2.427757802s" podCreationTimestamp="2025-10-08 22:11:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:11:30.412421174 +0000 UTC m=+1390.237806944" watchObservedRunningTime="2025-10-08 22:11:30.427757802 +0000 UTC m=+1390.253143552" Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.431623 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.490939 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wln8\" (UniqueName: \"kubernetes.io/projected/08b3472b-4c6f-4c1a-83b2-6c176dbbcbf3-kube-api-access-9wln8\") pod \"nova-scheduler-0\" (UID: \"08b3472b-4c6f-4c1a-83b2-6c176dbbcbf3\") " pod="openstack/nova-scheduler-0" Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.491352 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08b3472b-4c6f-4c1a-83b2-6c176dbbcbf3-config-data\") pod \"nova-scheduler-0\" (UID: \"08b3472b-4c6f-4c1a-83b2-6c176dbbcbf3\") " pod="openstack/nova-scheduler-0" Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.491441 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08b3472b-4c6f-4c1a-83b2-6c176dbbcbf3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"08b3472b-4c6f-4c1a-83b2-6c176dbbcbf3\") " pod="openstack/nova-scheduler-0" Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.494843 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.597065 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wln8\" (UniqueName: \"kubernetes.io/projected/08b3472b-4c6f-4c1a-83b2-6c176dbbcbf3-kube-api-access-9wln8\") pod \"nova-scheduler-0\" (UID: \"08b3472b-4c6f-4c1a-83b2-6c176dbbcbf3\") " pod="openstack/nova-scheduler-0" Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.597221 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08b3472b-4c6f-4c1a-83b2-6c176dbbcbf3-config-data\") pod \"nova-scheduler-0\" (UID: \"08b3472b-4c6f-4c1a-83b2-6c176dbbcbf3\") " pod="openstack/nova-scheduler-0" Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.597259 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08b3472b-4c6f-4c1a-83b2-6c176dbbcbf3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"08b3472b-4c6f-4c1a-83b2-6c176dbbcbf3\") " pod="openstack/nova-scheduler-0" Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.604103 4739 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08b3472b-4c6f-4c1a-83b2-6c176dbbcbf3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"08b3472b-4c6f-4c1a-83b2-6c176dbbcbf3\") " pod="openstack/nova-scheduler-0" Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.604130 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08b3472b-4c6f-4c1a-83b2-6c176dbbcbf3-config-data\") pod \"nova-scheduler-0\" (UID: \"08b3472b-4c6f-4c1a-83b2-6c176dbbcbf3\") " pod="openstack/nova-scheduler-0" Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.615481 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wln8\" (UniqueName: \"kubernetes.io/projected/08b3472b-4c6f-4c1a-83b2-6c176dbbcbf3-kube-api-access-9wln8\") pod \"nova-scheduler-0\" (UID: \"08b3472b-4c6f-4c1a-83b2-6c176dbbcbf3\") " pod="openstack/nova-scheduler-0" Oct 08 22:11:30 crc kubenswrapper[4739]: I1008 22:11:30.742642 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 22:11:31 crc kubenswrapper[4739]: I1008 22:11:31.168888 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 22:11:31 crc kubenswrapper[4739]: W1008 22:11:31.175732 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08b3472b_4c6f_4c1a_83b2_6c176dbbcbf3.slice/crio-e1ebec5f4073fc7f884033d656c320a1090c7aaaaf35ead8e975589f6181d79b WatchSource:0}: Error finding container e1ebec5f4073fc7f884033d656c320a1090c7aaaaf35ead8e975589f6181d79b: Status 404 returned error can't find the container with id e1ebec5f4073fc7f884033d656c320a1090c7aaaaf35ead8e975589f6181d79b Oct 08 22:11:31 crc kubenswrapper[4739]: I1008 22:11:31.364923 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"08b3472b-4c6f-4c1a-83b2-6c176dbbcbf3","Type":"ContainerStarted","Data":"e1ebec5f4073fc7f884033d656c320a1090c7aaaaf35ead8e975589f6181d79b"} Oct 08 22:11:31 crc kubenswrapper[4739]: I1008 22:11:31.838486 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd005429-201e-42f1-af67-f89e02d19b7a" path="/var/lib/kubelet/pods/dd005429-201e-42f1-af67-f89e02d19b7a/volumes" Oct 08 22:11:32 crc kubenswrapper[4739]: I1008 22:11:32.391138 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"08b3472b-4c6f-4c1a-83b2-6c176dbbcbf3","Type":"ContainerStarted","Data":"2fc417d0e6f69e49ed3099770d5ee987860fdafcee1273d8f9f6a32a24d2033a"} Oct 08 22:11:32 crc kubenswrapper[4739]: I1008 22:11:32.429647 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.429619506 podStartE2EDuration="2.429619506s" podCreationTimestamp="2025-10-08 22:11:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-08 22:11:32.417205151 +0000 UTC m=+1392.242590921" watchObservedRunningTime="2025-10-08 22:11:32.429619506 +0000 UTC m=+1392.255005296" Oct 08 22:11:34 crc kubenswrapper[4739]: I1008 22:11:34.037015 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 22:11:34 crc kubenswrapper[4739]: I1008 22:11:34.037610 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 22:11:35 crc kubenswrapper[4739]: I1008 22:11:35.742770 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 08 22:11:37 crc kubenswrapper[4739]: I1008 22:11:37.716916 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 22:11:37 crc kubenswrapper[4739]: I1008 22:11:37.717392 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 22:11:38 crc kubenswrapper[4739]: I1008 22:11:38.092668 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qjhd4" Oct 08 22:11:38 crc kubenswrapper[4739]: I1008 22:11:38.181044 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qjhd4" Oct 08 22:11:38 crc kubenswrapper[4739]: I1008 22:11:38.327795 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qjhd4"] Oct 08 22:11:38 crc kubenswrapper[4739]: I1008 22:11:38.731430 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="16fa8c46-856e-465c-bd60-fb13b76e5079" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 08 22:11:38 crc kubenswrapper[4739]: I1008 22:11:38.731422 4739 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-api-0" podUID="16fa8c46-856e-465c-bd60-fb13b76e5079" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 08 22:11:39 crc kubenswrapper[4739]: I1008 22:11:39.037496 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 08 22:11:39 crc kubenswrapper[4739]: I1008 22:11:39.038679 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 08 22:11:39 crc kubenswrapper[4739]: I1008 22:11:39.448592 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 08 22:11:39 crc kubenswrapper[4739]: I1008 22:11:39.484451 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qjhd4" podUID="2117cdda-2200-4b19-8675-ea827df2f6d0" containerName="registry-server" containerID="cri-o://56c4853968af27f6f9eb1f1ae5187b918c48d21aabad36e5ffe80a6eced1764b" gracePeriod=2 Oct 08 22:11:39 crc kubenswrapper[4739]: I1008 22:11:39.948710 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qjhd4" Oct 08 22:11:40 crc kubenswrapper[4739]: I1008 22:11:40.050435 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="99220499-d612-49e9-a7f1-622280a12221" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 22:11:40 crc kubenswrapper[4739]: I1008 22:11:40.050392 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="99220499-d612-49e9-a7f1-622280a12221" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 08 22:11:40 crc kubenswrapper[4739]: I1008 22:11:40.104345 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2117cdda-2200-4b19-8675-ea827df2f6d0-catalog-content\") pod \"2117cdda-2200-4b19-8675-ea827df2f6d0\" (UID: \"2117cdda-2200-4b19-8675-ea827df2f6d0\") " Oct 08 22:11:40 crc kubenswrapper[4739]: I1008 22:11:40.104877 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pslt\" (UniqueName: \"kubernetes.io/projected/2117cdda-2200-4b19-8675-ea827df2f6d0-kube-api-access-6pslt\") pod \"2117cdda-2200-4b19-8675-ea827df2f6d0\" (UID: \"2117cdda-2200-4b19-8675-ea827df2f6d0\") " Oct 08 22:11:40 crc kubenswrapper[4739]: I1008 22:11:40.105304 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2117cdda-2200-4b19-8675-ea827df2f6d0-utilities\") pod \"2117cdda-2200-4b19-8675-ea827df2f6d0\" (UID: \"2117cdda-2200-4b19-8675-ea827df2f6d0\") " Oct 08 22:11:40 crc kubenswrapper[4739]: I1008 22:11:40.106566 4739 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2117cdda-2200-4b19-8675-ea827df2f6d0-utilities" (OuterVolumeSpecName: "utilities") pod "2117cdda-2200-4b19-8675-ea827df2f6d0" (UID: "2117cdda-2200-4b19-8675-ea827df2f6d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:11:40 crc kubenswrapper[4739]: I1008 22:11:40.127695 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2117cdda-2200-4b19-8675-ea827df2f6d0-kube-api-access-6pslt" (OuterVolumeSpecName: "kube-api-access-6pslt") pod "2117cdda-2200-4b19-8675-ea827df2f6d0" (UID: "2117cdda-2200-4b19-8675-ea827df2f6d0"). InnerVolumeSpecName "kube-api-access-6pslt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:11:40 crc kubenswrapper[4739]: I1008 22:11:40.208216 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pslt\" (UniqueName: \"kubernetes.io/projected/2117cdda-2200-4b19-8675-ea827df2f6d0-kube-api-access-6pslt\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:40 crc kubenswrapper[4739]: I1008 22:11:40.208262 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2117cdda-2200-4b19-8675-ea827df2f6d0-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:40 crc kubenswrapper[4739]: I1008 22:11:40.220541 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2117cdda-2200-4b19-8675-ea827df2f6d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2117cdda-2200-4b19-8675-ea827df2f6d0" (UID: "2117cdda-2200-4b19-8675-ea827df2f6d0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:11:40 crc kubenswrapper[4739]: I1008 22:11:40.310162 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2117cdda-2200-4b19-8675-ea827df2f6d0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:11:40 crc kubenswrapper[4739]: I1008 22:11:40.501884 4739 generic.go:334] "Generic (PLEG): container finished" podID="2117cdda-2200-4b19-8675-ea827df2f6d0" containerID="56c4853968af27f6f9eb1f1ae5187b918c48d21aabad36e5ffe80a6eced1764b" exitCode=0 Oct 08 22:11:40 crc kubenswrapper[4739]: I1008 22:11:40.501937 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qjhd4" event={"ID":"2117cdda-2200-4b19-8675-ea827df2f6d0","Type":"ContainerDied","Data":"56c4853968af27f6f9eb1f1ae5187b918c48d21aabad36e5ffe80a6eced1764b"} Oct 08 22:11:40 crc kubenswrapper[4739]: I1008 22:11:40.501989 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qjhd4" Oct 08 22:11:40 crc kubenswrapper[4739]: I1008 22:11:40.502026 4739 scope.go:117] "RemoveContainer" containerID="56c4853968af27f6f9eb1f1ae5187b918c48d21aabad36e5ffe80a6eced1764b" Oct 08 22:11:40 crc kubenswrapper[4739]: I1008 22:11:40.502006 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qjhd4" event={"ID":"2117cdda-2200-4b19-8675-ea827df2f6d0","Type":"ContainerDied","Data":"771aa68566456cbbaea674fe4099eb54631d0dded3fb5d603d8e5a0fad593c29"} Oct 08 22:11:40 crc kubenswrapper[4739]: I1008 22:11:40.542883 4739 scope.go:117] "RemoveContainer" containerID="86d6312cd4f884f35eca199936d09d143bf11b4fe97d2a2bcc14d00a42605156" Oct 08 22:11:40 crc kubenswrapper[4739]: I1008 22:11:40.555133 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qjhd4"] Oct 08 22:11:40 crc kubenswrapper[4739]: I1008 22:11:40.564629 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qjhd4"] Oct 08 22:11:40 crc kubenswrapper[4739]: I1008 22:11:40.578826 4739 scope.go:117] "RemoveContainer" containerID="73a6f7a9962a626504122745685916ff7bc308bbd30b71f0668103efc9c804dc" Oct 08 22:11:40 crc kubenswrapper[4739]: I1008 22:11:40.639104 4739 scope.go:117] "RemoveContainer" containerID="56c4853968af27f6f9eb1f1ae5187b918c48d21aabad36e5ffe80a6eced1764b" Oct 08 22:11:40 crc kubenswrapper[4739]: E1008 22:11:40.640882 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56c4853968af27f6f9eb1f1ae5187b918c48d21aabad36e5ffe80a6eced1764b\": container with ID starting with 56c4853968af27f6f9eb1f1ae5187b918c48d21aabad36e5ffe80a6eced1764b not found: ID does not exist" containerID="56c4853968af27f6f9eb1f1ae5187b918c48d21aabad36e5ffe80a6eced1764b" Oct 08 22:11:40 crc kubenswrapper[4739]: I1008 22:11:40.640927 4739 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56c4853968af27f6f9eb1f1ae5187b918c48d21aabad36e5ffe80a6eced1764b"} err="failed to get container status \"56c4853968af27f6f9eb1f1ae5187b918c48d21aabad36e5ffe80a6eced1764b\": rpc error: code = NotFound desc = could not find container \"56c4853968af27f6f9eb1f1ae5187b918c48d21aabad36e5ffe80a6eced1764b\": container with ID starting with 56c4853968af27f6f9eb1f1ae5187b918c48d21aabad36e5ffe80a6eced1764b not found: ID does not exist" Oct 08 22:11:40 crc kubenswrapper[4739]: I1008 22:11:40.640980 4739 scope.go:117] "RemoveContainer" containerID="86d6312cd4f884f35eca199936d09d143bf11b4fe97d2a2bcc14d00a42605156" Oct 08 22:11:40 crc kubenswrapper[4739]: E1008 22:11:40.641675 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86d6312cd4f884f35eca199936d09d143bf11b4fe97d2a2bcc14d00a42605156\": container with ID starting with 86d6312cd4f884f35eca199936d09d143bf11b4fe97d2a2bcc14d00a42605156 not found: ID does not exist" containerID="86d6312cd4f884f35eca199936d09d143bf11b4fe97d2a2bcc14d00a42605156" Oct 08 22:11:40 crc kubenswrapper[4739]: I1008 22:11:40.641817 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86d6312cd4f884f35eca199936d09d143bf11b4fe97d2a2bcc14d00a42605156"} err="failed to get container status \"86d6312cd4f884f35eca199936d09d143bf11b4fe97d2a2bcc14d00a42605156\": rpc error: code = NotFound desc = could not find container \"86d6312cd4f884f35eca199936d09d143bf11b4fe97d2a2bcc14d00a42605156\": container with ID starting with 86d6312cd4f884f35eca199936d09d143bf11b4fe97d2a2bcc14d00a42605156 not found: ID does not exist" Oct 08 22:11:40 crc kubenswrapper[4739]: I1008 22:11:40.641914 4739 scope.go:117] "RemoveContainer" containerID="73a6f7a9962a626504122745685916ff7bc308bbd30b71f0668103efc9c804dc" Oct 08 22:11:40 crc kubenswrapper[4739]: E1008 
22:11:40.643915 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73a6f7a9962a626504122745685916ff7bc308bbd30b71f0668103efc9c804dc\": container with ID starting with 73a6f7a9962a626504122745685916ff7bc308bbd30b71f0668103efc9c804dc not found: ID does not exist" containerID="73a6f7a9962a626504122745685916ff7bc308bbd30b71f0668103efc9c804dc" Oct 08 22:11:40 crc kubenswrapper[4739]: I1008 22:11:40.644029 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73a6f7a9962a626504122745685916ff7bc308bbd30b71f0668103efc9c804dc"} err="failed to get container status \"73a6f7a9962a626504122745685916ff7bc308bbd30b71f0668103efc9c804dc\": rpc error: code = NotFound desc = could not find container \"73a6f7a9962a626504122745685916ff7bc308bbd30b71f0668103efc9c804dc\": container with ID starting with 73a6f7a9962a626504122745685916ff7bc308bbd30b71f0668103efc9c804dc not found: ID does not exist" Oct 08 22:11:40 crc kubenswrapper[4739]: I1008 22:11:40.743758 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 08 22:11:40 crc kubenswrapper[4739]: I1008 22:11:40.791358 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 08 22:11:41 crc kubenswrapper[4739]: I1008 22:11:41.549782 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 08 22:11:41 crc kubenswrapper[4739]: I1008 22:11:41.841710 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2117cdda-2200-4b19-8675-ea827df2f6d0" path="/var/lib/kubelet/pods/2117cdda-2200-4b19-8675-ea827df2f6d0/volumes" Oct 08 22:11:47 crc kubenswrapper[4739]: I1008 22:11:47.725990 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 08 22:11:47 crc kubenswrapper[4739]: I1008 22:11:47.728505 
4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 08 22:11:47 crc kubenswrapper[4739]: I1008 22:11:47.729204 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 08 22:11:47 crc kubenswrapper[4739]: I1008 22:11:47.739226 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 08 22:11:48 crc kubenswrapper[4739]: I1008 22:11:48.610795 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 08 22:11:48 crc kubenswrapper[4739]: I1008 22:11:48.626237 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 08 22:11:49 crc kubenswrapper[4739]: I1008 22:11:49.042663 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 08 22:11:49 crc kubenswrapper[4739]: I1008 22:11:49.046092 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 08 22:11:49 crc kubenswrapper[4739]: I1008 22:11:49.051435 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 08 22:11:49 crc kubenswrapper[4739]: I1008 22:11:49.629392 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 08 22:11:51 crc kubenswrapper[4739]: I1008 22:11:51.766965 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:11:51 crc kubenswrapper[4739]: I1008 22:11:51.767072 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" 
podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:12:01 crc kubenswrapper[4739]: I1008 22:12:01.984582 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:12:01 crc kubenswrapper[4739]: I1008 22:12:01.986897 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="77a1b291-9a2d-4bec-8a71-8b021a2719ad" containerName="proxy-httpd" containerID="cri-o://c8eb787ce3795dc1347ef5637e8f7e5548c60e8e923599bdd8ece170807c67d2" gracePeriod=30 Oct 08 22:12:01 crc kubenswrapper[4739]: I1008 22:12:01.986948 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="77a1b291-9a2d-4bec-8a71-8b021a2719ad" containerName="sg-core" containerID="cri-o://4f52e5470a8177fdd316732017c286b81282bf8f4cde193f2c6c22070ff6fe0f" gracePeriod=30 Oct 08 22:12:01 crc kubenswrapper[4739]: I1008 22:12:01.987046 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="77a1b291-9a2d-4bec-8a71-8b021a2719ad" containerName="ceilometer-notification-agent" containerID="cri-o://93fc3e95e62cc2eac72d29f14ff97eb505fc2124c9c7212d0c708899723080bf" gracePeriod=30 Oct 08 22:12:01 crc kubenswrapper[4739]: I1008 22:12:01.986894 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="77a1b291-9a2d-4bec-8a71-8b021a2719ad" containerName="ceilometer-central-agent" containerID="cri-o://7ba16b1156d000cc2d3aecd24e58a2c9f13f09bdd67c278774486624e55a32d8" gracePeriod=30 Oct 08 22:12:02 crc kubenswrapper[4739]: I1008 22:12:02.782624 4739 generic.go:334] "Generic (PLEG): container finished" podID="77a1b291-9a2d-4bec-8a71-8b021a2719ad" containerID="c8eb787ce3795dc1347ef5637e8f7e5548c60e8e923599bdd8ece170807c67d2" 
exitCode=0 Oct 08 22:12:02 crc kubenswrapper[4739]: I1008 22:12:02.782897 4739 generic.go:334] "Generic (PLEG): container finished" podID="77a1b291-9a2d-4bec-8a71-8b021a2719ad" containerID="4f52e5470a8177fdd316732017c286b81282bf8f4cde193f2c6c22070ff6fe0f" exitCode=2 Oct 08 22:12:02 crc kubenswrapper[4739]: I1008 22:12:02.782905 4739 generic.go:334] "Generic (PLEG): container finished" podID="77a1b291-9a2d-4bec-8a71-8b021a2719ad" containerID="7ba16b1156d000cc2d3aecd24e58a2c9f13f09bdd67c278774486624e55a32d8" exitCode=0 Oct 08 22:12:02 crc kubenswrapper[4739]: I1008 22:12:02.782697 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77a1b291-9a2d-4bec-8a71-8b021a2719ad","Type":"ContainerDied","Data":"c8eb787ce3795dc1347ef5637e8f7e5548c60e8e923599bdd8ece170807c67d2"} Oct 08 22:12:02 crc kubenswrapper[4739]: I1008 22:12:02.782941 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77a1b291-9a2d-4bec-8a71-8b021a2719ad","Type":"ContainerDied","Data":"4f52e5470a8177fdd316732017c286b81282bf8f4cde193f2c6c22070ff6fe0f"} Oct 08 22:12:02 crc kubenswrapper[4739]: I1008 22:12:02.782953 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77a1b291-9a2d-4bec-8a71-8b021a2719ad","Type":"ContainerDied","Data":"7ba16b1156d000cc2d3aecd24e58a2c9f13f09bdd67c278774486624e55a32d8"} Oct 08 22:12:03 crc kubenswrapper[4739]: I1008 22:12:03.389160 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 22:12:04 crc kubenswrapper[4739]: I1008 22:12:04.192275 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 22:12:04 crc kubenswrapper[4739]: I1008 22:12:04.702620 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:12:04 crc kubenswrapper[4739]: I1008 22:12:04.780880 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/77a1b291-9a2d-4bec-8a71-8b021a2719ad-ceilometer-tls-certs\") pod \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\" (UID: \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " Oct 08 22:12:04 crc kubenswrapper[4739]: I1008 22:12:04.780940 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77a1b291-9a2d-4bec-8a71-8b021a2719ad-log-httpd\") pod \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\" (UID: \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " Oct 08 22:12:04 crc kubenswrapper[4739]: I1008 22:12:04.780959 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77a1b291-9a2d-4bec-8a71-8b021a2719ad-combined-ca-bundle\") pod \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\" (UID: \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " Oct 08 22:12:04 crc kubenswrapper[4739]: I1008 22:12:04.781382 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77a1b291-9a2d-4bec-8a71-8b021a2719ad-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "77a1b291-9a2d-4bec-8a71-8b021a2719ad" (UID: "77a1b291-9a2d-4bec-8a71-8b021a2719ad"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:12:04 crc kubenswrapper[4739]: I1008 22:12:04.781577 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77a1b291-9a2d-4bec-8a71-8b021a2719ad-scripts\") pod \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\" (UID: \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " Oct 08 22:12:04 crc kubenswrapper[4739]: I1008 22:12:04.781723 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77a1b291-9a2d-4bec-8a71-8b021a2719ad-sg-core-conf-yaml\") pod \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\" (UID: \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " Oct 08 22:12:04 crc kubenswrapper[4739]: I1008 22:12:04.781747 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vphtz\" (UniqueName: \"kubernetes.io/projected/77a1b291-9a2d-4bec-8a71-8b021a2719ad-kube-api-access-vphtz\") pod \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\" (UID: \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " Oct 08 22:12:04 crc kubenswrapper[4739]: I1008 22:12:04.781832 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77a1b291-9a2d-4bec-8a71-8b021a2719ad-config-data\") pod \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\" (UID: \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " Oct 08 22:12:04 crc kubenswrapper[4739]: I1008 22:12:04.781903 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77a1b291-9a2d-4bec-8a71-8b021a2719ad-run-httpd\") pod \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\" (UID: \"77a1b291-9a2d-4bec-8a71-8b021a2719ad\") " Oct 08 22:12:04 crc kubenswrapper[4739]: I1008 22:12:04.782342 4739 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/77a1b291-9a2d-4bec-8a71-8b021a2719ad-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:04 crc kubenswrapper[4739]: I1008 22:12:04.782584 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77a1b291-9a2d-4bec-8a71-8b021a2719ad-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "77a1b291-9a2d-4bec-8a71-8b021a2719ad" (UID: "77a1b291-9a2d-4bec-8a71-8b021a2719ad"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:12:04 crc kubenswrapper[4739]: I1008 22:12:04.791261 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77a1b291-9a2d-4bec-8a71-8b021a2719ad-scripts" (OuterVolumeSpecName: "scripts") pod "77a1b291-9a2d-4bec-8a71-8b021a2719ad" (UID: "77a1b291-9a2d-4bec-8a71-8b021a2719ad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:12:04 crc kubenswrapper[4739]: I1008 22:12:04.816980 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77a1b291-9a2d-4bec-8a71-8b021a2719ad-kube-api-access-vphtz" (OuterVolumeSpecName: "kube-api-access-vphtz") pod "77a1b291-9a2d-4bec-8a71-8b021a2719ad" (UID: "77a1b291-9a2d-4bec-8a71-8b021a2719ad"). InnerVolumeSpecName "kube-api-access-vphtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:12:04 crc kubenswrapper[4739]: I1008 22:12:04.875657 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77a1b291-9a2d-4bec-8a71-8b021a2719ad","Type":"ContainerDied","Data":"93fc3e95e62cc2eac72d29f14ff97eb505fc2124c9c7212d0c708899723080bf"} Oct 08 22:12:04 crc kubenswrapper[4739]: I1008 22:12:04.875895 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:12:04 crc kubenswrapper[4739]: I1008 22:12:04.876375 4739 scope.go:117] "RemoveContainer" containerID="c8eb787ce3795dc1347ef5637e8f7e5548c60e8e923599bdd8ece170807c67d2" Oct 08 22:12:04 crc kubenswrapper[4739]: I1008 22:12:04.876198 4739 generic.go:334] "Generic (PLEG): container finished" podID="77a1b291-9a2d-4bec-8a71-8b021a2719ad" containerID="93fc3e95e62cc2eac72d29f14ff97eb505fc2124c9c7212d0c708899723080bf" exitCode=0 Oct 08 22:12:04 crc kubenswrapper[4739]: I1008 22:12:04.876572 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77a1b291-9a2d-4bec-8a71-8b021a2719ad","Type":"ContainerDied","Data":"122a037400d13b224004f90e9e22ca342d89b0b167af224de649fa6fc9bc0e7f"} Oct 08 22:12:04 crc kubenswrapper[4739]: I1008 22:12:04.885812 4739 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77a1b291-9a2d-4bec-8a71-8b021a2719ad-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:04 crc kubenswrapper[4739]: I1008 22:12:04.885839 4739 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77a1b291-9a2d-4bec-8a71-8b021a2719ad-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:04 crc kubenswrapper[4739]: I1008 22:12:04.885850 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vphtz\" (UniqueName: \"kubernetes.io/projected/77a1b291-9a2d-4bec-8a71-8b021a2719ad-kube-api-access-vphtz\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:04 crc kubenswrapper[4739]: I1008 22:12:04.905614 4739 scope.go:117] "RemoveContainer" containerID="4f52e5470a8177fdd316732017c286b81282bf8f4cde193f2c6c22070ff6fe0f" Oct 08 22:12:04 crc kubenswrapper[4739]: I1008 22:12:04.927238 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77a1b291-9a2d-4bec-8a71-8b021a2719ad-sg-core-conf-yaml" (OuterVolumeSpecName: 
"sg-core-conf-yaml") pod "77a1b291-9a2d-4bec-8a71-8b021a2719ad" (UID: "77a1b291-9a2d-4bec-8a71-8b021a2719ad"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:12:04 crc kubenswrapper[4739]: I1008 22:12:04.945652 4739 scope.go:117] "RemoveContainer" containerID="93fc3e95e62cc2eac72d29f14ff97eb505fc2124c9c7212d0c708899723080bf" Oct 08 22:12:04 crc kubenswrapper[4739]: I1008 22:12:04.948258 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77a1b291-9a2d-4bec-8a71-8b021a2719ad-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "77a1b291-9a2d-4bec-8a71-8b021a2719ad" (UID: "77a1b291-9a2d-4bec-8a71-8b021a2719ad"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:12:04 crc kubenswrapper[4739]: I1008 22:12:04.978165 4739 scope.go:117] "RemoveContainer" containerID="7ba16b1156d000cc2d3aecd24e58a2c9f13f09bdd67c278774486624e55a32d8" Oct 08 22:12:04 crc kubenswrapper[4739]: I1008 22:12:04.987496 4739 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/77a1b291-9a2d-4bec-8a71-8b021a2719ad-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:04 crc kubenswrapper[4739]: I1008 22:12:04.987525 4739 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77a1b291-9a2d-4bec-8a71-8b021a2719ad-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:04 crc kubenswrapper[4739]: I1008 22:12:04.992445 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77a1b291-9a2d-4bec-8a71-8b021a2719ad-config-data" (OuterVolumeSpecName: "config-data") pod "77a1b291-9a2d-4bec-8a71-8b021a2719ad" (UID: "77a1b291-9a2d-4bec-8a71-8b021a2719ad"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.007116 4739 scope.go:117] "RemoveContainer" containerID="c8eb787ce3795dc1347ef5637e8f7e5548c60e8e923599bdd8ece170807c67d2" Oct 08 22:12:05 crc kubenswrapper[4739]: E1008 22:12:05.011533 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8eb787ce3795dc1347ef5637e8f7e5548c60e8e923599bdd8ece170807c67d2\": container with ID starting with c8eb787ce3795dc1347ef5637e8f7e5548c60e8e923599bdd8ece170807c67d2 not found: ID does not exist" containerID="c8eb787ce3795dc1347ef5637e8f7e5548c60e8e923599bdd8ece170807c67d2" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.011620 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8eb787ce3795dc1347ef5637e8f7e5548c60e8e923599bdd8ece170807c67d2"} err="failed to get container status \"c8eb787ce3795dc1347ef5637e8f7e5548c60e8e923599bdd8ece170807c67d2\": rpc error: code = NotFound desc = could not find container \"c8eb787ce3795dc1347ef5637e8f7e5548c60e8e923599bdd8ece170807c67d2\": container with ID starting with c8eb787ce3795dc1347ef5637e8f7e5548c60e8e923599bdd8ece170807c67d2 not found: ID does not exist" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.011695 4739 scope.go:117] "RemoveContainer" containerID="4f52e5470a8177fdd316732017c286b81282bf8f4cde193f2c6c22070ff6fe0f" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.012395 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77a1b291-9a2d-4bec-8a71-8b021a2719ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77a1b291-9a2d-4bec-8a71-8b021a2719ad" (UID: "77a1b291-9a2d-4bec-8a71-8b021a2719ad"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:12:05 crc kubenswrapper[4739]: E1008 22:12:05.014354 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f52e5470a8177fdd316732017c286b81282bf8f4cde193f2c6c22070ff6fe0f\": container with ID starting with 4f52e5470a8177fdd316732017c286b81282bf8f4cde193f2c6c22070ff6fe0f not found: ID does not exist" containerID="4f52e5470a8177fdd316732017c286b81282bf8f4cde193f2c6c22070ff6fe0f" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.014393 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f52e5470a8177fdd316732017c286b81282bf8f4cde193f2c6c22070ff6fe0f"} err="failed to get container status \"4f52e5470a8177fdd316732017c286b81282bf8f4cde193f2c6c22070ff6fe0f\": rpc error: code = NotFound desc = could not find container \"4f52e5470a8177fdd316732017c286b81282bf8f4cde193f2c6c22070ff6fe0f\": container with ID starting with 4f52e5470a8177fdd316732017c286b81282bf8f4cde193f2c6c22070ff6fe0f not found: ID does not exist" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.014422 4739 scope.go:117] "RemoveContainer" containerID="93fc3e95e62cc2eac72d29f14ff97eb505fc2124c9c7212d0c708899723080bf" Oct 08 22:12:05 crc kubenswrapper[4739]: E1008 22:12:05.018511 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93fc3e95e62cc2eac72d29f14ff97eb505fc2124c9c7212d0c708899723080bf\": container with ID starting with 93fc3e95e62cc2eac72d29f14ff97eb505fc2124c9c7212d0c708899723080bf not found: ID does not exist" containerID="93fc3e95e62cc2eac72d29f14ff97eb505fc2124c9c7212d0c708899723080bf" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.018548 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93fc3e95e62cc2eac72d29f14ff97eb505fc2124c9c7212d0c708899723080bf"} err="failed 
to get container status \"93fc3e95e62cc2eac72d29f14ff97eb505fc2124c9c7212d0c708899723080bf\": rpc error: code = NotFound desc = could not find container \"93fc3e95e62cc2eac72d29f14ff97eb505fc2124c9c7212d0c708899723080bf\": container with ID starting with 93fc3e95e62cc2eac72d29f14ff97eb505fc2124c9c7212d0c708899723080bf not found: ID does not exist" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.018571 4739 scope.go:117] "RemoveContainer" containerID="7ba16b1156d000cc2d3aecd24e58a2c9f13f09bdd67c278774486624e55a32d8" Oct 08 22:12:05 crc kubenswrapper[4739]: E1008 22:12:05.019734 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ba16b1156d000cc2d3aecd24e58a2c9f13f09bdd67c278774486624e55a32d8\": container with ID starting with 7ba16b1156d000cc2d3aecd24e58a2c9f13f09bdd67c278774486624e55a32d8 not found: ID does not exist" containerID="7ba16b1156d000cc2d3aecd24e58a2c9f13f09bdd67c278774486624e55a32d8" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.019762 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ba16b1156d000cc2d3aecd24e58a2c9f13f09bdd67c278774486624e55a32d8"} err="failed to get container status \"7ba16b1156d000cc2d3aecd24e58a2c9f13f09bdd67c278774486624e55a32d8\": rpc error: code = NotFound desc = could not find container \"7ba16b1156d000cc2d3aecd24e58a2c9f13f09bdd67c278774486624e55a32d8\": container with ID starting with 7ba16b1156d000cc2d3aecd24e58a2c9f13f09bdd67c278774486624e55a32d8 not found: ID does not exist" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.088770 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77a1b291-9a2d-4bec-8a71-8b021a2719ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.088799 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/77a1b291-9a2d-4bec-8a71-8b021a2719ad-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.225801 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.237258 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.275523 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:12:05 crc kubenswrapper[4739]: E1008 22:12:05.275924 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2117cdda-2200-4b19-8675-ea827df2f6d0" containerName="extract-content" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.275938 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="2117cdda-2200-4b19-8675-ea827df2f6d0" containerName="extract-content" Oct 08 22:12:05 crc kubenswrapper[4739]: E1008 22:12:05.275947 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2117cdda-2200-4b19-8675-ea827df2f6d0" containerName="extract-utilities" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.275953 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="2117cdda-2200-4b19-8675-ea827df2f6d0" containerName="extract-utilities" Oct 08 22:12:05 crc kubenswrapper[4739]: E1008 22:12:05.275967 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77a1b291-9a2d-4bec-8a71-8b021a2719ad" containerName="proxy-httpd" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.275973 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="77a1b291-9a2d-4bec-8a71-8b021a2719ad" containerName="proxy-httpd" Oct 08 22:12:05 crc kubenswrapper[4739]: E1008 22:12:05.275980 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2117cdda-2200-4b19-8675-ea827df2f6d0" containerName="registry-server" Oct 08 22:12:05 crc kubenswrapper[4739]: 
I1008 22:12:05.275985 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="2117cdda-2200-4b19-8675-ea827df2f6d0" containerName="registry-server" Oct 08 22:12:05 crc kubenswrapper[4739]: E1008 22:12:05.275993 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77a1b291-9a2d-4bec-8a71-8b021a2719ad" containerName="ceilometer-central-agent" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.275998 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="77a1b291-9a2d-4bec-8a71-8b021a2719ad" containerName="ceilometer-central-agent" Oct 08 22:12:05 crc kubenswrapper[4739]: E1008 22:12:05.276012 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77a1b291-9a2d-4bec-8a71-8b021a2719ad" containerName="sg-core" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.276017 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="77a1b291-9a2d-4bec-8a71-8b021a2719ad" containerName="sg-core" Oct 08 22:12:05 crc kubenswrapper[4739]: E1008 22:12:05.276075 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77a1b291-9a2d-4bec-8a71-8b021a2719ad" containerName="ceilometer-notification-agent" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.276083 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="77a1b291-9a2d-4bec-8a71-8b021a2719ad" containerName="ceilometer-notification-agent" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.276343 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="77a1b291-9a2d-4bec-8a71-8b021a2719ad" containerName="proxy-httpd" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.276363 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="77a1b291-9a2d-4bec-8a71-8b021a2719ad" containerName="ceilometer-notification-agent" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.276394 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="77a1b291-9a2d-4bec-8a71-8b021a2719ad" containerName="ceilometer-central-agent" 
Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.276403 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="77a1b291-9a2d-4bec-8a71-8b021a2719ad" containerName="sg-core" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.276418 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="2117cdda-2200-4b19-8675-ea827df2f6d0" containerName="registry-server" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.278516 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.284576 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.284829 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.285931 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.301955 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.395175 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dd83bb8-a102-4ba9-825a-1cf852094ace-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " pod="openstack/ceilometer-0" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.395236 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dd83bb8-a102-4ba9-825a-1cf852094ace-scripts\") pod \"ceilometer-0\" (UID: \"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " pod="openstack/ceilometer-0" Oct 08 22:12:05 crc 
kubenswrapper[4739]: I1008 22:12:05.395270 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd83bb8-a102-4ba9-825a-1cf852094ace-config-data\") pod \"ceilometer-0\" (UID: \"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " pod="openstack/ceilometer-0" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.395435 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7dd83bb8-a102-4ba9-825a-1cf852094ace-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " pod="openstack/ceilometer-0" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.395488 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dd83bb8-a102-4ba9-825a-1cf852094ace-run-httpd\") pod \"ceilometer-0\" (UID: \"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " pod="openstack/ceilometer-0" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.395549 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd83bb8-a102-4ba9-825a-1cf852094ace-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " pod="openstack/ceilometer-0" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.395571 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rfk7\" (UniqueName: \"kubernetes.io/projected/7dd83bb8-a102-4ba9-825a-1cf852094ace-kube-api-access-9rfk7\") pod \"ceilometer-0\" (UID: \"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " pod="openstack/ceilometer-0" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.395673 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dd83bb8-a102-4ba9-825a-1cf852094ace-log-httpd\") pod \"ceilometer-0\" (UID: \"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " pod="openstack/ceilometer-0" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.497385 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dd83bb8-a102-4ba9-825a-1cf852094ace-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " pod="openstack/ceilometer-0" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.498054 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dd83bb8-a102-4ba9-825a-1cf852094ace-scripts\") pod \"ceilometer-0\" (UID: \"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " pod="openstack/ceilometer-0" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.498094 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd83bb8-a102-4ba9-825a-1cf852094ace-config-data\") pod \"ceilometer-0\" (UID: \"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " pod="openstack/ceilometer-0" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.498172 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7dd83bb8-a102-4ba9-825a-1cf852094ace-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " pod="openstack/ceilometer-0" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.498195 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dd83bb8-a102-4ba9-825a-1cf852094ace-run-httpd\") pod \"ceilometer-0\" (UID: 
\"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " pod="openstack/ceilometer-0" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.498222 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd83bb8-a102-4ba9-825a-1cf852094ace-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " pod="openstack/ceilometer-0" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.498239 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rfk7\" (UniqueName: \"kubernetes.io/projected/7dd83bb8-a102-4ba9-825a-1cf852094ace-kube-api-access-9rfk7\") pod \"ceilometer-0\" (UID: \"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " pod="openstack/ceilometer-0" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.498292 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dd83bb8-a102-4ba9-825a-1cf852094ace-log-httpd\") pod \"ceilometer-0\" (UID: \"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " pod="openstack/ceilometer-0" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.498708 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dd83bb8-a102-4ba9-825a-1cf852094ace-log-httpd\") pod \"ceilometer-0\" (UID: \"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " pod="openstack/ceilometer-0" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.498738 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dd83bb8-a102-4ba9-825a-1cf852094ace-run-httpd\") pod \"ceilometer-0\" (UID: \"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " pod="openstack/ceilometer-0" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.503045 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7dd83bb8-a102-4ba9-825a-1cf852094ace-scripts\") pod \"ceilometer-0\" (UID: \"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " pod="openstack/ceilometer-0" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.503101 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dd83bb8-a102-4ba9-825a-1cf852094ace-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " pod="openstack/ceilometer-0" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.504404 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd83bb8-a102-4ba9-825a-1cf852094ace-config-data\") pod \"ceilometer-0\" (UID: \"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " pod="openstack/ceilometer-0" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.504773 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7dd83bb8-a102-4ba9-825a-1cf852094ace-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " pod="openstack/ceilometer-0" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.510344 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd83bb8-a102-4ba9-825a-1cf852094ace-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " pod="openstack/ceilometer-0" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.528743 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rfk7\" (UniqueName: \"kubernetes.io/projected/7dd83bb8-a102-4ba9-825a-1cf852094ace-kube-api-access-9rfk7\") pod \"ceilometer-0\" (UID: \"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " pod="openstack/ceilometer-0" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 
22:12:05.599250 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:12:05 crc kubenswrapper[4739]: I1008 22:12:05.833940 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77a1b291-9a2d-4bec-8a71-8b021a2719ad" path="/var/lib/kubelet/pods/77a1b291-9a2d-4bec-8a71-8b021a2719ad/volumes" Oct 08 22:12:06 crc kubenswrapper[4739]: I1008 22:12:06.109767 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:12:06 crc kubenswrapper[4739]: I1008 22:12:06.903218 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7dd83bb8-a102-4ba9-825a-1cf852094ace","Type":"ContainerStarted","Data":"ba145cc426698f3cad2b4a6649887828549d10d6db001bb78465b4c1092870fe"} Oct 08 22:12:07 crc kubenswrapper[4739]: I1008 22:12:07.673378 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="2909f95b-c276-43d0-93c0-18a78dbb974f" containerName="rabbitmq" containerID="cri-o://ed3bef55901da71ee502980a09f6088f4a2b5887918d42628f5e09c4667fbdf3" gracePeriod=604796 Oct 08 22:12:08 crc kubenswrapper[4739]: I1008 22:12:08.902323 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="17a6aba1-44fd-4b83-95b2-002a60e2291b" containerName="rabbitmq" containerID="cri-o://01562c609d85e86bb08c30b79d913bf9f75b03afd4c54e6010518901b35c9807" gracePeriod=604796 Oct 08 22:12:09 crc kubenswrapper[4739]: I1008 22:12:09.721835 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="2909f95b-c276-43d0-93c0-18a78dbb974f" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.108:5671: connect: connection refused" Oct 08 22:12:09 crc kubenswrapper[4739]: I1008 22:12:09.785574 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" 
podUID="17a6aba1-44fd-4b83-95b2-002a60e2291b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.109:5671: connect: connection refused" Oct 08 22:12:14 crc kubenswrapper[4739]: I1008 22:12:14.179799 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vsds5"] Oct 08 22:12:14 crc kubenswrapper[4739]: I1008 22:12:14.183487 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vsds5" Oct 08 22:12:14 crc kubenswrapper[4739]: I1008 22:12:14.205243 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vsds5"] Oct 08 22:12:14 crc kubenswrapper[4739]: I1008 22:12:14.334087 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26d6bf28-f715-441b-b4d3-4a3f18d1df19-utilities\") pod \"certified-operators-vsds5\" (UID: \"26d6bf28-f715-441b-b4d3-4a3f18d1df19\") " pod="openshift-marketplace/certified-operators-vsds5" Oct 08 22:12:14 crc kubenswrapper[4739]: I1008 22:12:14.334342 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrz77\" (UniqueName: \"kubernetes.io/projected/26d6bf28-f715-441b-b4d3-4a3f18d1df19-kube-api-access-qrz77\") pod \"certified-operators-vsds5\" (UID: \"26d6bf28-f715-441b-b4d3-4a3f18d1df19\") " pod="openshift-marketplace/certified-operators-vsds5" Oct 08 22:12:14 crc kubenswrapper[4739]: I1008 22:12:14.334582 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26d6bf28-f715-441b-b4d3-4a3f18d1df19-catalog-content\") pod \"certified-operators-vsds5\" (UID: \"26d6bf28-f715-441b-b4d3-4a3f18d1df19\") " pod="openshift-marketplace/certified-operators-vsds5" Oct 08 22:12:14 crc kubenswrapper[4739]: I1008 
22:12:14.437066 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26d6bf28-f715-441b-b4d3-4a3f18d1df19-catalog-content\") pod \"certified-operators-vsds5\" (UID: \"26d6bf28-f715-441b-b4d3-4a3f18d1df19\") " pod="openshift-marketplace/certified-operators-vsds5" Oct 08 22:12:14 crc kubenswrapper[4739]: I1008 22:12:14.437225 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26d6bf28-f715-441b-b4d3-4a3f18d1df19-utilities\") pod \"certified-operators-vsds5\" (UID: \"26d6bf28-f715-441b-b4d3-4a3f18d1df19\") " pod="openshift-marketplace/certified-operators-vsds5" Oct 08 22:12:14 crc kubenswrapper[4739]: I1008 22:12:14.437291 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrz77\" (UniqueName: \"kubernetes.io/projected/26d6bf28-f715-441b-b4d3-4a3f18d1df19-kube-api-access-qrz77\") pod \"certified-operators-vsds5\" (UID: \"26d6bf28-f715-441b-b4d3-4a3f18d1df19\") " pod="openshift-marketplace/certified-operators-vsds5" Oct 08 22:12:14 crc kubenswrapper[4739]: I1008 22:12:14.437910 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26d6bf28-f715-441b-b4d3-4a3f18d1df19-catalog-content\") pod \"certified-operators-vsds5\" (UID: \"26d6bf28-f715-441b-b4d3-4a3f18d1df19\") " pod="openshift-marketplace/certified-operators-vsds5" Oct 08 22:12:14 crc kubenswrapper[4739]: I1008 22:12:14.438216 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26d6bf28-f715-441b-b4d3-4a3f18d1df19-utilities\") pod \"certified-operators-vsds5\" (UID: \"26d6bf28-f715-441b-b4d3-4a3f18d1df19\") " pod="openshift-marketplace/certified-operators-vsds5" Oct 08 22:12:14 crc kubenswrapper[4739]: I1008 22:12:14.457744 4739 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrz77\" (UniqueName: \"kubernetes.io/projected/26d6bf28-f715-441b-b4d3-4a3f18d1df19-kube-api-access-qrz77\") pod \"certified-operators-vsds5\" (UID: \"26d6bf28-f715-441b-b4d3-4a3f18d1df19\") " pod="openshift-marketplace/certified-operators-vsds5" Oct 08 22:12:14 crc kubenswrapper[4739]: I1008 22:12:14.518133 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vsds5" Oct 08 22:12:14 crc kubenswrapper[4739]: I1008 22:12:14.986625 4739 generic.go:334] "Generic (PLEG): container finished" podID="2909f95b-c276-43d0-93c0-18a78dbb974f" containerID="ed3bef55901da71ee502980a09f6088f4a2b5887918d42628f5e09c4667fbdf3" exitCode=0 Oct 08 22:12:14 crc kubenswrapper[4739]: I1008 22:12:14.986699 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2909f95b-c276-43d0-93c0-18a78dbb974f","Type":"ContainerDied","Data":"ed3bef55901da71ee502980a09f6088f4a2b5887918d42628f5e09c4667fbdf3"} Oct 08 22:12:16 crc kubenswrapper[4739]: I1008 22:12:16.000592 4739 generic.go:334] "Generic (PLEG): container finished" podID="17a6aba1-44fd-4b83-95b2-002a60e2291b" containerID="01562c609d85e86bb08c30b79d913bf9f75b03afd4c54e6010518901b35c9807" exitCode=0 Oct 08 22:12:16 crc kubenswrapper[4739]: I1008 22:12:16.000656 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"17a6aba1-44fd-4b83-95b2-002a60e2291b","Type":"ContainerDied","Data":"01562c609d85e86bb08c30b79d913bf9f75b03afd4c54e6010518901b35c9807"} Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.141928 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-fldrn"] Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.144730 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-fldrn" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.146449 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.159906 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-fldrn"] Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.216263 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-fldrn\" (UID: \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\") " pod="openstack/dnsmasq-dns-67b789f86c-fldrn" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.216344 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-fldrn\" (UID: \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\") " pod="openstack/dnsmasq-dns-67b789f86c-fldrn" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.216367 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w2f9\" (UniqueName: \"kubernetes.io/projected/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-kube-api-access-7w2f9\") pod \"dnsmasq-dns-67b789f86c-fldrn\" (UID: \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\") " pod="openstack/dnsmasq-dns-67b789f86c-fldrn" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.216421 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-fldrn\" (UID: 
\"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\") " pod="openstack/dnsmasq-dns-67b789f86c-fldrn" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.216451 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-config\") pod \"dnsmasq-dns-67b789f86c-fldrn\" (UID: \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\") " pod="openstack/dnsmasq-dns-67b789f86c-fldrn" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.216510 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-fldrn\" (UID: \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\") " pod="openstack/dnsmasq-dns-67b789f86c-fldrn" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.216549 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-dns-svc\") pod \"dnsmasq-dns-67b789f86c-fldrn\" (UID: \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\") " pod="openstack/dnsmasq-dns-67b789f86c-fldrn" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.317801 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-config\") pod \"dnsmasq-dns-67b789f86c-fldrn\" (UID: \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\") " pod="openstack/dnsmasq-dns-67b789f86c-fldrn" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.317876 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-fldrn\" (UID: \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\") " 
pod="openstack/dnsmasq-dns-67b789f86c-fldrn" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.317922 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-dns-svc\") pod \"dnsmasq-dns-67b789f86c-fldrn\" (UID: \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\") " pod="openstack/dnsmasq-dns-67b789f86c-fldrn" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.317964 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-fldrn\" (UID: \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\") " pod="openstack/dnsmasq-dns-67b789f86c-fldrn" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.318014 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-fldrn\" (UID: \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\") " pod="openstack/dnsmasq-dns-67b789f86c-fldrn" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.318032 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w2f9\" (UniqueName: \"kubernetes.io/projected/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-kube-api-access-7w2f9\") pod \"dnsmasq-dns-67b789f86c-fldrn\" (UID: \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\") " pod="openstack/dnsmasq-dns-67b789f86c-fldrn" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.318064 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-fldrn\" (UID: \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\") " 
pod="openstack/dnsmasq-dns-67b789f86c-fldrn" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.319332 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-fldrn\" (UID: \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\") " pod="openstack/dnsmasq-dns-67b789f86c-fldrn" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.319332 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-fldrn\" (UID: \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\") " pod="openstack/dnsmasq-dns-67b789f86c-fldrn" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.319433 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-dns-svc\") pod \"dnsmasq-dns-67b789f86c-fldrn\" (UID: \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\") " pod="openstack/dnsmasq-dns-67b789f86c-fldrn" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.319600 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-fldrn\" (UID: \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\") " pod="openstack/dnsmasq-dns-67b789f86c-fldrn" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.319684 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-fldrn\" (UID: \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\") " pod="openstack/dnsmasq-dns-67b789f86c-fldrn" Oct 08 22:12:18 crc kubenswrapper[4739]: 
I1008 22:12:18.320384 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-config\") pod \"dnsmasq-dns-67b789f86c-fldrn\" (UID: \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\") " pod="openstack/dnsmasq-dns-67b789f86c-fldrn" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.358185 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w2f9\" (UniqueName: \"kubernetes.io/projected/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-kube-api-access-7w2f9\") pod \"dnsmasq-dns-67b789f86c-fldrn\" (UID: \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\") " pod="openstack/dnsmasq-dns-67b789f86c-fldrn" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.472711 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-fldrn" Oct 08 22:12:18 crc kubenswrapper[4739]: E1008 22:12:18.779554 4739 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.59:5001/podified-master-centos10/openstack-ceilometer-central:telemetry_latest" Oct 08 22:12:18 crc kubenswrapper[4739]: E1008 22:12:18.779866 4739 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.59:5001/podified-master-centos10/openstack-ceilometer-central:telemetry_latest" Oct 08 22:12:18 crc kubenswrapper[4739]: E1008 22:12:18.779975 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:38.102.83.59:5001/podified-master-centos10/openstack-ceilometer-central:telemetry_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n58bh698h7bh57fh697h576hfch8h5c5hd6h568h7ch558h6dh56h5d6h55fh68h68bh9fh696h75h68fh66fh57fh5c4h55h54ch5dbh68bh644h8bq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9rfk7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(7dd83bb8-a102-4ba9-825a-1cf852094ace): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.842408 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.850824 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.929711 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2909f95b-c276-43d0-93c0-18a78dbb974f-rabbitmq-confd\") pod \"2909f95b-c276-43d0-93c0-18a78dbb974f\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.929776 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"2909f95b-c276-43d0-93c0-18a78dbb974f\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.929812 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/17a6aba1-44fd-4b83-95b2-002a60e2291b-rabbitmq-erlang-cookie\") pod \"17a6aba1-44fd-4b83-95b2-002a60e2291b\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.929836 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/17a6aba1-44fd-4b83-95b2-002a60e2291b-server-conf\") pod \"17a6aba1-44fd-4b83-95b2-002a60e2291b\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.929865 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2909f95b-c276-43d0-93c0-18a78dbb974f-pod-info\") pod \"2909f95b-c276-43d0-93c0-18a78dbb974f\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.929922 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/17a6aba1-44fd-4b83-95b2-002a60e2291b-config-data\") pod \"17a6aba1-44fd-4b83-95b2-002a60e2291b\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.929937 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/17a6aba1-44fd-4b83-95b2-002a60e2291b-pod-info\") pod \"17a6aba1-44fd-4b83-95b2-002a60e2291b\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.929953 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2909f95b-c276-43d0-93c0-18a78dbb974f-plugins-conf\") pod \"2909f95b-c276-43d0-93c0-18a78dbb974f\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.930021 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2909f95b-c276-43d0-93c0-18a78dbb974f-config-data\") pod \"2909f95b-c276-43d0-93c0-18a78dbb974f\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.930063 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/17a6aba1-44fd-4b83-95b2-002a60e2291b-erlang-cookie-secret\") pod \"17a6aba1-44fd-4b83-95b2-002a60e2291b\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.930080 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2909f95b-c276-43d0-93c0-18a78dbb974f-erlang-cookie-secret\") pod \"2909f95b-c276-43d0-93c0-18a78dbb974f\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 
22:12:18.930112 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2909f95b-c276-43d0-93c0-18a78dbb974f-rabbitmq-erlang-cookie\") pod \"2909f95b-c276-43d0-93c0-18a78dbb974f\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.930136 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/17a6aba1-44fd-4b83-95b2-002a60e2291b-plugins-conf\") pod \"17a6aba1-44fd-4b83-95b2-002a60e2291b\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.930181 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bfk9\" (UniqueName: \"kubernetes.io/projected/2909f95b-c276-43d0-93c0-18a78dbb974f-kube-api-access-6bfk9\") pod \"2909f95b-c276-43d0-93c0-18a78dbb974f\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.930200 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2909f95b-c276-43d0-93c0-18a78dbb974f-rabbitmq-plugins\") pod \"2909f95b-c276-43d0-93c0-18a78dbb974f\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.930214 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"17a6aba1-44fd-4b83-95b2-002a60e2291b\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.930228 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2909f95b-c276-43d0-93c0-18a78dbb974f-server-conf\") pod 
\"2909f95b-c276-43d0-93c0-18a78dbb974f\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.930253 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/17a6aba1-44fd-4b83-95b2-002a60e2291b-rabbitmq-tls\") pod \"17a6aba1-44fd-4b83-95b2-002a60e2291b\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.930279 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2909f95b-c276-43d0-93c0-18a78dbb974f-rabbitmq-tls\") pod \"2909f95b-c276-43d0-93c0-18a78dbb974f\" (UID: \"2909f95b-c276-43d0-93c0-18a78dbb974f\") " Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.930308 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/17a6aba1-44fd-4b83-95b2-002a60e2291b-rabbitmq-plugins\") pod \"17a6aba1-44fd-4b83-95b2-002a60e2291b\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.930324 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slb5l\" (UniqueName: \"kubernetes.io/projected/17a6aba1-44fd-4b83-95b2-002a60e2291b-kube-api-access-slb5l\") pod \"17a6aba1-44fd-4b83-95b2-002a60e2291b\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.930346 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/17a6aba1-44fd-4b83-95b2-002a60e2291b-rabbitmq-confd\") pod \"17a6aba1-44fd-4b83-95b2-002a60e2291b\" (UID: \"17a6aba1-44fd-4b83-95b2-002a60e2291b\") " Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.935328 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/configmap/2909f95b-c276-43d0-93c0-18a78dbb974f-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "2909f95b-c276-43d0-93c0-18a78dbb974f" (UID: "2909f95b-c276-43d0-93c0-18a78dbb974f"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.938811 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2909f95b-c276-43d0-93c0-18a78dbb974f-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "2909f95b-c276-43d0-93c0-18a78dbb974f" (UID: "2909f95b-c276-43d0-93c0-18a78dbb974f"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.941744 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17a6aba1-44fd-4b83-95b2-002a60e2291b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "17a6aba1-44fd-4b83-95b2-002a60e2291b" (UID: "17a6aba1-44fd-4b83-95b2-002a60e2291b"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.942594 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "17a6aba1-44fd-4b83-95b2-002a60e2291b" (UID: "17a6aba1-44fd-4b83-95b2-002a60e2291b"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.946329 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2909f95b-c276-43d0-93c0-18a78dbb974f-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "2909f95b-c276-43d0-93c0-18a78dbb974f" (UID: "2909f95b-c276-43d0-93c0-18a78dbb974f"). 
InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.947967 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17a6aba1-44fd-4b83-95b2-002a60e2291b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "17a6aba1-44fd-4b83-95b2-002a60e2291b" (UID: "17a6aba1-44fd-4b83-95b2-002a60e2291b"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.950241 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "2909f95b-c276-43d0-93c0-18a78dbb974f" (UID: "2909f95b-c276-43d0-93c0-18a78dbb974f"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.950296 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/2909f95b-c276-43d0-93c0-18a78dbb974f-pod-info" (OuterVolumeSpecName: "pod-info") pod "2909f95b-c276-43d0-93c0-18a78dbb974f" (UID: "2909f95b-c276-43d0-93c0-18a78dbb974f"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.951778 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17a6aba1-44fd-4b83-95b2-002a60e2291b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "17a6aba1-44fd-4b83-95b2-002a60e2291b" (UID: "17a6aba1-44fd-4b83-95b2-002a60e2291b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.956001 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2909f95b-c276-43d0-93c0-18a78dbb974f-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "2909f95b-c276-43d0-93c0-18a78dbb974f" (UID: "2909f95b-c276-43d0-93c0-18a78dbb974f"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.959955 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17a6aba1-44fd-4b83-95b2-002a60e2291b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "17a6aba1-44fd-4b83-95b2-002a60e2291b" (UID: "17a6aba1-44fd-4b83-95b2-002a60e2291b"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.984361 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2909f95b-c276-43d0-93c0-18a78dbb974f-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "2909f95b-c276-43d0-93c0-18a78dbb974f" (UID: "2909f95b-c276-43d0-93c0-18a78dbb974f"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.987413 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17a6aba1-44fd-4b83-95b2-002a60e2291b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "17a6aba1-44fd-4b83-95b2-002a60e2291b" (UID: "17a6aba1-44fd-4b83-95b2-002a60e2291b"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:12:18 crc kubenswrapper[4739]: I1008 22:12:18.987643 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2909f95b-c276-43d0-93c0-18a78dbb974f-kube-api-access-6bfk9" (OuterVolumeSpecName: "kube-api-access-6bfk9") pod "2909f95b-c276-43d0-93c0-18a78dbb974f" (UID: "2909f95b-c276-43d0-93c0-18a78dbb974f"). InnerVolumeSpecName "kube-api-access-6bfk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.000584 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17a6aba1-44fd-4b83-95b2-002a60e2291b-kube-api-access-slb5l" (OuterVolumeSpecName: "kube-api-access-slb5l") pod "17a6aba1-44fd-4b83-95b2-002a60e2291b" (UID: "17a6aba1-44fd-4b83-95b2-002a60e2291b"). InnerVolumeSpecName "kube-api-access-slb5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.002645 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/17a6aba1-44fd-4b83-95b2-002a60e2291b-pod-info" (OuterVolumeSpecName: "pod-info") pod "17a6aba1-44fd-4b83-95b2-002a60e2291b" (UID: "17a6aba1-44fd-4b83-95b2-002a60e2291b"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.032412 4739 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/17a6aba1-44fd-4b83-95b2-002a60e2291b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.032437 4739 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2909f95b-c276-43d0-93c0-18a78dbb974f-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.032448 4739 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/17a6aba1-44fd-4b83-95b2-002a60e2291b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.032458 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slb5l\" (UniqueName: \"kubernetes.io/projected/17a6aba1-44fd-4b83-95b2-002a60e2291b-kube-api-access-slb5l\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.032480 4739 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.032489 4739 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/17a6aba1-44fd-4b83-95b2-002a60e2291b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.032498 4739 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2909f95b-c276-43d0-93c0-18a78dbb974f-pod-info\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.032506 
4739 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/17a6aba1-44fd-4b83-95b2-002a60e2291b-pod-info\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.032514 4739 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2909f95b-c276-43d0-93c0-18a78dbb974f-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.032523 4739 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/17a6aba1-44fd-4b83-95b2-002a60e2291b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.032532 4739 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2909f95b-c276-43d0-93c0-18a78dbb974f-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.032539 4739 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2909f95b-c276-43d0-93c0-18a78dbb974f-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.032547 4739 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/17a6aba1-44fd-4b83-95b2-002a60e2291b-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.032555 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bfk9\" (UniqueName: \"kubernetes.io/projected/2909f95b-c276-43d0-93c0-18a78dbb974f-kube-api-access-6bfk9\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.032564 4739 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/2909f95b-c276-43d0-93c0-18a78dbb974f-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.032576 4739 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.050837 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.050923 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2909f95b-c276-43d0-93c0-18a78dbb974f","Type":"ContainerDied","Data":"5539d51bb745d52a372bfc1118ae17f031ceb2e6ab59d8d875a373818e393084"} Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.050975 4739 scope.go:117] "RemoveContainer" containerID="ed3bef55901da71ee502980a09f6088f4a2b5887918d42628f5e09c4667fbdf3" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.070059 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"17a6aba1-44fd-4b83-95b2-002a60e2291b","Type":"ContainerDied","Data":"aef8fed2608376526a808c7ad0f04aa2f9be775c3745e15b3ec84e8a42755479"} Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.070156 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.085470 4739 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.088602 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2909f95b-c276-43d0-93c0-18a78dbb974f-config-data" (OuterVolumeSpecName: "config-data") pod "2909f95b-c276-43d0-93c0-18a78dbb974f" (UID: "2909f95b-c276-43d0-93c0-18a78dbb974f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.109113 4739 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.133218 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2909f95b-c276-43d0-93c0-18a78dbb974f-server-conf" (OuterVolumeSpecName: "server-conf") pod "2909f95b-c276-43d0-93c0-18a78dbb974f" (UID: "2909f95b-c276-43d0-93c0-18a78dbb974f"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.134957 4739 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.134982 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2909f95b-c276-43d0-93c0-18a78dbb974f-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.134992 4739 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.135000 4739 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2909f95b-c276-43d0-93c0-18a78dbb974f-server-conf\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.142883 4739 scope.go:117] "RemoveContainer" containerID="0f3d74649a0a14550547d1e1b433c90cc6ac77609fefff35fec33655f62131a6" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.192031 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17a6aba1-44fd-4b83-95b2-002a60e2291b-config-data" (OuterVolumeSpecName: "config-data") pod "17a6aba1-44fd-4b83-95b2-002a60e2291b" (UID: "17a6aba1-44fd-4b83-95b2-002a60e2291b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.198969 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17a6aba1-44fd-4b83-95b2-002a60e2291b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "17a6aba1-44fd-4b83-95b2-002a60e2291b" (UID: "17a6aba1-44fd-4b83-95b2-002a60e2291b"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.205539 4739 scope.go:117] "RemoveContainer" containerID="01562c609d85e86bb08c30b79d913bf9f75b03afd4c54e6010518901b35c9807" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.237137 4739 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/17a6aba1-44fd-4b83-95b2-002a60e2291b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.237201 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17a6aba1-44fd-4b83-95b2-002a60e2291b-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.249320 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17a6aba1-44fd-4b83-95b2-002a60e2291b-server-conf" (OuterVolumeSpecName: "server-conf") pod "17a6aba1-44fd-4b83-95b2-002a60e2291b" (UID: "17a6aba1-44fd-4b83-95b2-002a60e2291b"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.250055 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2909f95b-c276-43d0-93c0-18a78dbb974f-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "2909f95b-c276-43d0-93c0-18a78dbb974f" (UID: "2909f95b-c276-43d0-93c0-18a78dbb974f"). 
InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.259511 4739 scope.go:117] "RemoveContainer" containerID="8461e62c94cfe5fa13ef8ece05f1f0dd7b1bea2f3dd5a46cf4358cc585e53964" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.302974 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-fldrn"] Oct 08 22:12:19 crc kubenswrapper[4739]: W1008 22:12:19.331579 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad2fa1fa_687d_4181_b11b_e86adff4ec9f.slice/crio-d796e79a8fb347b0f532e1cc3896e266385c24cd701307aab74146c7fee8c4d8 WatchSource:0}: Error finding container d796e79a8fb347b0f532e1cc3896e266385c24cd701307aab74146c7fee8c4d8: Status 404 returned error can't find the container with id d796e79a8fb347b0f532e1cc3896e266385c24cd701307aab74146c7fee8c4d8 Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.332323 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vsds5"] Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.338738 4739 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2909f95b-c276-43d0-93c0-18a78dbb974f-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.338762 4739 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/17a6aba1-44fd-4b83-95b2-002a60e2291b-server-conf\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.612927 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.628917 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] 
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.640518 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 22:12:19 crc kubenswrapper[4739]: E1008 22:12:19.641106 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17a6aba1-44fd-4b83-95b2-002a60e2291b" containerName="rabbitmq" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.641189 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="17a6aba1-44fd-4b83-95b2-002a60e2291b" containerName="rabbitmq" Oct 08 22:12:19 crc kubenswrapper[4739]: E1008 22:12:19.641253 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2909f95b-c276-43d0-93c0-18a78dbb974f" containerName="rabbitmq" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.641306 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="2909f95b-c276-43d0-93c0-18a78dbb974f" containerName="rabbitmq" Oct 08 22:12:19 crc kubenswrapper[4739]: E1008 22:12:19.641376 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17a6aba1-44fd-4b83-95b2-002a60e2291b" containerName="setup-container" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.641426 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="17a6aba1-44fd-4b83-95b2-002a60e2291b" containerName="setup-container" Oct 08 22:12:19 crc kubenswrapper[4739]: E1008 22:12:19.641487 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2909f95b-c276-43d0-93c0-18a78dbb974f" containerName="setup-container" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.641534 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="2909f95b-c276-43d0-93c0-18a78dbb974f" containerName="setup-container" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.641761 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="17a6aba1-44fd-4b83-95b2-002a60e2291b" containerName="rabbitmq" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.641833 4739 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2909f95b-c276-43d0-93c0-18a78dbb974f" containerName="rabbitmq" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.642954 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.650699 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.663569 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.663651 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-97fhr" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.664371 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.664545 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.663688 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.663732 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.663755 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.679493 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.703398 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 
22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.715892 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.718454 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.724816 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.725157 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.725278 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.725298 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.725435 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-9w9vm" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.725523 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.725528 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.728466 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.833949 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17a6aba1-44fd-4b83-95b2-002a60e2291b" path="/var/lib/kubelet/pods/17a6aba1-44fd-4b83-95b2-002a60e2291b/volumes" Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.836332 4739 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="2909f95b-c276-43d0-93c0-18a78dbb974f" path="/var/lib/kubelet/pods/2909f95b-c276-43d0-93c0-18a78dbb974f/volumes"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.859306 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.859343 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.859364 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.859487 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.859591 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/744c6598-a814-45c7-bf47-5fe0b5b48c5e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.859614 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.859635 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb-config-data\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.859651 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.859666 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/744c6598-a814-45c7-bf47-5fe0b5b48c5e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.859679 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.859700 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/744c6598-a814-45c7-bf47-5fe0b5b48c5e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.859718 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/744c6598-a814-45c7-bf47-5fe0b5b48c5e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.859736 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/744c6598-a814-45c7-bf47-5fe0b5b48c5e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.859779 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.859812 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.859857 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/744c6598-a814-45c7-bf47-5fe0b5b48c5e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.859885 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh8hk\" (UniqueName: \"kubernetes.io/projected/f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb-kube-api-access-vh8hk\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.859916 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/744c6598-a814-45c7-bf47-5fe0b5b48c5e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.859942 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.859958 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2bds\" (UniqueName: \"kubernetes.io/projected/744c6598-a814-45c7-bf47-5fe0b5b48c5e-kube-api-access-n2bds\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.859983 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/744c6598-a814-45c7-bf47-5fe0b5b48c5e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.860006 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/744c6598-a814-45c7-bf47-5fe0b5b48c5e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.962051 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.962399 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2bds\" (UniqueName: \"kubernetes.io/projected/744c6598-a814-45c7-bf47-5fe0b5b48c5e-kube-api-access-n2bds\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.962596 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/744c6598-a814-45c7-bf47-5fe0b5b48c5e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.962767 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/744c6598-a814-45c7-bf47-5fe0b5b48c5e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.962910 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.963059 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.963189 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/744c6598-a814-45c7-bf47-5fe0b5b48c5e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.963343 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.963484 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.963648 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/744c6598-a814-45c7-bf47-5fe0b5b48c5e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.963771 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.963894 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb-config-data\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.964014 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.964129 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/744c6598-a814-45c7-bf47-5fe0b5b48c5e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.964360 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.964505 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/744c6598-a814-45c7-bf47-5fe0b5b48c5e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.964647 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/744c6598-a814-45c7-bf47-5fe0b5b48c5e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.964799 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/744c6598-a814-45c7-bf47-5fe0b5b48c5e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.964965 4739 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.964441 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.964812 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb-config-data\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.963559 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.964683 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.963916 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/744c6598-a814-45c7-bf47-5fe0b5b48c5e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.966815 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.966939 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.966988 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/744c6598-a814-45c7-bf47-5fe0b5b48c5e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.967014 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh8hk\" (UniqueName: \"kubernetes.io/projected/f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb-kube-api-access-vh8hk\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.967077 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.967086 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/744c6598-a814-45c7-bf47-5fe0b5b48c5e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.967634 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.967648 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/744c6598-a814-45c7-bf47-5fe0b5b48c5e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.967715 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/744c6598-a814-45c7-bf47-5fe0b5b48c5e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.967822 4739 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.968050 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.968284 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/744c6598-a814-45c7-bf47-5fe0b5b48c5e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.974031 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/744c6598-a814-45c7-bf47-5fe0b5b48c5e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.981574 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/744c6598-a814-45c7-bf47-5fe0b5b48c5e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.981803 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/744c6598-a814-45c7-bf47-5fe0b5b48c5e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.982952 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/744c6598-a814-45c7-bf47-5fe0b5b48c5e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.985303 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.985565 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.995056 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh8hk\" (UniqueName: \"kubernetes.io/projected/f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb-kube-api-access-vh8hk\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:19 crc kubenswrapper[4739]: I1008 22:12:19.997327 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2bds\" (UniqueName: \"kubernetes.io/projected/744c6598-a814-45c7-bf47-5fe0b5b48c5e-kube-api-access-n2bds\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:20 crc kubenswrapper[4739]: I1008 22:12:20.021450 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"744c6598-a814-45c7-bf47-5fe0b5b48c5e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:20 crc kubenswrapper[4739]: I1008 22:12:20.029539 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:12:20 crc kubenswrapper[4739]: I1008 22:12:20.046793 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 08 22:12:20 crc kubenswrapper[4739]: I1008 22:12:20.089386 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-fldrn" event={"ID":"ad2fa1fa-687d-4181-b11b-e86adff4ec9f","Type":"ContainerStarted","Data":"d796e79a8fb347b0f532e1cc3896e266385c24cd701307aab74146c7fee8c4d8"}
Oct 08 22:12:20 crc kubenswrapper[4739]: I1008 22:12:20.092664 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsds5" event={"ID":"26d6bf28-f715-441b-b4d3-4a3f18d1df19","Type":"ContainerStarted","Data":"c20099748f37c3c37afb1f37f800c143ea9ff51f5562800b8fecc85931b54ab9"}
Oct 08 22:12:20 crc kubenswrapper[4739]: I1008 22:12:20.309525 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:12:20 crc kubenswrapper[4739]: I1008 22:12:20.683811 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 08 22:12:20 crc kubenswrapper[4739]: I1008 22:12:20.824456 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 08 22:12:20 crc kubenswrapper[4739]: W1008 22:12:20.826883 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod744c6598_a814_45c7_bf47_5fe0b5b48c5e.slice/crio-41f904b9b29a99150514d467987e411e030d3818e8f6ef17b26330bb60811aff WatchSource:0}: Error finding container 41f904b9b29a99150514d467987e411e030d3818e8f6ef17b26330bb60811aff: Status 404 returned error can't find the container with id 41f904b9b29a99150514d467987e411e030d3818e8f6ef17b26330bb60811aff
Oct 08 22:12:21 crc kubenswrapper[4739]: I1008 22:12:21.108250 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb","Type":"ContainerStarted","Data":"0d34ef6aae20b10f2f5ea086a264a06ae77ab264e6b07b5e417e724648f2bd0d"}
Oct 08 22:12:21 crc kubenswrapper[4739]: I1008 22:12:21.109856 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"744c6598-a814-45c7-bf47-5fe0b5b48c5e","Type":"ContainerStarted","Data":"41f904b9b29a99150514d467987e411e030d3818e8f6ef17b26330bb60811aff"}
Oct 08 22:12:21 crc kubenswrapper[4739]: I1008 22:12:21.768640 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 08 22:12:21 crc kubenswrapper[4739]: I1008 22:12:21.768700 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 08 22:12:22 crc kubenswrapper[4739]: I1008 22:12:22.126336 4739 generic.go:334] "Generic (PLEG): container finished" podID="26d6bf28-f715-441b-b4d3-4a3f18d1df19" containerID="dbb5f97cc393b825392d34b9e3b86970dbc463a81be5bda89317cd036eb001b6" exitCode=0
Oct 08 22:12:22 crc kubenswrapper[4739]: I1008 22:12:22.126405 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsds5" event={"ID":"26d6bf28-f715-441b-b4d3-4a3f18d1df19","Type":"ContainerDied","Data":"dbb5f97cc393b825392d34b9e3b86970dbc463a81be5bda89317cd036eb001b6"}
Oct 08 22:12:22 crc kubenswrapper[4739]: I1008 22:12:22.128625 4739 generic.go:334] "Generic (PLEG): container finished" podID="ad2fa1fa-687d-4181-b11b-e86adff4ec9f" containerID="8980c3e8a2cd74695cbdfc04898b1fa12e1f5c07a3d8092778c460f935844e87" exitCode=0
Oct 08 22:12:22 crc kubenswrapper[4739]: I1008 22:12:22.128652 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-fldrn" event={"ID":"ad2fa1fa-687d-4181-b11b-e86adff4ec9f","Type":"ContainerDied","Data":"8980c3e8a2cd74695cbdfc04898b1fa12e1f5c07a3d8092778c460f935844e87"}
Oct 08 22:12:23 crc kubenswrapper[4739]: I1008 22:12:23.155355 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"744c6598-a814-45c7-bf47-5fe0b5b48c5e","Type":"ContainerStarted","Data":"096e89fa2568a25b5962085c6eca60b5fa3b8464d61ef664f159dac0ea03138e"}
Oct 08 22:12:23 crc kubenswrapper[4739]: I1008 22:12:23.161609 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb","Type":"ContainerStarted","Data":"046098d5f02c3aef1b4444ec9cb0ccde77f2877db1bb62b13b28146c079cde34"}
Oct 08 22:12:24 crc kubenswrapper[4739]: I1008 22:12:24.181793 4739 generic.go:334] "Generic (PLEG): container finished" podID="26d6bf28-f715-441b-b4d3-4a3f18d1df19" containerID="0735add0789b8b5abf983e574aa1143ff1161cedef28ebb73fb0e34daf52e80f" exitCode=0
Oct 08 22:12:24 crc kubenswrapper[4739]: I1008 22:12:24.182949 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsds5" event={"ID":"26d6bf28-f715-441b-b4d3-4a3f18d1df19","Type":"ContainerDied","Data":"0735add0789b8b5abf983e574aa1143ff1161cedef28ebb73fb0e34daf52e80f"}
Oct 08 22:12:24 crc kubenswrapper[4739]: I1008 22:12:24.186766 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-fldrn" event={"ID":"ad2fa1fa-687d-4181-b11b-e86adff4ec9f","Type":"ContainerStarted","Data":"e32933f81a85da4b8881d2b774fd3d97ae418556a623fd91f5fb44962714f775"}
Oct 08 22:12:24 crc kubenswrapper[4739]: I1008 22:12:24.186954 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67b789f86c-fldrn"
Oct 08 22:12:24 crc kubenswrapper[4739]: I1008 22:12:24.196579 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7dd83bb8-a102-4ba9-825a-1cf852094ace","Type":"ContainerStarted","Data":"759c582505d64a41c0e926dcd73ddd37b0e8db7fd0145cf7f29ffa3d586860b3"}
Oct 08 22:12:24 crc kubenswrapper[4739]: I1008 22:12:24.196817 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7dd83bb8-a102-4ba9-825a-1cf852094ace","Type":"ContainerStarted","Data":"c43078c9f8b852a81b9495f9f2fee7a1febe28b674d67b78c8cb3190688e4a39"}
Oct 08 22:12:24 crc kubenswrapper[4739]: I1008 22:12:24.245651 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67b789f86c-fldrn" podStartSLOduration=6.245628451 podStartE2EDuration="6.245628451s" podCreationTimestamp="2025-10-08 22:12:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:12:24.238413113 +0000 UTC m=+1444.063798873" watchObservedRunningTime="2025-10-08 22:12:24.245628451 +0000 UTC m=+1444.071014211"
Oct 08 22:12:25 crc kubenswrapper[4739]: I1008 22:12:25.208258 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsds5" event={"ID":"26d6bf28-f715-441b-b4d3-4a3f18d1df19","Type":"ContainerStarted","Data":"e1623e77b37a841a5a5ba4a0943066564e0437afb3b8dcd4d7f90e71a04106e0"}
Oct 08 22:12:25 crc kubenswrapper[4739]: I1008 22:12:25.241705 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vsds5" podStartSLOduration=8.696045373 podStartE2EDuration="11.241687621s" podCreationTimestamp="2025-10-08 22:12:14 +0000 UTC" firstStartedPulling="2025-10-08 22:12:22.13000219 +0000 UTC m=+1441.955387950" lastFinishedPulling="2025-10-08 22:12:24.675644418 +0000 UTC m=+1444.501030198" observedRunningTime="2025-10-08 22:12:25.23754641 +0000 UTC m=+1445.062932170" watchObservedRunningTime="2025-10-08 22:12:25.241687621 +0000 UTC m=+1445.067073381"
Oct 08 22:12:26 crc kubenswrapper[4739]: E1008 22:12:26.015120 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="7dd83bb8-a102-4ba9-825a-1cf852094ace"
Oct 08 22:12:26 crc kubenswrapper[4739]: I1008 22:12:26.221950 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7dd83bb8-a102-4ba9-825a-1cf852094ace","Type":"ContainerStarted","Data":"2d90a134032df8e6340556ed06241c73413b7ee1bb96aa326f5e479870dd118b"}
Oct 08 22:12:26 crc kubenswrapper[4739]: I1008 22:12:26.222126 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 08 22:12:26 crc kubenswrapper[4739]: E1008 22:12:26.223538 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.59:5001/podified-master-centos10/openstack-ceilometer-central:telemetry_latest\\\"\"" pod="openstack/ceilometer-0" podUID="7dd83bb8-a102-4ba9-825a-1cf852094ace"
Oct 08 22:12:27 crc kubenswrapper[4739]: E1008 22:12:27.237827 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.59:5001/podified-master-centos10/openstack-ceilometer-central:telemetry_latest\\\"\"" pod="openstack/ceilometer-0" podUID="7dd83bb8-a102-4ba9-825a-1cf852094ace"
Oct 08 22:12:28 crc kubenswrapper[4739]: I1008 22:12:28.475356 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67b789f86c-fldrn"
Oct 08 22:12:28 crc kubenswrapper[4739]: I1008 22:12:28.538080 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-bnpvs"]
Oct 08 22:12:28 crc kubenswrapper[4739]: I1008 22:12:28.538484 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" podUID="f35b9a8c-8e6f-4170-b1d9-20c4628d478b" containerName="dnsmasq-dns" containerID="cri-o://04e71fff73dcd0bef56de7306621d231ecc8010bc1bd2453b4ce6a2abe0db869" gracePeriod=10
Oct 08 22:12:28 crc kubenswrapper[4739]: I1008 22:12:28.736586 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-nnbsx"]
Oct 08 22:12:28 crc kubenswrapper[4739]: I1008 22:12:28.738525 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-nnbsx"
Oct 08 22:12:28 crc kubenswrapper[4739]: I1008 22:12:28.757552 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-nnbsx"]
Oct 08 22:12:28 crc kubenswrapper[4739]: I1008 22:12:28.812644 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b7adc6ab-b111-4d2a-a0f3-a1b50e53df52-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-nnbsx\" (UID: \"b7adc6ab-b111-4d2a-a0f3-a1b50e53df52\") " pod="openstack/dnsmasq-dns-cb6ffcf87-nnbsx"
Oct 08 22:12:28 crc kubenswrapper[4739]: I1008 22:12:28.812720 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7adc6ab-b111-4d2a-a0f3-a1b50e53df52-config\") pod \"dnsmasq-dns-cb6ffcf87-nnbsx\" (UID: \"b7adc6ab-b111-4d2a-a0f3-a1b50e53df52\") " pod="openstack/dnsmasq-dns-cb6ffcf87-nnbsx"
Oct 08 22:12:28 crc kubenswrapper[4739]: I1008 22:12:28.812747 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzjch\" (UniqueName: \"kubernetes.io/projected/b7adc6ab-b111-4d2a-a0f3-a1b50e53df52-kube-api-access-kzjch\") pod \"dnsmasq-dns-cb6ffcf87-nnbsx\" (UID: \"b7adc6ab-b111-4d2a-a0f3-a1b50e53df52\") " pod="openstack/dnsmasq-dns-cb6ffcf87-nnbsx"
Oct 08 22:12:28 crc kubenswrapper[4739]: I1008 22:12:28.812809 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7adc6ab-b111-4d2a-a0f3-a1b50e53df52-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-nnbsx\" (UID: \"b7adc6ab-b111-4d2a-a0f3-a1b50e53df52\") " pod="openstack/dnsmasq-dns-cb6ffcf87-nnbsx"
Oct 08 22:12:28 crc kubenswrapper[4739]: I1008 22:12:28.812845 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7adc6ab-b111-4d2a-a0f3-a1b50e53df52-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-nnbsx\" (UID: \"b7adc6ab-b111-4d2a-a0f3-a1b50e53df52\") " pod="openstack/dnsmasq-dns-cb6ffcf87-nnbsx"
Oct 08 22:12:28 crc kubenswrapper[4739]: I1008 22:12:28.812892 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b7adc6ab-b111-4d2a-a0f3-a1b50e53df52-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-nnbsx\" (UID: \"b7adc6ab-b111-4d2a-a0f3-a1b50e53df52\") " pod="openstack/dnsmasq-dns-cb6ffcf87-nnbsx"
Oct 08 22:12:28 crc kubenswrapper[4739]: I1008 22:12:28.812925 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7adc6ab-b111-4d2a-a0f3-a1b50e53df52-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-nnbsx\" (UID: \"b7adc6ab-b111-4d2a-a0f3-a1b50e53df52\") " pod="openstack/dnsmasq-dns-cb6ffcf87-nnbsx"
Oct 08 22:12:28 crc kubenswrapper[4739]: I1008 22:12:28.914619 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b7adc6ab-b111-4d2a-a0f3-a1b50e53df52-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-nnbsx\" (UID: \"b7adc6ab-b111-4d2a-a0f3-a1b50e53df52\") " pod="openstack/dnsmasq-dns-cb6ffcf87-nnbsx"
Oct 08 22:12:28 crc kubenswrapper[4739]: I1008 22:12:28.914692 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7adc6ab-b111-4d2a-a0f3-a1b50e53df52-config\") pod \"dnsmasq-dns-cb6ffcf87-nnbsx\" (UID: \"b7adc6ab-b111-4d2a-a0f3-a1b50e53df52\") " pod="openstack/dnsmasq-dns-cb6ffcf87-nnbsx"
Oct 08 22:12:28 crc kubenswrapper[4739]: I1008 22:12:28.914719 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzjch\" (UniqueName: \"kubernetes.io/projected/b7adc6ab-b111-4d2a-a0f3-a1b50e53df52-kube-api-access-kzjch\") pod \"dnsmasq-dns-cb6ffcf87-nnbsx\" (UID: \"b7adc6ab-b111-4d2a-a0f3-a1b50e53df52\") " pod="openstack/dnsmasq-dns-cb6ffcf87-nnbsx"
Oct 08 22:12:28 crc kubenswrapper[4739]: I1008 22:12:28.914787 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7adc6ab-b111-4d2a-a0f3-a1b50e53df52-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-nnbsx\" (UID: \"b7adc6ab-b111-4d2a-a0f3-a1b50e53df52\") " pod="openstack/dnsmasq-dns-cb6ffcf87-nnbsx"
Oct 08 22:12:28 crc kubenswrapper[4739]: I1008 22:12:28.914810 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7adc6ab-b111-4d2a-a0f3-a1b50e53df52-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-nnbsx\" (UID: \"b7adc6ab-b111-4d2a-a0f3-a1b50e53df52\") " pod="openstack/dnsmasq-dns-cb6ffcf87-nnbsx"
Oct 08 22:12:28 crc kubenswrapper[4739]: I1008 22:12:28.914854 4739 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b7adc6ab-b111-4d2a-a0f3-a1b50e53df52-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-nnbsx\" (UID: \"b7adc6ab-b111-4d2a-a0f3-a1b50e53df52\") " pod="openstack/dnsmasq-dns-cb6ffcf87-nnbsx" Oct 08 22:12:28 crc kubenswrapper[4739]: I1008 22:12:28.914879 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7adc6ab-b111-4d2a-a0f3-a1b50e53df52-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-nnbsx\" (UID: \"b7adc6ab-b111-4d2a-a0f3-a1b50e53df52\") " pod="openstack/dnsmasq-dns-cb6ffcf87-nnbsx" Oct 08 22:12:28 crc kubenswrapper[4739]: I1008 22:12:28.916311 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b7adc6ab-b111-4d2a-a0f3-a1b50e53df52-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-nnbsx\" (UID: \"b7adc6ab-b111-4d2a-a0f3-a1b50e53df52\") " pod="openstack/dnsmasq-dns-cb6ffcf87-nnbsx" Oct 08 22:12:28 crc kubenswrapper[4739]: I1008 22:12:28.916805 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7adc6ab-b111-4d2a-a0f3-a1b50e53df52-config\") pod \"dnsmasq-dns-cb6ffcf87-nnbsx\" (UID: \"b7adc6ab-b111-4d2a-a0f3-a1b50e53df52\") " pod="openstack/dnsmasq-dns-cb6ffcf87-nnbsx" Oct 08 22:12:28 crc kubenswrapper[4739]: I1008 22:12:28.917822 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7adc6ab-b111-4d2a-a0f3-a1b50e53df52-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-nnbsx\" (UID: \"b7adc6ab-b111-4d2a-a0f3-a1b50e53df52\") " pod="openstack/dnsmasq-dns-cb6ffcf87-nnbsx" Oct 08 22:12:28 crc kubenswrapper[4739]: I1008 22:12:28.918759 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/b7adc6ab-b111-4d2a-a0f3-a1b50e53df52-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-nnbsx\" (UID: \"b7adc6ab-b111-4d2a-a0f3-a1b50e53df52\") " pod="openstack/dnsmasq-dns-cb6ffcf87-nnbsx" Oct 08 22:12:28 crc kubenswrapper[4739]: I1008 22:12:28.919062 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7adc6ab-b111-4d2a-a0f3-a1b50e53df52-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-nnbsx\" (UID: \"b7adc6ab-b111-4d2a-a0f3-a1b50e53df52\") " pod="openstack/dnsmasq-dns-cb6ffcf87-nnbsx" Oct 08 22:12:28 crc kubenswrapper[4739]: I1008 22:12:28.919902 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7adc6ab-b111-4d2a-a0f3-a1b50e53df52-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-nnbsx\" (UID: \"b7adc6ab-b111-4d2a-a0f3-a1b50e53df52\") " pod="openstack/dnsmasq-dns-cb6ffcf87-nnbsx" Oct 08 22:12:28 crc kubenswrapper[4739]: I1008 22:12:28.935265 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzjch\" (UniqueName: \"kubernetes.io/projected/b7adc6ab-b111-4d2a-a0f3-a1b50e53df52-kube-api-access-kzjch\") pod \"dnsmasq-dns-cb6ffcf87-nnbsx\" (UID: \"b7adc6ab-b111-4d2a-a0f3-a1b50e53df52\") " pod="openstack/dnsmasq-dns-cb6ffcf87-nnbsx" Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.035792 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.067356 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-nnbsx" Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.118093 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-dns-svc\") pod \"f35b9a8c-8e6f-4170-b1d9-20c4628d478b\" (UID: \"f35b9a8c-8e6f-4170-b1d9-20c4628d478b\") " Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.118196 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-config\") pod \"f35b9a8c-8e6f-4170-b1d9-20c4628d478b\" (UID: \"f35b9a8c-8e6f-4170-b1d9-20c4628d478b\") " Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.118226 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-dns-swift-storage-0\") pod \"f35b9a8c-8e6f-4170-b1d9-20c4628d478b\" (UID: \"f35b9a8c-8e6f-4170-b1d9-20c4628d478b\") " Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.118265 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-ovsdbserver-nb\") pod \"f35b9a8c-8e6f-4170-b1d9-20c4628d478b\" (UID: \"f35b9a8c-8e6f-4170-b1d9-20c4628d478b\") " Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.118308 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9b8p\" (UniqueName: \"kubernetes.io/projected/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-kube-api-access-p9b8p\") pod \"f35b9a8c-8e6f-4170-b1d9-20c4628d478b\" (UID: \"f35b9a8c-8e6f-4170-b1d9-20c4628d478b\") " Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.118392 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-ovsdbserver-sb\") pod \"f35b9a8c-8e6f-4170-b1d9-20c4628d478b\" (UID: \"f35b9a8c-8e6f-4170-b1d9-20c4628d478b\") " Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.123400 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-kube-api-access-p9b8p" (OuterVolumeSpecName: "kube-api-access-p9b8p") pod "f35b9a8c-8e6f-4170-b1d9-20c4628d478b" (UID: "f35b9a8c-8e6f-4170-b1d9-20c4628d478b"). InnerVolumeSpecName "kube-api-access-p9b8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.181712 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-config" (OuterVolumeSpecName: "config") pod "f35b9a8c-8e6f-4170-b1d9-20c4628d478b" (UID: "f35b9a8c-8e6f-4170-b1d9-20c4628d478b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.199694 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f35b9a8c-8e6f-4170-b1d9-20c4628d478b" (UID: "f35b9a8c-8e6f-4170-b1d9-20c4628d478b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.205897 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f35b9a8c-8e6f-4170-b1d9-20c4628d478b" (UID: "f35b9a8c-8e6f-4170-b1d9-20c4628d478b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.207645 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f35b9a8c-8e6f-4170-b1d9-20c4628d478b" (UID: "f35b9a8c-8e6f-4170-b1d9-20c4628d478b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.225102 4739 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.225127 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.225138 4739 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.225162 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9b8p\" (UniqueName: \"kubernetes.io/projected/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-kube-api-access-p9b8p\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.225170 4739 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.232181 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f35b9a8c-8e6f-4170-b1d9-20c4628d478b" (UID: "f35b9a8c-8e6f-4170-b1d9-20c4628d478b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.254411 4739 generic.go:334] "Generic (PLEG): container finished" podID="f35b9a8c-8e6f-4170-b1d9-20c4628d478b" containerID="04e71fff73dcd0bef56de7306621d231ecc8010bc1bd2453b4ce6a2abe0db869" exitCode=0 Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.254458 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" event={"ID":"f35b9a8c-8e6f-4170-b1d9-20c4628d478b","Type":"ContainerDied","Data":"04e71fff73dcd0bef56de7306621d231ecc8010bc1bd2453b4ce6a2abe0db869"} Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.254486 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" event={"ID":"f35b9a8c-8e6f-4170-b1d9-20c4628d478b","Type":"ContainerDied","Data":"324baef53a83316e6711f830ea64e8e97adcfd52c9b58cb2c19deef117214a7a"} Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.254458 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-bnpvs" Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.254504 4739 scope.go:117] "RemoveContainer" containerID="04e71fff73dcd0bef56de7306621d231ecc8010bc1bd2453b4ce6a2abe0db869" Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.272420 4739 scope.go:117] "RemoveContainer" containerID="d8618bc7ce4f96433111462db9595702f176d2448ff96c5bb0e727e1a61a1e8d" Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.295051 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-bnpvs"] Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.295902 4739 scope.go:117] "RemoveContainer" containerID="04e71fff73dcd0bef56de7306621d231ecc8010bc1bd2453b4ce6a2abe0db869" Oct 08 22:12:29 crc kubenswrapper[4739]: E1008 22:12:29.296353 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04e71fff73dcd0bef56de7306621d231ecc8010bc1bd2453b4ce6a2abe0db869\": container with ID starting with 04e71fff73dcd0bef56de7306621d231ecc8010bc1bd2453b4ce6a2abe0db869 not found: ID does not exist" containerID="04e71fff73dcd0bef56de7306621d231ecc8010bc1bd2453b4ce6a2abe0db869" Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.296388 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04e71fff73dcd0bef56de7306621d231ecc8010bc1bd2453b4ce6a2abe0db869"} err="failed to get container status \"04e71fff73dcd0bef56de7306621d231ecc8010bc1bd2453b4ce6a2abe0db869\": rpc error: code = NotFound desc = could not find container \"04e71fff73dcd0bef56de7306621d231ecc8010bc1bd2453b4ce6a2abe0db869\": container with ID starting with 04e71fff73dcd0bef56de7306621d231ecc8010bc1bd2453b4ce6a2abe0db869 not found: ID does not exist" Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.296410 4739 scope.go:117] "RemoveContainer" 
containerID="d8618bc7ce4f96433111462db9595702f176d2448ff96c5bb0e727e1a61a1e8d" Oct 08 22:12:29 crc kubenswrapper[4739]: E1008 22:12:29.297025 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8618bc7ce4f96433111462db9595702f176d2448ff96c5bb0e727e1a61a1e8d\": container with ID starting with d8618bc7ce4f96433111462db9595702f176d2448ff96c5bb0e727e1a61a1e8d not found: ID does not exist" containerID="d8618bc7ce4f96433111462db9595702f176d2448ff96c5bb0e727e1a61a1e8d" Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.297045 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8618bc7ce4f96433111462db9595702f176d2448ff96c5bb0e727e1a61a1e8d"} err="failed to get container status \"d8618bc7ce4f96433111462db9595702f176d2448ff96c5bb0e727e1a61a1e8d\": rpc error: code = NotFound desc = could not find container \"d8618bc7ce4f96433111462db9595702f176d2448ff96c5bb0e727e1a61a1e8d\": container with ID starting with d8618bc7ce4f96433111462db9595702f176d2448ff96c5bb0e727e1a61a1e8d not found: ID does not exist" Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.302696 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-bnpvs"] Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.326444 4739 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f35b9a8c-8e6f-4170-b1d9-20c4628d478b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.507435 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-nnbsx"] Oct 08 22:12:29 crc kubenswrapper[4739]: I1008 22:12:29.838794 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f35b9a8c-8e6f-4170-b1d9-20c4628d478b" path="/var/lib/kubelet/pods/f35b9a8c-8e6f-4170-b1d9-20c4628d478b/volumes" Oct 08 
22:12:30 crc kubenswrapper[4739]: I1008 22:12:30.265423 4739 generic.go:334] "Generic (PLEG): container finished" podID="b7adc6ab-b111-4d2a-a0f3-a1b50e53df52" containerID="205e5f3dce63431a3b3a523bd34ff08ff2e23fa6182dc1491992e641d0b9f1a1" exitCode=0 Oct 08 22:12:30 crc kubenswrapper[4739]: I1008 22:12:30.265463 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-nnbsx" event={"ID":"b7adc6ab-b111-4d2a-a0f3-a1b50e53df52","Type":"ContainerDied","Data":"205e5f3dce63431a3b3a523bd34ff08ff2e23fa6182dc1491992e641d0b9f1a1"} Oct 08 22:12:30 crc kubenswrapper[4739]: I1008 22:12:30.265489 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-nnbsx" event={"ID":"b7adc6ab-b111-4d2a-a0f3-a1b50e53df52","Type":"ContainerStarted","Data":"76d73a9d39c4c149e7eacf224f6a96a70d1a809891f5096ae1940f9f026141bf"} Oct 08 22:12:31 crc kubenswrapper[4739]: I1008 22:12:31.275724 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-nnbsx" event={"ID":"b7adc6ab-b111-4d2a-a0f3-a1b50e53df52","Type":"ContainerStarted","Data":"dde6a3e55d100cc1675ad094e2b715249930dcc301745795591737ff59b57184"} Oct 08 22:12:31 crc kubenswrapper[4739]: I1008 22:12:31.276026 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cb6ffcf87-nnbsx" Oct 08 22:12:31 crc kubenswrapper[4739]: I1008 22:12:31.305603 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cb6ffcf87-nnbsx" podStartSLOduration=3.305580194 podStartE2EDuration="3.305580194s" podCreationTimestamp="2025-10-08 22:12:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:12:31.290042183 +0000 UTC m=+1451.115427973" watchObservedRunningTime="2025-10-08 22:12:31.305580194 +0000 UTC m=+1451.130965944" Oct 08 22:12:34 crc kubenswrapper[4739]: I1008 22:12:34.518716 
4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vsds5" Oct 08 22:12:34 crc kubenswrapper[4739]: I1008 22:12:34.519098 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vsds5" Oct 08 22:12:34 crc kubenswrapper[4739]: I1008 22:12:34.576955 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vsds5" Oct 08 22:12:35 crc kubenswrapper[4739]: I1008 22:12:35.445224 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vsds5" Oct 08 22:12:35 crc kubenswrapper[4739]: I1008 22:12:35.562286 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vsds5"] Oct 08 22:12:37 crc kubenswrapper[4739]: I1008 22:12:37.366658 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vsds5" podUID="26d6bf28-f715-441b-b4d3-4a3f18d1df19" containerName="registry-server" containerID="cri-o://e1623e77b37a841a5a5ba4a0943066564e0437afb3b8dcd4d7f90e71a04106e0" gracePeriod=2 Oct 08 22:12:38 crc kubenswrapper[4739]: I1008 22:12:38.367086 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vsds5" Oct 08 22:12:38 crc kubenswrapper[4739]: I1008 22:12:38.376678 4739 generic.go:334] "Generic (PLEG): container finished" podID="26d6bf28-f715-441b-b4d3-4a3f18d1df19" containerID="e1623e77b37a841a5a5ba4a0943066564e0437afb3b8dcd4d7f90e71a04106e0" exitCode=0 Oct 08 22:12:38 crc kubenswrapper[4739]: I1008 22:12:38.376717 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsds5" event={"ID":"26d6bf28-f715-441b-b4d3-4a3f18d1df19","Type":"ContainerDied","Data":"e1623e77b37a841a5a5ba4a0943066564e0437afb3b8dcd4d7f90e71a04106e0"} Oct 08 22:12:38 crc kubenswrapper[4739]: I1008 22:12:38.376742 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsds5" event={"ID":"26d6bf28-f715-441b-b4d3-4a3f18d1df19","Type":"ContainerDied","Data":"c20099748f37c3c37afb1f37f800c143ea9ff51f5562800b8fecc85931b54ab9"} Oct 08 22:12:38 crc kubenswrapper[4739]: I1008 22:12:38.376758 4739 scope.go:117] "RemoveContainer" containerID="e1623e77b37a841a5a5ba4a0943066564e0437afb3b8dcd4d7f90e71a04106e0" Oct 08 22:12:38 crc kubenswrapper[4739]: I1008 22:12:38.376886 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vsds5" Oct 08 22:12:38 crc kubenswrapper[4739]: I1008 22:12:38.414781 4739 scope.go:117] "RemoveContainer" containerID="0735add0789b8b5abf983e574aa1143ff1161cedef28ebb73fb0e34daf52e80f" Oct 08 22:12:38 crc kubenswrapper[4739]: I1008 22:12:38.434085 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrz77\" (UniqueName: \"kubernetes.io/projected/26d6bf28-f715-441b-b4d3-4a3f18d1df19-kube-api-access-qrz77\") pod \"26d6bf28-f715-441b-b4d3-4a3f18d1df19\" (UID: \"26d6bf28-f715-441b-b4d3-4a3f18d1df19\") " Oct 08 22:12:38 crc kubenswrapper[4739]: I1008 22:12:38.434136 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26d6bf28-f715-441b-b4d3-4a3f18d1df19-catalog-content\") pod \"26d6bf28-f715-441b-b4d3-4a3f18d1df19\" (UID: \"26d6bf28-f715-441b-b4d3-4a3f18d1df19\") " Oct 08 22:12:38 crc kubenswrapper[4739]: I1008 22:12:38.434245 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26d6bf28-f715-441b-b4d3-4a3f18d1df19-utilities\") pod \"26d6bf28-f715-441b-b4d3-4a3f18d1df19\" (UID: \"26d6bf28-f715-441b-b4d3-4a3f18d1df19\") " Oct 08 22:12:38 crc kubenswrapper[4739]: I1008 22:12:38.435611 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26d6bf28-f715-441b-b4d3-4a3f18d1df19-utilities" (OuterVolumeSpecName: "utilities") pod "26d6bf28-f715-441b-b4d3-4a3f18d1df19" (UID: "26d6bf28-f715-441b-b4d3-4a3f18d1df19"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:12:38 crc kubenswrapper[4739]: I1008 22:12:38.440855 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26d6bf28-f715-441b-b4d3-4a3f18d1df19-kube-api-access-qrz77" (OuterVolumeSpecName: "kube-api-access-qrz77") pod "26d6bf28-f715-441b-b4d3-4a3f18d1df19" (UID: "26d6bf28-f715-441b-b4d3-4a3f18d1df19"). InnerVolumeSpecName "kube-api-access-qrz77". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:12:38 crc kubenswrapper[4739]: I1008 22:12:38.443691 4739 scope.go:117] "RemoveContainer" containerID="dbb5f97cc393b825392d34b9e3b86970dbc463a81be5bda89317cd036eb001b6" Oct 08 22:12:38 crc kubenswrapper[4739]: I1008 22:12:38.492915 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26d6bf28-f715-441b-b4d3-4a3f18d1df19-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "26d6bf28-f715-441b-b4d3-4a3f18d1df19" (UID: "26d6bf28-f715-441b-b4d3-4a3f18d1df19"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:12:38 crc kubenswrapper[4739]: I1008 22:12:38.526110 4739 scope.go:117] "RemoveContainer" containerID="e1623e77b37a841a5a5ba4a0943066564e0437afb3b8dcd4d7f90e71a04106e0" Oct 08 22:12:38 crc kubenswrapper[4739]: E1008 22:12:38.527725 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1623e77b37a841a5a5ba4a0943066564e0437afb3b8dcd4d7f90e71a04106e0\": container with ID starting with e1623e77b37a841a5a5ba4a0943066564e0437afb3b8dcd4d7f90e71a04106e0 not found: ID does not exist" containerID="e1623e77b37a841a5a5ba4a0943066564e0437afb3b8dcd4d7f90e71a04106e0" Oct 08 22:12:38 crc kubenswrapper[4739]: I1008 22:12:38.527762 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1623e77b37a841a5a5ba4a0943066564e0437afb3b8dcd4d7f90e71a04106e0"} err="failed to get container status \"e1623e77b37a841a5a5ba4a0943066564e0437afb3b8dcd4d7f90e71a04106e0\": rpc error: code = NotFound desc = could not find container \"e1623e77b37a841a5a5ba4a0943066564e0437afb3b8dcd4d7f90e71a04106e0\": container with ID starting with e1623e77b37a841a5a5ba4a0943066564e0437afb3b8dcd4d7f90e71a04106e0 not found: ID does not exist" Oct 08 22:12:38 crc kubenswrapper[4739]: I1008 22:12:38.527781 4739 scope.go:117] "RemoveContainer" containerID="0735add0789b8b5abf983e574aa1143ff1161cedef28ebb73fb0e34daf52e80f" Oct 08 22:12:38 crc kubenswrapper[4739]: E1008 22:12:38.528101 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0735add0789b8b5abf983e574aa1143ff1161cedef28ebb73fb0e34daf52e80f\": container with ID starting with 0735add0789b8b5abf983e574aa1143ff1161cedef28ebb73fb0e34daf52e80f not found: ID does not exist" containerID="0735add0789b8b5abf983e574aa1143ff1161cedef28ebb73fb0e34daf52e80f" Oct 08 22:12:38 crc kubenswrapper[4739]: I1008 22:12:38.528127 
4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0735add0789b8b5abf983e574aa1143ff1161cedef28ebb73fb0e34daf52e80f"} err="failed to get container status \"0735add0789b8b5abf983e574aa1143ff1161cedef28ebb73fb0e34daf52e80f\": rpc error: code = NotFound desc = could not find container \"0735add0789b8b5abf983e574aa1143ff1161cedef28ebb73fb0e34daf52e80f\": container with ID starting with 0735add0789b8b5abf983e574aa1143ff1161cedef28ebb73fb0e34daf52e80f not found: ID does not exist" Oct 08 22:12:38 crc kubenswrapper[4739]: I1008 22:12:38.528155 4739 scope.go:117] "RemoveContainer" containerID="dbb5f97cc393b825392d34b9e3b86970dbc463a81be5bda89317cd036eb001b6" Oct 08 22:12:38 crc kubenswrapper[4739]: E1008 22:12:38.528609 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbb5f97cc393b825392d34b9e3b86970dbc463a81be5bda89317cd036eb001b6\": container with ID starting with dbb5f97cc393b825392d34b9e3b86970dbc463a81be5bda89317cd036eb001b6 not found: ID does not exist" containerID="dbb5f97cc393b825392d34b9e3b86970dbc463a81be5bda89317cd036eb001b6" Oct 08 22:12:38 crc kubenswrapper[4739]: I1008 22:12:38.528742 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbb5f97cc393b825392d34b9e3b86970dbc463a81be5bda89317cd036eb001b6"} err="failed to get container status \"dbb5f97cc393b825392d34b9e3b86970dbc463a81be5bda89317cd036eb001b6\": rpc error: code = NotFound desc = could not find container \"dbb5f97cc393b825392d34b9e3b86970dbc463a81be5bda89317cd036eb001b6\": container with ID starting with dbb5f97cc393b825392d34b9e3b86970dbc463a81be5bda89317cd036eb001b6 not found: ID does not exist" Oct 08 22:12:38 crc kubenswrapper[4739]: I1008 22:12:38.536478 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrz77\" (UniqueName: 
\"kubernetes.io/projected/26d6bf28-f715-441b-b4d3-4a3f18d1df19-kube-api-access-qrz77\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:38 crc kubenswrapper[4739]: I1008 22:12:38.536624 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26d6bf28-f715-441b-b4d3-4a3f18d1df19-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:38 crc kubenswrapper[4739]: I1008 22:12:38.536717 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26d6bf28-f715-441b-b4d3-4a3f18d1df19-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:38 crc kubenswrapper[4739]: I1008 22:12:38.713701 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vsds5"] Oct 08 22:12:38 crc kubenswrapper[4739]: I1008 22:12:38.721416 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vsds5"] Oct 08 22:12:38 crc kubenswrapper[4739]: I1008 22:12:38.834679 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 08 22:12:39 crc kubenswrapper[4739]: I1008 22:12:39.069476 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cb6ffcf87-nnbsx" Oct 08 22:12:39 crc kubenswrapper[4739]: I1008 22:12:39.136564 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-fldrn"] Oct 08 22:12:39 crc kubenswrapper[4739]: I1008 22:12:39.136839 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67b789f86c-fldrn" podUID="ad2fa1fa-687d-4181-b11b-e86adff4ec9f" containerName="dnsmasq-dns" containerID="cri-o://e32933f81a85da4b8881d2b774fd3d97ae418556a623fd91f5fb44962714f775" gracePeriod=10 Oct 08 22:12:39 crc kubenswrapper[4739]: I1008 22:12:39.387936 4739 generic.go:334] "Generic (PLEG): container finished" 
podID="ad2fa1fa-687d-4181-b11b-e86adff4ec9f" containerID="e32933f81a85da4b8881d2b774fd3d97ae418556a623fd91f5fb44962714f775" exitCode=0 Oct 08 22:12:39 crc kubenswrapper[4739]: I1008 22:12:39.388021 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-fldrn" event={"ID":"ad2fa1fa-687d-4181-b11b-e86adff4ec9f","Type":"ContainerDied","Data":"e32933f81a85da4b8881d2b774fd3d97ae418556a623fd91f5fb44962714f775"} Oct 08 22:12:39 crc kubenswrapper[4739]: I1008 22:12:39.390586 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7dd83bb8-a102-4ba9-825a-1cf852094ace","Type":"ContainerStarted","Data":"7ae8c4816ada1777da5bec945093b74d43fd7ca99457f8a563e6275c787702ee"} Oct 08 22:12:39 crc kubenswrapper[4739]: I1008 22:12:39.420661 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.627431584 podStartE2EDuration="34.420638256s" podCreationTimestamp="2025-10-08 22:12:05 +0000 UTC" firstStartedPulling="2025-10-08 22:12:06.114702209 +0000 UTC m=+1425.940087959" lastFinishedPulling="2025-10-08 22:12:38.907908881 +0000 UTC m=+1458.733294631" observedRunningTime="2025-10-08 22:12:39.41343029 +0000 UTC m=+1459.238816040" watchObservedRunningTime="2025-10-08 22:12:39.420638256 +0000 UTC m=+1459.246024006" Oct 08 22:12:39 crc kubenswrapper[4739]: I1008 22:12:39.606656 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-fldrn" Oct 08 22:12:39 crc kubenswrapper[4739]: I1008 22:12:39.668997 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w2f9\" (UniqueName: \"kubernetes.io/projected/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-kube-api-access-7w2f9\") pod \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\" (UID: \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\") " Oct 08 22:12:39 crc kubenswrapper[4739]: I1008 22:12:39.669077 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-dns-swift-storage-0\") pod \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\" (UID: \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\") " Oct 08 22:12:39 crc kubenswrapper[4739]: I1008 22:12:39.669216 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-openstack-edpm-ipam\") pod \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\" (UID: \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\") " Oct 08 22:12:39 crc kubenswrapper[4739]: I1008 22:12:39.669270 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-ovsdbserver-nb\") pod \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\" (UID: \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\") " Oct 08 22:12:39 crc kubenswrapper[4739]: I1008 22:12:39.669298 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-config\") pod \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\" (UID: \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\") " Oct 08 22:12:39 crc kubenswrapper[4739]: I1008 22:12:39.669402 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-dns-svc\") pod \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\" (UID: \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\") " Oct 08 22:12:39 crc kubenswrapper[4739]: I1008 22:12:39.669453 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-ovsdbserver-sb\") pod \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\" (UID: \"ad2fa1fa-687d-4181-b11b-e86adff4ec9f\") " Oct 08 22:12:39 crc kubenswrapper[4739]: I1008 22:12:39.675535 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-kube-api-access-7w2f9" (OuterVolumeSpecName: "kube-api-access-7w2f9") pod "ad2fa1fa-687d-4181-b11b-e86adff4ec9f" (UID: "ad2fa1fa-687d-4181-b11b-e86adff4ec9f"). InnerVolumeSpecName "kube-api-access-7w2f9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:12:39 crc kubenswrapper[4739]: I1008 22:12:39.718929 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-config" (OuterVolumeSpecName: "config") pod "ad2fa1fa-687d-4181-b11b-e86adff4ec9f" (UID: "ad2fa1fa-687d-4181-b11b-e86adff4ec9f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:12:39 crc kubenswrapper[4739]: I1008 22:12:39.727949 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ad2fa1fa-687d-4181-b11b-e86adff4ec9f" (UID: "ad2fa1fa-687d-4181-b11b-e86adff4ec9f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:12:39 crc kubenswrapper[4739]: I1008 22:12:39.730665 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ad2fa1fa-687d-4181-b11b-e86adff4ec9f" (UID: "ad2fa1fa-687d-4181-b11b-e86adff4ec9f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:12:39 crc kubenswrapper[4739]: I1008 22:12:39.740360 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ad2fa1fa-687d-4181-b11b-e86adff4ec9f" (UID: "ad2fa1fa-687d-4181-b11b-e86adff4ec9f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:12:39 crc kubenswrapper[4739]: I1008 22:12:39.743240 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ad2fa1fa-687d-4181-b11b-e86adff4ec9f" (UID: "ad2fa1fa-687d-4181-b11b-e86adff4ec9f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:12:39 crc kubenswrapper[4739]: I1008 22:12:39.755544 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "ad2fa1fa-687d-4181-b11b-e86adff4ec9f" (UID: "ad2fa1fa-687d-4181-b11b-e86adff4ec9f"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:12:39 crc kubenswrapper[4739]: I1008 22:12:39.771932 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w2f9\" (UniqueName: \"kubernetes.io/projected/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-kube-api-access-7w2f9\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:39 crc kubenswrapper[4739]: I1008 22:12:39.771963 4739 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:39 crc kubenswrapper[4739]: I1008 22:12:39.771972 4739 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:39 crc kubenswrapper[4739]: I1008 22:12:39.771981 4739 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:39 crc kubenswrapper[4739]: I1008 22:12:39.771989 4739 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:39 crc kubenswrapper[4739]: I1008 22:12:39.772007 4739 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:39 crc kubenswrapper[4739]: I1008 22:12:39.772016 4739 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad2fa1fa-687d-4181-b11b-e86adff4ec9f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 22:12:39 crc kubenswrapper[4739]: I1008 22:12:39.833548 
4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26d6bf28-f715-441b-b4d3-4a3f18d1df19" path="/var/lib/kubelet/pods/26d6bf28-f715-441b-b4d3-4a3f18d1df19/volumes" Oct 08 22:12:40 crc kubenswrapper[4739]: I1008 22:12:40.405915 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-fldrn" event={"ID":"ad2fa1fa-687d-4181-b11b-e86adff4ec9f","Type":"ContainerDied","Data":"d796e79a8fb347b0f532e1cc3896e266385c24cd701307aab74146c7fee8c4d8"} Oct 08 22:12:40 crc kubenswrapper[4739]: I1008 22:12:40.405961 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-fldrn" Oct 08 22:12:40 crc kubenswrapper[4739]: I1008 22:12:40.405985 4739 scope.go:117] "RemoveContainer" containerID="e32933f81a85da4b8881d2b774fd3d97ae418556a623fd91f5fb44962714f775" Oct 08 22:12:40 crc kubenswrapper[4739]: I1008 22:12:40.437510 4739 scope.go:117] "RemoveContainer" containerID="8980c3e8a2cd74695cbdfc04898b1fa12e1f5c07a3d8092778c460f935844e87" Oct 08 22:12:40 crc kubenswrapper[4739]: I1008 22:12:40.445457 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-fldrn"] Oct 08 22:12:40 crc kubenswrapper[4739]: I1008 22:12:40.456991 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-fldrn"] Oct 08 22:12:41 crc kubenswrapper[4739]: I1008 22:12:41.833895 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad2fa1fa-687d-4181-b11b-e86adff4ec9f" path="/var/lib/kubelet/pods/ad2fa1fa-687d-4181-b11b-e86adff4ec9f/volumes" Oct 08 22:12:48 crc kubenswrapper[4739]: I1008 22:12:48.116454 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr"] Oct 08 22:12:48 crc kubenswrapper[4739]: E1008 22:12:48.117784 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f35b9a8c-8e6f-4170-b1d9-20c4628d478b" 
containerName="dnsmasq-dns" Oct 08 22:12:48 crc kubenswrapper[4739]: I1008 22:12:48.117805 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="f35b9a8c-8e6f-4170-b1d9-20c4628d478b" containerName="dnsmasq-dns" Oct 08 22:12:48 crc kubenswrapper[4739]: E1008 22:12:48.117829 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26d6bf28-f715-441b-b4d3-4a3f18d1df19" containerName="extract-utilities" Oct 08 22:12:48 crc kubenswrapper[4739]: I1008 22:12:48.117842 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d6bf28-f715-441b-b4d3-4a3f18d1df19" containerName="extract-utilities" Oct 08 22:12:48 crc kubenswrapper[4739]: E1008 22:12:48.117865 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad2fa1fa-687d-4181-b11b-e86adff4ec9f" containerName="init" Oct 08 22:12:48 crc kubenswrapper[4739]: I1008 22:12:48.117878 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad2fa1fa-687d-4181-b11b-e86adff4ec9f" containerName="init" Oct 08 22:12:48 crc kubenswrapper[4739]: E1008 22:12:48.117897 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26d6bf28-f715-441b-b4d3-4a3f18d1df19" containerName="extract-content" Oct 08 22:12:48 crc kubenswrapper[4739]: I1008 22:12:48.117912 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d6bf28-f715-441b-b4d3-4a3f18d1df19" containerName="extract-content" Oct 08 22:12:48 crc kubenswrapper[4739]: E1008 22:12:48.117936 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26d6bf28-f715-441b-b4d3-4a3f18d1df19" containerName="registry-server" Oct 08 22:12:48 crc kubenswrapper[4739]: I1008 22:12:48.117948 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d6bf28-f715-441b-b4d3-4a3f18d1df19" containerName="registry-server" Oct 08 22:12:48 crc kubenswrapper[4739]: E1008 22:12:48.117984 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f35b9a8c-8e6f-4170-b1d9-20c4628d478b" containerName="init" Oct 08 22:12:48 crc 
kubenswrapper[4739]: I1008 22:12:48.117996 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="f35b9a8c-8e6f-4170-b1d9-20c4628d478b" containerName="init" Oct 08 22:12:48 crc kubenswrapper[4739]: E1008 22:12:48.118056 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad2fa1fa-687d-4181-b11b-e86adff4ec9f" containerName="dnsmasq-dns" Oct 08 22:12:48 crc kubenswrapper[4739]: I1008 22:12:48.118068 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad2fa1fa-687d-4181-b11b-e86adff4ec9f" containerName="dnsmasq-dns" Oct 08 22:12:48 crc kubenswrapper[4739]: I1008 22:12:48.118434 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad2fa1fa-687d-4181-b11b-e86adff4ec9f" containerName="dnsmasq-dns" Oct 08 22:12:48 crc kubenswrapper[4739]: I1008 22:12:48.118461 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="26d6bf28-f715-441b-b4d3-4a3f18d1df19" containerName="registry-server" Oct 08 22:12:48 crc kubenswrapper[4739]: I1008 22:12:48.118482 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="f35b9a8c-8e6f-4170-b1d9-20c4628d478b" containerName="dnsmasq-dns" Oct 08 22:12:48 crc kubenswrapper[4739]: I1008 22:12:48.119661 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr" Oct 08 22:12:48 crc kubenswrapper[4739]: I1008 22:12:48.124604 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl5kv" Oct 08 22:12:48 crc kubenswrapper[4739]: I1008 22:12:48.124691 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 22:12:48 crc kubenswrapper[4739]: I1008 22:12:48.129504 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 22:12:48 crc kubenswrapper[4739]: I1008 22:12:48.149356 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 22:12:48 crc kubenswrapper[4739]: I1008 22:12:48.152141 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr"] Oct 08 22:12:48 crc kubenswrapper[4739]: I1008 22:12:48.268783 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5536fb34-0051-4845-98e8-050b8870274d-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr\" (UID: \"5536fb34-0051-4845-98e8-050b8870274d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr" Oct 08 22:12:48 crc kubenswrapper[4739]: I1008 22:12:48.268859 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5536fb34-0051-4845-98e8-050b8870274d-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr\" (UID: \"5536fb34-0051-4845-98e8-050b8870274d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr" Oct 08 22:12:48 crc kubenswrapper[4739]: I1008 22:12:48.268924 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdf26\" (UniqueName: \"kubernetes.io/projected/5536fb34-0051-4845-98e8-050b8870274d-kube-api-access-gdf26\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr\" (UID: \"5536fb34-0051-4845-98e8-050b8870274d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr" Oct 08 22:12:48 crc kubenswrapper[4739]: I1008 22:12:48.268984 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5536fb34-0051-4845-98e8-050b8870274d-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr\" (UID: \"5536fb34-0051-4845-98e8-050b8870274d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr" Oct 08 22:12:48 crc kubenswrapper[4739]: I1008 22:12:48.371958 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5536fb34-0051-4845-98e8-050b8870274d-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr\" (UID: \"5536fb34-0051-4845-98e8-050b8870274d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr" Oct 08 22:12:48 crc kubenswrapper[4739]: I1008 22:12:48.372079 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5536fb34-0051-4845-98e8-050b8870274d-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr\" (UID: \"5536fb34-0051-4845-98e8-050b8870274d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr" Oct 08 22:12:48 crc kubenswrapper[4739]: I1008 22:12:48.372200 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdf26\" (UniqueName: \"kubernetes.io/projected/5536fb34-0051-4845-98e8-050b8870274d-kube-api-access-gdf26\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr\" (UID: \"5536fb34-0051-4845-98e8-050b8870274d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr" Oct 08 22:12:48 crc kubenswrapper[4739]: I1008 22:12:48.372287 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5536fb34-0051-4845-98e8-050b8870274d-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr\" (UID: \"5536fb34-0051-4845-98e8-050b8870274d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr" Oct 08 22:12:48 crc kubenswrapper[4739]: I1008 22:12:48.381035 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5536fb34-0051-4845-98e8-050b8870274d-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr\" (UID: \"5536fb34-0051-4845-98e8-050b8870274d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr" Oct 08 22:12:48 crc kubenswrapper[4739]: I1008 22:12:48.381104 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5536fb34-0051-4845-98e8-050b8870274d-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr\" (UID: \"5536fb34-0051-4845-98e8-050b8870274d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr" Oct 08 22:12:48 crc kubenswrapper[4739]: I1008 22:12:48.382342 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5536fb34-0051-4845-98e8-050b8870274d-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr\" (UID: \"5536fb34-0051-4845-98e8-050b8870274d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr" Oct 08 22:12:48 crc kubenswrapper[4739]: I1008 22:12:48.402894 4739 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdf26\" (UniqueName: \"kubernetes.io/projected/5536fb34-0051-4845-98e8-050b8870274d-kube-api-access-gdf26\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr\" (UID: \"5536fb34-0051-4845-98e8-050b8870274d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr" Oct 08 22:12:48 crc kubenswrapper[4739]: I1008 22:12:48.455615 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr" Oct 08 22:12:48 crc kubenswrapper[4739]: I1008 22:12:48.898123 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr"] Oct 08 22:12:49 crc kubenswrapper[4739]: I1008 22:12:49.518456 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr" event={"ID":"5536fb34-0051-4845-98e8-050b8870274d","Type":"ContainerStarted","Data":"43b2859b18c3fdb906e1fd91ddfa979df555326ba5845e182f923517c79fbc12"} Oct 08 22:12:51 crc kubenswrapper[4739]: I1008 22:12:51.766770 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:12:51 crc kubenswrapper[4739]: I1008 22:12:51.767357 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:12:51 crc kubenswrapper[4739]: I1008 22:12:51.767428 4739 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" Oct 08 22:12:51 crc kubenswrapper[4739]: I1008 22:12:51.768476 4739 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f50fee8537e3d72c7912a6fb5efc59ba4c94366883a0b151d7314411b277cabf"} pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 22:12:51 crc kubenswrapper[4739]: I1008 22:12:51.768561 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" containerID="cri-o://f50fee8537e3d72c7912a6fb5efc59ba4c94366883a0b151d7314411b277cabf" gracePeriod=600 Oct 08 22:12:52 crc kubenswrapper[4739]: I1008 22:12:52.563325 4739 generic.go:334] "Generic (PLEG): container finished" podID="9707b708-016c-4e06-86db-0332e2ca37db" containerID="f50fee8537e3d72c7912a6fb5efc59ba4c94366883a0b151d7314411b277cabf" exitCode=0 Oct 08 22:12:52 crc kubenswrapper[4739]: I1008 22:12:52.563417 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerDied","Data":"f50fee8537e3d72c7912a6fb5efc59ba4c94366883a0b151d7314411b277cabf"} Oct 08 22:12:52 crc kubenswrapper[4739]: I1008 22:12:52.563965 4739 scope.go:117] "RemoveContainer" containerID="f263d906c5336884f5cafca08187af555d27f85843b3fe64b88ee6f01fe93ba9" Oct 08 22:12:55 crc kubenswrapper[4739]: I1008 22:12:55.600299 4739 generic.go:334] "Generic (PLEG): container finished" podID="744c6598-a814-45c7-bf47-5fe0b5b48c5e" containerID="096e89fa2568a25b5962085c6eca60b5fa3b8464d61ef664f159dac0ea03138e" exitCode=0 Oct 08 22:12:55 crc kubenswrapper[4739]: I1008 22:12:55.600402 4739 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"744c6598-a814-45c7-bf47-5fe0b5b48c5e","Type":"ContainerDied","Data":"096e89fa2568a25b5962085c6eca60b5fa3b8464d61ef664f159dac0ea03138e"} Oct 08 22:12:55 crc kubenswrapper[4739]: I1008 22:12:55.607178 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb","Type":"ContainerDied","Data":"046098d5f02c3aef1b4444ec9cb0ccde77f2877db1bb62b13b28146c079cde34"} Oct 08 22:12:55 crc kubenswrapper[4739]: I1008 22:12:55.607141 4739 generic.go:334] "Generic (PLEG): container finished" podID="f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb" containerID="046098d5f02c3aef1b4444ec9cb0ccde77f2877db1bb62b13b28146c079cde34" exitCode=0 Oct 08 22:12:59 crc kubenswrapper[4739]: I1008 22:12:59.662052 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerStarted","Data":"cd237d888e8c93206be5f4a7fb5248e050c6dfe54b10244705d72109bf176a06"} Oct 08 22:12:59 crc kubenswrapper[4739]: I1008 22:12:59.666422 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"744c6598-a814-45c7-bf47-5fe0b5b48c5e","Type":"ContainerStarted","Data":"627ed8adb5c9ab4d96ec99c45f9428b3080d049beb0a9f205b3daa5ae1dd6c69"} Oct 08 22:12:59 crc kubenswrapper[4739]: I1008 22:12:59.666772 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:12:59 crc kubenswrapper[4739]: I1008 22:12:59.668338 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb","Type":"ContainerStarted","Data":"32be83183f8d7eefddd1aaf6b62537d051b8fe0d3214a8a587ec084d54f8a218"} Oct 08 22:12:59 crc kubenswrapper[4739]: I1008 22:12:59.668667 4739 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 08 22:12:59 crc kubenswrapper[4739]: I1008 22:12:59.670658 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr" event={"ID":"5536fb34-0051-4845-98e8-050b8870274d","Type":"ContainerStarted","Data":"ecf1d155ff54fd93d1155ac01cc38e3c46438a069e7dda542a2ae1a3a01a0179"} Oct 08 22:12:59 crc kubenswrapper[4739]: I1008 22:12:59.738945 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr" podStartSLOduration=1.437557704 podStartE2EDuration="11.738914532s" podCreationTimestamp="2025-10-08 22:12:48 +0000 UTC" firstStartedPulling="2025-10-08 22:12:48.906681753 +0000 UTC m=+1468.732067503" lastFinishedPulling="2025-10-08 22:12:59.208038571 +0000 UTC m=+1479.033424331" observedRunningTime="2025-10-08 22:12:59.702376516 +0000 UTC m=+1479.527762266" watchObservedRunningTime="2025-10-08 22:12:59.738914532 +0000 UTC m=+1479.564300292" Oct 08 22:12:59 crc kubenswrapper[4739]: I1008 22:12:59.739392 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=40.739385124 podStartE2EDuration="40.739385124s" podCreationTimestamp="2025-10-08 22:12:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:12:59.72822369 +0000 UTC m=+1479.553609460" watchObservedRunningTime="2025-10-08 22:12:59.739385124 +0000 UTC m=+1479.564770884" Oct 08 22:12:59 crc kubenswrapper[4739]: I1008 22:12:59.760781 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=40.760746148 podStartE2EDuration="40.760746148s" podCreationTimestamp="2025-10-08 22:12:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:12:59.755466808 +0000 UTC m=+1479.580852568" watchObservedRunningTime="2025-10-08 22:12:59.760746148 +0000 UTC m=+1479.586131898" Oct 08 22:13:10 crc kubenswrapper[4739]: I1008 22:13:10.051349 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 08 22:13:10 crc kubenswrapper[4739]: I1008 22:13:10.312350 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:13:11 crc kubenswrapper[4739]: I1008 22:13:11.826210 4739 generic.go:334] "Generic (PLEG): container finished" podID="5536fb34-0051-4845-98e8-050b8870274d" containerID="ecf1d155ff54fd93d1155ac01cc38e3c46438a069e7dda542a2ae1a3a01a0179" exitCode=0 Oct 08 22:13:11 crc kubenswrapper[4739]: I1008 22:13:11.834688 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr" event={"ID":"5536fb34-0051-4845-98e8-050b8870274d","Type":"ContainerDied","Data":"ecf1d155ff54fd93d1155ac01cc38e3c46438a069e7dda542a2ae1a3a01a0179"} Oct 08 22:13:13 crc kubenswrapper[4739]: I1008 22:13:13.339760 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr" Oct 08 22:13:13 crc kubenswrapper[4739]: I1008 22:13:13.439925 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdf26\" (UniqueName: \"kubernetes.io/projected/5536fb34-0051-4845-98e8-050b8870274d-kube-api-access-gdf26\") pod \"5536fb34-0051-4845-98e8-050b8870274d\" (UID: \"5536fb34-0051-4845-98e8-050b8870274d\") " Oct 08 22:13:13 crc kubenswrapper[4739]: I1008 22:13:13.440110 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5536fb34-0051-4845-98e8-050b8870274d-inventory\") pod \"5536fb34-0051-4845-98e8-050b8870274d\" (UID: \"5536fb34-0051-4845-98e8-050b8870274d\") " Oct 08 22:13:13 crc kubenswrapper[4739]: I1008 22:13:13.440283 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5536fb34-0051-4845-98e8-050b8870274d-repo-setup-combined-ca-bundle\") pod \"5536fb34-0051-4845-98e8-050b8870274d\" (UID: \"5536fb34-0051-4845-98e8-050b8870274d\") " Oct 08 22:13:13 crc kubenswrapper[4739]: I1008 22:13:13.440397 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5536fb34-0051-4845-98e8-050b8870274d-ssh-key\") pod \"5536fb34-0051-4845-98e8-050b8870274d\" (UID: \"5536fb34-0051-4845-98e8-050b8870274d\") " Oct 08 22:13:13 crc kubenswrapper[4739]: I1008 22:13:13.446980 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5536fb34-0051-4845-98e8-050b8870274d-kube-api-access-gdf26" (OuterVolumeSpecName: "kube-api-access-gdf26") pod "5536fb34-0051-4845-98e8-050b8870274d" (UID: "5536fb34-0051-4845-98e8-050b8870274d"). InnerVolumeSpecName "kube-api-access-gdf26". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:13:13 crc kubenswrapper[4739]: I1008 22:13:13.447992 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5536fb34-0051-4845-98e8-050b8870274d-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "5536fb34-0051-4845-98e8-050b8870274d" (UID: "5536fb34-0051-4845-98e8-050b8870274d"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:13:13 crc kubenswrapper[4739]: I1008 22:13:13.470546 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5536fb34-0051-4845-98e8-050b8870274d-inventory" (OuterVolumeSpecName: "inventory") pod "5536fb34-0051-4845-98e8-050b8870274d" (UID: "5536fb34-0051-4845-98e8-050b8870274d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:13:13 crc kubenswrapper[4739]: I1008 22:13:13.471951 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5536fb34-0051-4845-98e8-050b8870274d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5536fb34-0051-4845-98e8-050b8870274d" (UID: "5536fb34-0051-4845-98e8-050b8870274d"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:13:13 crc kubenswrapper[4739]: I1008 22:13:13.544022 4739 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5536fb34-0051-4845-98e8-050b8870274d-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:13:13 crc kubenswrapper[4739]: I1008 22:13:13.544231 4739 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5536fb34-0051-4845-98e8-050b8870274d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 22:13:13 crc kubenswrapper[4739]: I1008 22:13:13.544255 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdf26\" (UniqueName: \"kubernetes.io/projected/5536fb34-0051-4845-98e8-050b8870274d-kube-api-access-gdf26\") on node \"crc\" DevicePath \"\"" Oct 08 22:13:13 crc kubenswrapper[4739]: I1008 22:13:13.544280 4739 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5536fb34-0051-4845-98e8-050b8870274d-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 22:13:13 crc kubenswrapper[4739]: I1008 22:13:13.845906 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr" event={"ID":"5536fb34-0051-4845-98e8-050b8870274d","Type":"ContainerDied","Data":"43b2859b18c3fdb906e1fd91ddfa979df555326ba5845e182f923517c79fbc12"} Oct 08 22:13:13 crc kubenswrapper[4739]: I1008 22:13:13.845965 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43b2859b18c3fdb906e1fd91ddfa979df555326ba5845e182f923517c79fbc12" Oct 08 22:13:13 crc kubenswrapper[4739]: I1008 22:13:13.845971 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr" Oct 08 22:13:13 crc kubenswrapper[4739]: I1008 22:13:13.921784 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9wkc"] Oct 08 22:13:13 crc kubenswrapper[4739]: E1008 22:13:13.922323 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5536fb34-0051-4845-98e8-050b8870274d" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 08 22:13:13 crc kubenswrapper[4739]: I1008 22:13:13.922349 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="5536fb34-0051-4845-98e8-050b8870274d" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 08 22:13:13 crc kubenswrapper[4739]: I1008 22:13:13.922600 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="5536fb34-0051-4845-98e8-050b8870274d" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 08 22:13:13 crc kubenswrapper[4739]: I1008 22:13:13.923445 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9wkc" Oct 08 22:13:13 crc kubenswrapper[4739]: I1008 22:13:13.925716 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 22:13:13 crc kubenswrapper[4739]: I1008 22:13:13.925915 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 22:13:13 crc kubenswrapper[4739]: I1008 22:13:13.925968 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 22:13:13 crc kubenswrapper[4739]: I1008 22:13:13.926047 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl5kv" Oct 08 22:13:13 crc kubenswrapper[4739]: I1008 22:13:13.943629 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9wkc"] Oct 08 22:13:14 crc kubenswrapper[4739]: I1008 22:13:14.059880 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kggn\" (UniqueName: \"kubernetes.io/projected/e04c955f-d97c-4cc9-a01e-3d4d2b59de12-kube-api-access-2kggn\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m9wkc\" (UID: \"e04c955f-d97c-4cc9-a01e-3d4d2b59de12\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9wkc" Oct 08 22:13:14 crc kubenswrapper[4739]: I1008 22:13:14.060062 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e04c955f-d97c-4cc9-a01e-3d4d2b59de12-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m9wkc\" (UID: \"e04c955f-d97c-4cc9-a01e-3d4d2b59de12\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9wkc" Oct 08 22:13:14 crc kubenswrapper[4739]: I1008 22:13:14.060185 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e04c955f-d97c-4cc9-a01e-3d4d2b59de12-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m9wkc\" (UID: \"e04c955f-d97c-4cc9-a01e-3d4d2b59de12\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9wkc" Oct 08 22:13:14 crc kubenswrapper[4739]: I1008 22:13:14.162336 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e04c955f-d97c-4cc9-a01e-3d4d2b59de12-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m9wkc\" (UID: \"e04c955f-d97c-4cc9-a01e-3d4d2b59de12\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9wkc" Oct 08 22:13:14 crc kubenswrapper[4739]: I1008 22:13:14.162508 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kggn\" (UniqueName: \"kubernetes.io/projected/e04c955f-d97c-4cc9-a01e-3d4d2b59de12-kube-api-access-2kggn\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m9wkc\" (UID: \"e04c955f-d97c-4cc9-a01e-3d4d2b59de12\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9wkc" Oct 08 22:13:14 crc kubenswrapper[4739]: I1008 22:13:14.162605 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e04c955f-d97c-4cc9-a01e-3d4d2b59de12-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m9wkc\" (UID: \"e04c955f-d97c-4cc9-a01e-3d4d2b59de12\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9wkc" Oct 08 22:13:14 crc kubenswrapper[4739]: I1008 22:13:14.167976 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e04c955f-d97c-4cc9-a01e-3d4d2b59de12-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m9wkc\" (UID: \"e04c955f-d97c-4cc9-a01e-3d4d2b59de12\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9wkc" Oct 08 22:13:14 crc kubenswrapper[4739]: I1008 22:13:14.171597 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e04c955f-d97c-4cc9-a01e-3d4d2b59de12-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m9wkc\" (UID: \"e04c955f-d97c-4cc9-a01e-3d4d2b59de12\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9wkc" Oct 08 22:13:14 crc kubenswrapper[4739]: I1008 22:13:14.186335 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kggn\" (UniqueName: \"kubernetes.io/projected/e04c955f-d97c-4cc9-a01e-3d4d2b59de12-kube-api-access-2kggn\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m9wkc\" (UID: \"e04c955f-d97c-4cc9-a01e-3d4d2b59de12\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9wkc" Oct 08 22:13:14 crc kubenswrapper[4739]: I1008 22:13:14.238073 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9wkc" Oct 08 22:13:14 crc kubenswrapper[4739]: I1008 22:13:14.773224 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9wkc"] Oct 08 22:13:14 crc kubenswrapper[4739]: W1008 22:13:14.784860 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode04c955f_d97c_4cc9_a01e_3d4d2b59de12.slice/crio-eff3eb2267c4f90c5a1e23c7ff86ce766d122056aad277ee9005915438bdf634 WatchSource:0}: Error finding container eff3eb2267c4f90c5a1e23c7ff86ce766d122056aad277ee9005915438bdf634: Status 404 returned error can't find the container with id eff3eb2267c4f90c5a1e23c7ff86ce766d122056aad277ee9005915438bdf634 Oct 08 22:13:14 crc kubenswrapper[4739]: I1008 22:13:14.854803 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9wkc" event={"ID":"e04c955f-d97c-4cc9-a01e-3d4d2b59de12","Type":"ContainerStarted","Data":"eff3eb2267c4f90c5a1e23c7ff86ce766d122056aad277ee9005915438bdf634"} Oct 08 22:13:16 crc kubenswrapper[4739]: I1008 22:13:16.882179 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9wkc" event={"ID":"e04c955f-d97c-4cc9-a01e-3d4d2b59de12","Type":"ContainerStarted","Data":"28bc172c443712a41bc80fd1fb4a3dddcb47c6656aff1b22d910534a7a3c9a9b"} Oct 08 22:13:16 crc kubenswrapper[4739]: I1008 22:13:16.906105 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9wkc" podStartSLOduration=2.785522046 podStartE2EDuration="3.906081701s" podCreationTimestamp="2025-10-08 22:13:13 +0000 UTC" firstStartedPulling="2025-10-08 22:13:14.790157113 +0000 UTC m=+1494.615542863" lastFinishedPulling="2025-10-08 22:13:15.910716728 +0000 UTC m=+1495.736102518" observedRunningTime="2025-10-08 
22:13:16.901333825 +0000 UTC m=+1496.726719605" watchObservedRunningTime="2025-10-08 22:13:16.906081701 +0000 UTC m=+1496.731467481" Oct 08 22:13:18 crc kubenswrapper[4739]: I1008 22:13:18.902370 4739 generic.go:334] "Generic (PLEG): container finished" podID="e04c955f-d97c-4cc9-a01e-3d4d2b59de12" containerID="28bc172c443712a41bc80fd1fb4a3dddcb47c6656aff1b22d910534a7a3c9a9b" exitCode=0 Oct 08 22:13:18 crc kubenswrapper[4739]: I1008 22:13:18.902445 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9wkc" event={"ID":"e04c955f-d97c-4cc9-a01e-3d4d2b59de12","Type":"ContainerDied","Data":"28bc172c443712a41bc80fd1fb4a3dddcb47c6656aff1b22d910534a7a3c9a9b"} Oct 08 22:13:20 crc kubenswrapper[4739]: I1008 22:13:20.446420 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9wkc" Oct 08 22:13:20 crc kubenswrapper[4739]: I1008 22:13:20.604840 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e04c955f-d97c-4cc9-a01e-3d4d2b59de12-inventory\") pod \"e04c955f-d97c-4cc9-a01e-3d4d2b59de12\" (UID: \"e04c955f-d97c-4cc9-a01e-3d4d2b59de12\") " Oct 08 22:13:20 crc kubenswrapper[4739]: I1008 22:13:20.605615 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kggn\" (UniqueName: \"kubernetes.io/projected/e04c955f-d97c-4cc9-a01e-3d4d2b59de12-kube-api-access-2kggn\") pod \"e04c955f-d97c-4cc9-a01e-3d4d2b59de12\" (UID: \"e04c955f-d97c-4cc9-a01e-3d4d2b59de12\") " Oct 08 22:13:20 crc kubenswrapper[4739]: I1008 22:13:20.606288 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e04c955f-d97c-4cc9-a01e-3d4d2b59de12-ssh-key\") pod \"e04c955f-d97c-4cc9-a01e-3d4d2b59de12\" (UID: \"e04c955f-d97c-4cc9-a01e-3d4d2b59de12\") " Oct 08 22:13:20 crc 
kubenswrapper[4739]: I1008 22:13:20.613371 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e04c955f-d97c-4cc9-a01e-3d4d2b59de12-kube-api-access-2kggn" (OuterVolumeSpecName: "kube-api-access-2kggn") pod "e04c955f-d97c-4cc9-a01e-3d4d2b59de12" (UID: "e04c955f-d97c-4cc9-a01e-3d4d2b59de12"). InnerVolumeSpecName "kube-api-access-2kggn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:13:20 crc kubenswrapper[4739]: I1008 22:13:20.648425 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e04c955f-d97c-4cc9-a01e-3d4d2b59de12-inventory" (OuterVolumeSpecName: "inventory") pod "e04c955f-d97c-4cc9-a01e-3d4d2b59de12" (UID: "e04c955f-d97c-4cc9-a01e-3d4d2b59de12"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:13:20 crc kubenswrapper[4739]: I1008 22:13:20.656537 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e04c955f-d97c-4cc9-a01e-3d4d2b59de12-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e04c955f-d97c-4cc9-a01e-3d4d2b59de12" (UID: "e04c955f-d97c-4cc9-a01e-3d4d2b59de12"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:13:20 crc kubenswrapper[4739]: I1008 22:13:20.709355 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kggn\" (UniqueName: \"kubernetes.io/projected/e04c955f-d97c-4cc9-a01e-3d4d2b59de12-kube-api-access-2kggn\") on node \"crc\" DevicePath \"\"" Oct 08 22:13:20 crc kubenswrapper[4739]: I1008 22:13:20.709404 4739 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e04c955f-d97c-4cc9-a01e-3d4d2b59de12-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 22:13:20 crc kubenswrapper[4739]: I1008 22:13:20.709425 4739 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e04c955f-d97c-4cc9-a01e-3d4d2b59de12-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 22:13:20 crc kubenswrapper[4739]: I1008 22:13:20.944316 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9wkc" event={"ID":"e04c955f-d97c-4cc9-a01e-3d4d2b59de12","Type":"ContainerDied","Data":"eff3eb2267c4f90c5a1e23c7ff86ce766d122056aad277ee9005915438bdf634"} Oct 08 22:13:20 crc kubenswrapper[4739]: I1008 22:13:20.944384 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eff3eb2267c4f90c5a1e23c7ff86ce766d122056aad277ee9005915438bdf634" Oct 08 22:13:20 crc kubenswrapper[4739]: I1008 22:13:20.944508 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9wkc" Oct 08 22:13:21 crc kubenswrapper[4739]: I1008 22:13:21.039276 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7"] Oct 08 22:13:21 crc kubenswrapper[4739]: E1008 22:13:21.041664 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e04c955f-d97c-4cc9-a01e-3d4d2b59de12" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 08 22:13:21 crc kubenswrapper[4739]: I1008 22:13:21.041689 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="e04c955f-d97c-4cc9-a01e-3d4d2b59de12" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 08 22:13:21 crc kubenswrapper[4739]: I1008 22:13:21.041925 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="e04c955f-d97c-4cc9-a01e-3d4d2b59de12" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 08 22:13:21 crc kubenswrapper[4739]: I1008 22:13:21.045990 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7" Oct 08 22:13:21 crc kubenswrapper[4739]: I1008 22:13:21.049806 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 22:13:21 crc kubenswrapper[4739]: I1008 22:13:21.049943 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 22:13:21 crc kubenswrapper[4739]: I1008 22:13:21.050019 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl5kv" Oct 08 22:13:21 crc kubenswrapper[4739]: I1008 22:13:21.050077 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 22:13:21 crc kubenswrapper[4739]: I1008 22:13:21.058503 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7"] Oct 08 22:13:21 crc kubenswrapper[4739]: I1008 22:13:21.119747 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd1f9e00-5ba4-4aa0-b38c-8610f396af0b-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7\" (UID: \"bd1f9e00-5ba4-4aa0-b38c-8610f396af0b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7" Oct 08 22:13:21 crc kubenswrapper[4739]: I1008 22:13:21.119813 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1f9e00-5ba4-4aa0-b38c-8610f396af0b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7\" (UID: \"bd1f9e00-5ba4-4aa0-b38c-8610f396af0b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7" Oct 08 22:13:21 crc kubenswrapper[4739]: I1008 22:13:21.119836 4739 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpdsv\" (UniqueName: \"kubernetes.io/projected/bd1f9e00-5ba4-4aa0-b38c-8610f396af0b-kube-api-access-kpdsv\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7\" (UID: \"bd1f9e00-5ba4-4aa0-b38c-8610f396af0b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7" Oct 08 22:13:21 crc kubenswrapper[4739]: I1008 22:13:21.119881 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd1f9e00-5ba4-4aa0-b38c-8610f396af0b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7\" (UID: \"bd1f9e00-5ba4-4aa0-b38c-8610f396af0b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7" Oct 08 22:13:21 crc kubenswrapper[4739]: I1008 22:13:21.221644 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpdsv\" (UniqueName: \"kubernetes.io/projected/bd1f9e00-5ba4-4aa0-b38c-8610f396af0b-kube-api-access-kpdsv\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7\" (UID: \"bd1f9e00-5ba4-4aa0-b38c-8610f396af0b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7" Oct 08 22:13:21 crc kubenswrapper[4739]: I1008 22:13:21.221758 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd1f9e00-5ba4-4aa0-b38c-8610f396af0b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7\" (UID: \"bd1f9e00-5ba4-4aa0-b38c-8610f396af0b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7" Oct 08 22:13:21 crc kubenswrapper[4739]: I1008 22:13:21.221929 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd1f9e00-5ba4-4aa0-b38c-8610f396af0b-ssh-key\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7\" (UID: \"bd1f9e00-5ba4-4aa0-b38c-8610f396af0b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7" Oct 08 22:13:21 crc kubenswrapper[4739]: I1008 22:13:21.221969 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1f9e00-5ba4-4aa0-b38c-8610f396af0b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7\" (UID: \"bd1f9e00-5ba4-4aa0-b38c-8610f396af0b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7" Oct 08 22:13:21 crc kubenswrapper[4739]: I1008 22:13:21.227662 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd1f9e00-5ba4-4aa0-b38c-8610f396af0b-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7\" (UID: \"bd1f9e00-5ba4-4aa0-b38c-8610f396af0b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7" Oct 08 22:13:21 crc kubenswrapper[4739]: I1008 22:13:21.228039 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1f9e00-5ba4-4aa0-b38c-8610f396af0b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7\" (UID: \"bd1f9e00-5ba4-4aa0-b38c-8610f396af0b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7" Oct 08 22:13:21 crc kubenswrapper[4739]: I1008 22:13:21.231440 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd1f9e00-5ba4-4aa0-b38c-8610f396af0b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7\" (UID: \"bd1f9e00-5ba4-4aa0-b38c-8610f396af0b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7" Oct 08 22:13:21 crc kubenswrapper[4739]: I1008 22:13:21.254097 4739 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpdsv\" (UniqueName: \"kubernetes.io/projected/bd1f9e00-5ba4-4aa0-b38c-8610f396af0b-kube-api-access-kpdsv\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7\" (UID: \"bd1f9e00-5ba4-4aa0-b38c-8610f396af0b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7" Oct 08 22:13:21 crc kubenswrapper[4739]: I1008 22:13:21.369309 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7" Oct 08 22:13:21 crc kubenswrapper[4739]: I1008 22:13:21.994018 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7"] Oct 08 22:13:22 crc kubenswrapper[4739]: I1008 22:13:22.584878 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 22:13:22 crc kubenswrapper[4739]: I1008 22:13:22.967196 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7" event={"ID":"bd1f9e00-5ba4-4aa0-b38c-8610f396af0b","Type":"ContainerStarted","Data":"c685c74601c88d67752f7157b318779d89753e38f08eb7aee814c1c3b0c8bc65"} Oct 08 22:13:22 crc kubenswrapper[4739]: I1008 22:13:22.967618 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7" event={"ID":"bd1f9e00-5ba4-4aa0-b38c-8610f396af0b","Type":"ContainerStarted","Data":"8a871f9d2b48f4290d7393ad920e68b34ab49cca9c8659058e2cdd9ba9e15ed8"} Oct 08 22:13:22 crc kubenswrapper[4739]: I1008 22:13:22.998331 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7" podStartSLOduration=2.420315592 podStartE2EDuration="2.998311389s" podCreationTimestamp="2025-10-08 22:13:20 +0000 UTC" firstStartedPulling="2025-10-08 22:13:22.003500058 +0000 
UTC m=+1501.828885818" lastFinishedPulling="2025-10-08 22:13:22.581495825 +0000 UTC m=+1502.406881615" observedRunningTime="2025-10-08 22:13:22.991573633 +0000 UTC m=+1502.816959383" watchObservedRunningTime="2025-10-08 22:13:22.998311389 +0000 UTC m=+1502.823697129" Oct 08 22:13:24 crc kubenswrapper[4739]: I1008 22:13:24.739020 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hff6c"] Oct 08 22:13:24 crc kubenswrapper[4739]: I1008 22:13:24.743869 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hff6c" Oct 08 22:13:24 crc kubenswrapper[4739]: I1008 22:13:24.760283 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hff6c"] Oct 08 22:13:24 crc kubenswrapper[4739]: I1008 22:13:24.806425 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56c7q\" (UniqueName: \"kubernetes.io/projected/4b84ef86-2b95-4175-b03f-42cdc7cae209-kube-api-access-56c7q\") pod \"redhat-marketplace-hff6c\" (UID: \"4b84ef86-2b95-4175-b03f-42cdc7cae209\") " pod="openshift-marketplace/redhat-marketplace-hff6c" Oct 08 22:13:24 crc kubenswrapper[4739]: I1008 22:13:24.806526 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b84ef86-2b95-4175-b03f-42cdc7cae209-catalog-content\") pod \"redhat-marketplace-hff6c\" (UID: \"4b84ef86-2b95-4175-b03f-42cdc7cae209\") " pod="openshift-marketplace/redhat-marketplace-hff6c" Oct 08 22:13:24 crc kubenswrapper[4739]: I1008 22:13:24.806665 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b84ef86-2b95-4175-b03f-42cdc7cae209-utilities\") pod \"redhat-marketplace-hff6c\" (UID: \"4b84ef86-2b95-4175-b03f-42cdc7cae209\") " 
pod="openshift-marketplace/redhat-marketplace-hff6c" Oct 08 22:13:24 crc kubenswrapper[4739]: I1008 22:13:24.908870 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56c7q\" (UniqueName: \"kubernetes.io/projected/4b84ef86-2b95-4175-b03f-42cdc7cae209-kube-api-access-56c7q\") pod \"redhat-marketplace-hff6c\" (UID: \"4b84ef86-2b95-4175-b03f-42cdc7cae209\") " pod="openshift-marketplace/redhat-marketplace-hff6c" Oct 08 22:13:24 crc kubenswrapper[4739]: I1008 22:13:24.908991 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b84ef86-2b95-4175-b03f-42cdc7cae209-catalog-content\") pod \"redhat-marketplace-hff6c\" (UID: \"4b84ef86-2b95-4175-b03f-42cdc7cae209\") " pod="openshift-marketplace/redhat-marketplace-hff6c" Oct 08 22:13:24 crc kubenswrapper[4739]: I1008 22:13:24.909125 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b84ef86-2b95-4175-b03f-42cdc7cae209-utilities\") pod \"redhat-marketplace-hff6c\" (UID: \"4b84ef86-2b95-4175-b03f-42cdc7cae209\") " pod="openshift-marketplace/redhat-marketplace-hff6c" Oct 08 22:13:24 crc kubenswrapper[4739]: I1008 22:13:24.909663 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b84ef86-2b95-4175-b03f-42cdc7cae209-catalog-content\") pod \"redhat-marketplace-hff6c\" (UID: \"4b84ef86-2b95-4175-b03f-42cdc7cae209\") " pod="openshift-marketplace/redhat-marketplace-hff6c" Oct 08 22:13:24 crc kubenswrapper[4739]: I1008 22:13:24.909830 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b84ef86-2b95-4175-b03f-42cdc7cae209-utilities\") pod \"redhat-marketplace-hff6c\" (UID: \"4b84ef86-2b95-4175-b03f-42cdc7cae209\") " pod="openshift-marketplace/redhat-marketplace-hff6c" 
Oct 08 22:13:24 crc kubenswrapper[4739]: I1008 22:13:24.948104 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56c7q\" (UniqueName: \"kubernetes.io/projected/4b84ef86-2b95-4175-b03f-42cdc7cae209-kube-api-access-56c7q\") pod \"redhat-marketplace-hff6c\" (UID: \"4b84ef86-2b95-4175-b03f-42cdc7cae209\") " pod="openshift-marketplace/redhat-marketplace-hff6c" Oct 08 22:13:25 crc kubenswrapper[4739]: I1008 22:13:25.089493 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hff6c" Oct 08 22:13:25 crc kubenswrapper[4739]: I1008 22:13:25.633267 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hff6c"] Oct 08 22:13:26 crc kubenswrapper[4739]: I1008 22:13:26.021491 4739 generic.go:334] "Generic (PLEG): container finished" podID="4b84ef86-2b95-4175-b03f-42cdc7cae209" containerID="e68bd7df42cc92c0c8595a1dfad48ac50a0509bdb0204ccdb5a3e62ff93725db" exitCode=0 Oct 08 22:13:26 crc kubenswrapper[4739]: I1008 22:13:26.021557 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hff6c" event={"ID":"4b84ef86-2b95-4175-b03f-42cdc7cae209","Type":"ContainerDied","Data":"e68bd7df42cc92c0c8595a1dfad48ac50a0509bdb0204ccdb5a3e62ff93725db"} Oct 08 22:13:26 crc kubenswrapper[4739]: I1008 22:13:26.021619 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hff6c" event={"ID":"4b84ef86-2b95-4175-b03f-42cdc7cae209","Type":"ContainerStarted","Data":"08fac9f969323e23a87b49a73264bafb2252fb893d9fd137008413ecbfc447ba"} Oct 08 22:13:27 crc kubenswrapper[4739]: I1008 22:13:27.031887 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hff6c" event={"ID":"4b84ef86-2b95-4175-b03f-42cdc7cae209","Type":"ContainerStarted","Data":"9c3895b0905242c348ac031191575be30aecc2b518454cc736b968cbdfb17bfe"} Oct 08 
22:13:28 crc kubenswrapper[4739]: I1008 22:13:28.043035 4739 generic.go:334] "Generic (PLEG): container finished" podID="4b84ef86-2b95-4175-b03f-42cdc7cae209" containerID="9c3895b0905242c348ac031191575be30aecc2b518454cc736b968cbdfb17bfe" exitCode=0 Oct 08 22:13:28 crc kubenswrapper[4739]: I1008 22:13:28.043143 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hff6c" event={"ID":"4b84ef86-2b95-4175-b03f-42cdc7cae209","Type":"ContainerDied","Data":"9c3895b0905242c348ac031191575be30aecc2b518454cc736b968cbdfb17bfe"} Oct 08 22:13:29 crc kubenswrapper[4739]: I1008 22:13:29.055291 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hff6c" event={"ID":"4b84ef86-2b95-4175-b03f-42cdc7cae209","Type":"ContainerStarted","Data":"061c29b340179024cfe7ae8ef430528346a8fc43b1e3cdee8203efa73d0d4cda"} Oct 08 22:13:29 crc kubenswrapper[4739]: I1008 22:13:29.084789 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hff6c" podStartSLOduration=2.62039219 podStartE2EDuration="5.084762855s" podCreationTimestamp="2025-10-08 22:13:24 +0000 UTC" firstStartedPulling="2025-10-08 22:13:26.025334644 +0000 UTC m=+1505.850720424" lastFinishedPulling="2025-10-08 22:13:28.489705339 +0000 UTC m=+1508.315091089" observedRunningTime="2025-10-08 22:13:29.078863089 +0000 UTC m=+1508.904248849" watchObservedRunningTime="2025-10-08 22:13:29.084762855 +0000 UTC m=+1508.910148605" Oct 08 22:13:29 crc kubenswrapper[4739]: I1008 22:13:29.738269 4739 scope.go:117] "RemoveContainer" containerID="ab9a08a5cffaaaba12c20ad815ccf4ef630c59c414d9b5cd65dc8ce869144e59" Oct 08 22:13:35 crc kubenswrapper[4739]: I1008 22:13:35.090405 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hff6c" Oct 08 22:13:35 crc kubenswrapper[4739]: I1008 22:13:35.092492 4739 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hff6c" Oct 08 22:13:35 crc kubenswrapper[4739]: I1008 22:13:35.143586 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hff6c" Oct 08 22:13:36 crc kubenswrapper[4739]: I1008 22:13:36.179597 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hff6c" Oct 08 22:13:36 crc kubenswrapper[4739]: I1008 22:13:36.234338 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hff6c"] Oct 08 22:13:38 crc kubenswrapper[4739]: I1008 22:13:38.150967 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hff6c" podUID="4b84ef86-2b95-4175-b03f-42cdc7cae209" containerName="registry-server" containerID="cri-o://061c29b340179024cfe7ae8ef430528346a8fc43b1e3cdee8203efa73d0d4cda" gracePeriod=2 Oct 08 22:13:38 crc kubenswrapper[4739]: I1008 22:13:38.663397 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hff6c" Oct 08 22:13:38 crc kubenswrapper[4739]: I1008 22:13:38.745960 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b84ef86-2b95-4175-b03f-42cdc7cae209-utilities\") pod \"4b84ef86-2b95-4175-b03f-42cdc7cae209\" (UID: \"4b84ef86-2b95-4175-b03f-42cdc7cae209\") " Oct 08 22:13:38 crc kubenswrapper[4739]: I1008 22:13:38.746109 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b84ef86-2b95-4175-b03f-42cdc7cae209-catalog-content\") pod \"4b84ef86-2b95-4175-b03f-42cdc7cae209\" (UID: \"4b84ef86-2b95-4175-b03f-42cdc7cae209\") " Oct 08 22:13:38 crc kubenswrapper[4739]: I1008 22:13:38.746284 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56c7q\" (UniqueName: \"kubernetes.io/projected/4b84ef86-2b95-4175-b03f-42cdc7cae209-kube-api-access-56c7q\") pod \"4b84ef86-2b95-4175-b03f-42cdc7cae209\" (UID: \"4b84ef86-2b95-4175-b03f-42cdc7cae209\") " Oct 08 22:13:38 crc kubenswrapper[4739]: I1008 22:13:38.747036 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b84ef86-2b95-4175-b03f-42cdc7cae209-utilities" (OuterVolumeSpecName: "utilities") pod "4b84ef86-2b95-4175-b03f-42cdc7cae209" (UID: "4b84ef86-2b95-4175-b03f-42cdc7cae209"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:13:38 crc kubenswrapper[4739]: I1008 22:13:38.754642 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b84ef86-2b95-4175-b03f-42cdc7cae209-kube-api-access-56c7q" (OuterVolumeSpecName: "kube-api-access-56c7q") pod "4b84ef86-2b95-4175-b03f-42cdc7cae209" (UID: "4b84ef86-2b95-4175-b03f-42cdc7cae209"). InnerVolumeSpecName "kube-api-access-56c7q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:13:38 crc kubenswrapper[4739]: I1008 22:13:38.764639 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b84ef86-2b95-4175-b03f-42cdc7cae209-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b84ef86-2b95-4175-b03f-42cdc7cae209" (UID: "4b84ef86-2b95-4175-b03f-42cdc7cae209"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:13:38 crc kubenswrapper[4739]: I1008 22:13:38.849287 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56c7q\" (UniqueName: \"kubernetes.io/projected/4b84ef86-2b95-4175-b03f-42cdc7cae209-kube-api-access-56c7q\") on node \"crc\" DevicePath \"\"" Oct 08 22:13:38 crc kubenswrapper[4739]: I1008 22:13:38.849331 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b84ef86-2b95-4175-b03f-42cdc7cae209-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:13:38 crc kubenswrapper[4739]: I1008 22:13:38.849345 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b84ef86-2b95-4175-b03f-42cdc7cae209-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:13:39 crc kubenswrapper[4739]: I1008 22:13:39.179546 4739 generic.go:334] "Generic (PLEG): container finished" podID="4b84ef86-2b95-4175-b03f-42cdc7cae209" containerID="061c29b340179024cfe7ae8ef430528346a8fc43b1e3cdee8203efa73d0d4cda" exitCode=0 Oct 08 22:13:39 crc kubenswrapper[4739]: I1008 22:13:39.179629 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hff6c" Oct 08 22:13:39 crc kubenswrapper[4739]: I1008 22:13:39.179625 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hff6c" event={"ID":"4b84ef86-2b95-4175-b03f-42cdc7cae209","Type":"ContainerDied","Data":"061c29b340179024cfe7ae8ef430528346a8fc43b1e3cdee8203efa73d0d4cda"} Oct 08 22:13:39 crc kubenswrapper[4739]: I1008 22:13:39.179782 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hff6c" event={"ID":"4b84ef86-2b95-4175-b03f-42cdc7cae209","Type":"ContainerDied","Data":"08fac9f969323e23a87b49a73264bafb2252fb893d9fd137008413ecbfc447ba"} Oct 08 22:13:39 crc kubenswrapper[4739]: I1008 22:13:39.179807 4739 scope.go:117] "RemoveContainer" containerID="061c29b340179024cfe7ae8ef430528346a8fc43b1e3cdee8203efa73d0d4cda" Oct 08 22:13:39 crc kubenswrapper[4739]: I1008 22:13:39.231064 4739 scope.go:117] "RemoveContainer" containerID="9c3895b0905242c348ac031191575be30aecc2b518454cc736b968cbdfb17bfe" Oct 08 22:13:39 crc kubenswrapper[4739]: I1008 22:13:39.236606 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hff6c"] Oct 08 22:13:39 crc kubenswrapper[4739]: I1008 22:13:39.249283 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hff6c"] Oct 08 22:13:39 crc kubenswrapper[4739]: I1008 22:13:39.271448 4739 scope.go:117] "RemoveContainer" containerID="e68bd7df42cc92c0c8595a1dfad48ac50a0509bdb0204ccdb5a3e62ff93725db" Oct 08 22:13:39 crc kubenswrapper[4739]: I1008 22:13:39.301298 4739 scope.go:117] "RemoveContainer" containerID="061c29b340179024cfe7ae8ef430528346a8fc43b1e3cdee8203efa73d0d4cda" Oct 08 22:13:39 crc kubenswrapper[4739]: E1008 22:13:39.301960 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"061c29b340179024cfe7ae8ef430528346a8fc43b1e3cdee8203efa73d0d4cda\": container with ID starting with 061c29b340179024cfe7ae8ef430528346a8fc43b1e3cdee8203efa73d0d4cda not found: ID does not exist" containerID="061c29b340179024cfe7ae8ef430528346a8fc43b1e3cdee8203efa73d0d4cda" Oct 08 22:13:39 crc kubenswrapper[4739]: I1008 22:13:39.302098 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"061c29b340179024cfe7ae8ef430528346a8fc43b1e3cdee8203efa73d0d4cda"} err="failed to get container status \"061c29b340179024cfe7ae8ef430528346a8fc43b1e3cdee8203efa73d0d4cda\": rpc error: code = NotFound desc = could not find container \"061c29b340179024cfe7ae8ef430528346a8fc43b1e3cdee8203efa73d0d4cda\": container with ID starting with 061c29b340179024cfe7ae8ef430528346a8fc43b1e3cdee8203efa73d0d4cda not found: ID does not exist" Oct 08 22:13:39 crc kubenswrapper[4739]: I1008 22:13:39.302219 4739 scope.go:117] "RemoveContainer" containerID="9c3895b0905242c348ac031191575be30aecc2b518454cc736b968cbdfb17bfe" Oct 08 22:13:39 crc kubenswrapper[4739]: E1008 22:13:39.302717 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c3895b0905242c348ac031191575be30aecc2b518454cc736b968cbdfb17bfe\": container with ID starting with 9c3895b0905242c348ac031191575be30aecc2b518454cc736b968cbdfb17bfe not found: ID does not exist" containerID="9c3895b0905242c348ac031191575be30aecc2b518454cc736b968cbdfb17bfe" Oct 08 22:13:39 crc kubenswrapper[4739]: I1008 22:13:39.302780 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c3895b0905242c348ac031191575be30aecc2b518454cc736b968cbdfb17bfe"} err="failed to get container status \"9c3895b0905242c348ac031191575be30aecc2b518454cc736b968cbdfb17bfe\": rpc error: code = NotFound desc = could not find container \"9c3895b0905242c348ac031191575be30aecc2b518454cc736b968cbdfb17bfe\": container with ID 
starting with 9c3895b0905242c348ac031191575be30aecc2b518454cc736b968cbdfb17bfe not found: ID does not exist" Oct 08 22:13:39 crc kubenswrapper[4739]: I1008 22:13:39.302814 4739 scope.go:117] "RemoveContainer" containerID="e68bd7df42cc92c0c8595a1dfad48ac50a0509bdb0204ccdb5a3e62ff93725db" Oct 08 22:13:39 crc kubenswrapper[4739]: E1008 22:13:39.303530 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e68bd7df42cc92c0c8595a1dfad48ac50a0509bdb0204ccdb5a3e62ff93725db\": container with ID starting with e68bd7df42cc92c0c8595a1dfad48ac50a0509bdb0204ccdb5a3e62ff93725db not found: ID does not exist" containerID="e68bd7df42cc92c0c8595a1dfad48ac50a0509bdb0204ccdb5a3e62ff93725db" Oct 08 22:13:39 crc kubenswrapper[4739]: I1008 22:13:39.303591 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e68bd7df42cc92c0c8595a1dfad48ac50a0509bdb0204ccdb5a3e62ff93725db"} err="failed to get container status \"e68bd7df42cc92c0c8595a1dfad48ac50a0509bdb0204ccdb5a3e62ff93725db\": rpc error: code = NotFound desc = could not find container \"e68bd7df42cc92c0c8595a1dfad48ac50a0509bdb0204ccdb5a3e62ff93725db\": container with ID starting with e68bd7df42cc92c0c8595a1dfad48ac50a0509bdb0204ccdb5a3e62ff93725db not found: ID does not exist" Oct 08 22:13:39 crc kubenswrapper[4739]: E1008 22:13:39.394450 4739 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b84ef86_2b95_4175_b03f_42cdc7cae209.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b84ef86_2b95_4175_b03f_42cdc7cae209.slice/crio-08fac9f969323e23a87b49a73264bafb2252fb893d9fd137008413ecbfc447ba\": RecentStats: unable to find data in memory cache]" Oct 08 22:13:39 crc kubenswrapper[4739]: I1008 22:13:39.838137 4739 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b84ef86-2b95-4175-b03f-42cdc7cae209" path="/var/lib/kubelet/pods/4b84ef86-2b95-4175-b03f-42cdc7cae209/volumes" Oct 08 22:14:29 crc kubenswrapper[4739]: I1008 22:14:29.861807 4739 scope.go:117] "RemoveContainer" containerID="33ba149770811cf8f6d0b06f8ac779b8ec2327cefbd40a418b21fc7450b41ede" Oct 08 22:15:00 crc kubenswrapper[4739]: I1008 22:15:00.171790 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332695-7wxxq"] Oct 08 22:15:00 crc kubenswrapper[4739]: E1008 22:15:00.173322 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b84ef86-2b95-4175-b03f-42cdc7cae209" containerName="extract-content" Oct 08 22:15:00 crc kubenswrapper[4739]: I1008 22:15:00.173347 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b84ef86-2b95-4175-b03f-42cdc7cae209" containerName="extract-content" Oct 08 22:15:00 crc kubenswrapper[4739]: E1008 22:15:00.173384 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b84ef86-2b95-4175-b03f-42cdc7cae209" containerName="extract-utilities" Oct 08 22:15:00 crc kubenswrapper[4739]: I1008 22:15:00.173398 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b84ef86-2b95-4175-b03f-42cdc7cae209" containerName="extract-utilities" Oct 08 22:15:00 crc kubenswrapper[4739]: E1008 22:15:00.173437 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b84ef86-2b95-4175-b03f-42cdc7cae209" containerName="registry-server" Oct 08 22:15:00 crc kubenswrapper[4739]: I1008 22:15:00.173451 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b84ef86-2b95-4175-b03f-42cdc7cae209" containerName="registry-server" Oct 08 22:15:00 crc kubenswrapper[4739]: I1008 22:15:00.173823 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b84ef86-2b95-4175-b03f-42cdc7cae209" containerName="registry-server" Oct 08 22:15:00 crc kubenswrapper[4739]: I1008 
22:15:00.175116 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-7wxxq" Oct 08 22:15:00 crc kubenswrapper[4739]: I1008 22:15:00.181433 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 08 22:15:00 crc kubenswrapper[4739]: I1008 22:15:00.181753 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 08 22:15:00 crc kubenswrapper[4739]: I1008 22:15:00.187691 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332695-7wxxq"] Oct 08 22:15:00 crc kubenswrapper[4739]: I1008 22:15:00.307877 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c83ca3a-410d-4587-a721-2642fa984c8b-config-volume\") pod \"collect-profiles-29332695-7wxxq\" (UID: \"2c83ca3a-410d-4587-a721-2642fa984c8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-7wxxq" Oct 08 22:15:00 crc kubenswrapper[4739]: I1008 22:15:00.308017 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htvqf\" (UniqueName: \"kubernetes.io/projected/2c83ca3a-410d-4587-a721-2642fa984c8b-kube-api-access-htvqf\") pod \"collect-profiles-29332695-7wxxq\" (UID: \"2c83ca3a-410d-4587-a721-2642fa984c8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-7wxxq" Oct 08 22:15:00 crc kubenswrapper[4739]: I1008 22:15:00.308097 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c83ca3a-410d-4587-a721-2642fa984c8b-secret-volume\") pod \"collect-profiles-29332695-7wxxq\" (UID: 
\"2c83ca3a-410d-4587-a721-2642fa984c8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-7wxxq" Oct 08 22:15:00 crc kubenswrapper[4739]: I1008 22:15:00.409778 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c83ca3a-410d-4587-a721-2642fa984c8b-config-volume\") pod \"collect-profiles-29332695-7wxxq\" (UID: \"2c83ca3a-410d-4587-a721-2642fa984c8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-7wxxq" Oct 08 22:15:00 crc kubenswrapper[4739]: I1008 22:15:00.409924 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htvqf\" (UniqueName: \"kubernetes.io/projected/2c83ca3a-410d-4587-a721-2642fa984c8b-kube-api-access-htvqf\") pod \"collect-profiles-29332695-7wxxq\" (UID: \"2c83ca3a-410d-4587-a721-2642fa984c8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-7wxxq" Oct 08 22:15:00 crc kubenswrapper[4739]: I1008 22:15:00.409983 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c83ca3a-410d-4587-a721-2642fa984c8b-secret-volume\") pod \"collect-profiles-29332695-7wxxq\" (UID: \"2c83ca3a-410d-4587-a721-2642fa984c8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-7wxxq" Oct 08 22:15:00 crc kubenswrapper[4739]: I1008 22:15:00.410866 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c83ca3a-410d-4587-a721-2642fa984c8b-config-volume\") pod \"collect-profiles-29332695-7wxxq\" (UID: \"2c83ca3a-410d-4587-a721-2642fa984c8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-7wxxq" Oct 08 22:15:00 crc kubenswrapper[4739]: I1008 22:15:00.416659 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/2c83ca3a-410d-4587-a721-2642fa984c8b-secret-volume\") pod \"collect-profiles-29332695-7wxxq\" (UID: \"2c83ca3a-410d-4587-a721-2642fa984c8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-7wxxq" Oct 08 22:15:00 crc kubenswrapper[4739]: I1008 22:15:00.435552 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htvqf\" (UniqueName: \"kubernetes.io/projected/2c83ca3a-410d-4587-a721-2642fa984c8b-kube-api-access-htvqf\") pod \"collect-profiles-29332695-7wxxq\" (UID: \"2c83ca3a-410d-4587-a721-2642fa984c8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-7wxxq" Oct 08 22:15:00 crc kubenswrapper[4739]: I1008 22:15:00.516918 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-7wxxq" Oct 08 22:15:01 crc kubenswrapper[4739]: I1008 22:15:01.053412 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332695-7wxxq"] Oct 08 22:15:01 crc kubenswrapper[4739]: I1008 22:15:01.210741 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-7wxxq" event={"ID":"2c83ca3a-410d-4587-a721-2642fa984c8b","Type":"ContainerStarted","Data":"cf1911ac636c15dae5d56a6b5855f7dd5fe1f9974c7b5761893fa5675b0f6705"} Oct 08 22:15:01 crc kubenswrapper[4739]: E1008 22:15:01.699757 4739 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c83ca3a_410d_4587_a721_2642fa984c8b.slice/crio-conmon-26c8a09d0824092f6762f38092a887c1d0cdbfc55c5fa6bf5c8f8679e3431ac6.scope\": RecentStats: unable to find data in memory cache]" Oct 08 22:15:02 crc kubenswrapper[4739]: I1008 22:15:02.221000 4739 generic.go:334] "Generic (PLEG): container finished" 
podID="2c83ca3a-410d-4587-a721-2642fa984c8b" containerID="26c8a09d0824092f6762f38092a887c1d0cdbfc55c5fa6bf5c8f8679e3431ac6" exitCode=0 Oct 08 22:15:02 crc kubenswrapper[4739]: I1008 22:15:02.221228 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-7wxxq" event={"ID":"2c83ca3a-410d-4587-a721-2642fa984c8b","Type":"ContainerDied","Data":"26c8a09d0824092f6762f38092a887c1d0cdbfc55c5fa6bf5c8f8679e3431ac6"} Oct 08 22:15:03 crc kubenswrapper[4739]: I1008 22:15:03.696200 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-7wxxq" Oct 08 22:15:03 crc kubenswrapper[4739]: I1008 22:15:03.881815 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c83ca3a-410d-4587-a721-2642fa984c8b-config-volume\") pod \"2c83ca3a-410d-4587-a721-2642fa984c8b\" (UID: \"2c83ca3a-410d-4587-a721-2642fa984c8b\") " Oct 08 22:15:03 crc kubenswrapper[4739]: I1008 22:15:03.881943 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htvqf\" (UniqueName: \"kubernetes.io/projected/2c83ca3a-410d-4587-a721-2642fa984c8b-kube-api-access-htvqf\") pod \"2c83ca3a-410d-4587-a721-2642fa984c8b\" (UID: \"2c83ca3a-410d-4587-a721-2642fa984c8b\") " Oct 08 22:15:03 crc kubenswrapper[4739]: I1008 22:15:03.882167 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c83ca3a-410d-4587-a721-2642fa984c8b-secret-volume\") pod \"2c83ca3a-410d-4587-a721-2642fa984c8b\" (UID: \"2c83ca3a-410d-4587-a721-2642fa984c8b\") " Oct 08 22:15:03 crc kubenswrapper[4739]: I1008 22:15:03.882762 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c83ca3a-410d-4587-a721-2642fa984c8b-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "2c83ca3a-410d-4587-a721-2642fa984c8b" (UID: "2c83ca3a-410d-4587-a721-2642fa984c8b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:15:03 crc kubenswrapper[4739]: I1008 22:15:03.888967 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c83ca3a-410d-4587-a721-2642fa984c8b-kube-api-access-htvqf" (OuterVolumeSpecName: "kube-api-access-htvqf") pod "2c83ca3a-410d-4587-a721-2642fa984c8b" (UID: "2c83ca3a-410d-4587-a721-2642fa984c8b"). InnerVolumeSpecName "kube-api-access-htvqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:15:03 crc kubenswrapper[4739]: I1008 22:15:03.891293 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c83ca3a-410d-4587-a721-2642fa984c8b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2c83ca3a-410d-4587-a721-2642fa984c8b" (UID: "2c83ca3a-410d-4587-a721-2642fa984c8b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:15:03 crc kubenswrapper[4739]: I1008 22:15:03.984136 4739 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c83ca3a-410d-4587-a721-2642fa984c8b-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 22:15:03 crc kubenswrapper[4739]: I1008 22:15:03.984190 4739 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c83ca3a-410d-4587-a721-2642fa984c8b-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 22:15:03 crc kubenswrapper[4739]: I1008 22:15:03.984199 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htvqf\" (UniqueName: \"kubernetes.io/projected/2c83ca3a-410d-4587-a721-2642fa984c8b-kube-api-access-htvqf\") on node \"crc\" DevicePath \"\"" Oct 08 22:15:04 crc kubenswrapper[4739]: I1008 22:15:04.245775 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-7wxxq" event={"ID":"2c83ca3a-410d-4587-a721-2642fa984c8b","Type":"ContainerDied","Data":"cf1911ac636c15dae5d56a6b5855f7dd5fe1f9974c7b5761893fa5675b0f6705"} Oct 08 22:15:04 crc kubenswrapper[4739]: I1008 22:15:04.245818 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf1911ac636c15dae5d56a6b5855f7dd5fe1f9974c7b5761893fa5675b0f6705" Oct 08 22:15:04 crc kubenswrapper[4739]: I1008 22:15:04.245857 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-7wxxq" Oct 08 22:15:18 crc kubenswrapper[4739]: I1008 22:15:18.435658 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kk7w6"] Oct 08 22:15:18 crc kubenswrapper[4739]: E1008 22:15:18.438743 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c83ca3a-410d-4587-a721-2642fa984c8b" containerName="collect-profiles" Oct 08 22:15:18 crc kubenswrapper[4739]: I1008 22:15:18.438849 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c83ca3a-410d-4587-a721-2642fa984c8b" containerName="collect-profiles" Oct 08 22:15:18 crc kubenswrapper[4739]: I1008 22:15:18.439210 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c83ca3a-410d-4587-a721-2642fa984c8b" containerName="collect-profiles" Oct 08 22:15:18 crc kubenswrapper[4739]: I1008 22:15:18.441415 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kk7w6" Oct 08 22:15:18 crc kubenswrapper[4739]: I1008 22:15:18.476433 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kk7w6"] Oct 08 22:15:18 crc kubenswrapper[4739]: I1008 22:15:18.523930 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7jcd\" (UniqueName: \"kubernetes.io/projected/3156d56b-ba07-438c-91a5-21d0cd0848d5-kube-api-access-m7jcd\") pod \"community-operators-kk7w6\" (UID: \"3156d56b-ba07-438c-91a5-21d0cd0848d5\") " pod="openshift-marketplace/community-operators-kk7w6" Oct 08 22:15:18 crc kubenswrapper[4739]: I1008 22:15:18.524066 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3156d56b-ba07-438c-91a5-21d0cd0848d5-catalog-content\") pod \"community-operators-kk7w6\" (UID: 
\"3156d56b-ba07-438c-91a5-21d0cd0848d5\") " pod="openshift-marketplace/community-operators-kk7w6" Oct 08 22:15:18 crc kubenswrapper[4739]: I1008 22:15:18.524648 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3156d56b-ba07-438c-91a5-21d0cd0848d5-utilities\") pod \"community-operators-kk7w6\" (UID: \"3156d56b-ba07-438c-91a5-21d0cd0848d5\") " pod="openshift-marketplace/community-operators-kk7w6" Oct 08 22:15:18 crc kubenswrapper[4739]: I1008 22:15:18.627335 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3156d56b-ba07-438c-91a5-21d0cd0848d5-utilities\") pod \"community-operators-kk7w6\" (UID: \"3156d56b-ba07-438c-91a5-21d0cd0848d5\") " pod="openshift-marketplace/community-operators-kk7w6" Oct 08 22:15:18 crc kubenswrapper[4739]: I1008 22:15:18.627477 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7jcd\" (UniqueName: \"kubernetes.io/projected/3156d56b-ba07-438c-91a5-21d0cd0848d5-kube-api-access-m7jcd\") pod \"community-operators-kk7w6\" (UID: \"3156d56b-ba07-438c-91a5-21d0cd0848d5\") " pod="openshift-marketplace/community-operators-kk7w6" Oct 08 22:15:18 crc kubenswrapper[4739]: I1008 22:15:18.627521 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3156d56b-ba07-438c-91a5-21d0cd0848d5-catalog-content\") pod \"community-operators-kk7w6\" (UID: \"3156d56b-ba07-438c-91a5-21d0cd0848d5\") " pod="openshift-marketplace/community-operators-kk7w6" Oct 08 22:15:18 crc kubenswrapper[4739]: I1008 22:15:18.629419 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3156d56b-ba07-438c-91a5-21d0cd0848d5-utilities\") pod \"community-operators-kk7w6\" (UID: 
\"3156d56b-ba07-438c-91a5-21d0cd0848d5\") " pod="openshift-marketplace/community-operators-kk7w6" Oct 08 22:15:18 crc kubenswrapper[4739]: I1008 22:15:18.629880 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3156d56b-ba07-438c-91a5-21d0cd0848d5-catalog-content\") pod \"community-operators-kk7w6\" (UID: \"3156d56b-ba07-438c-91a5-21d0cd0848d5\") " pod="openshift-marketplace/community-operators-kk7w6" Oct 08 22:15:18 crc kubenswrapper[4739]: I1008 22:15:18.652905 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7jcd\" (UniqueName: \"kubernetes.io/projected/3156d56b-ba07-438c-91a5-21d0cd0848d5-kube-api-access-m7jcd\") pod \"community-operators-kk7w6\" (UID: \"3156d56b-ba07-438c-91a5-21d0cd0848d5\") " pod="openshift-marketplace/community-operators-kk7w6" Oct 08 22:15:18 crc kubenswrapper[4739]: I1008 22:15:18.784233 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kk7w6" Oct 08 22:15:19 crc kubenswrapper[4739]: I1008 22:15:19.353236 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kk7w6"] Oct 08 22:15:19 crc kubenswrapper[4739]: I1008 22:15:19.433765 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk7w6" event={"ID":"3156d56b-ba07-438c-91a5-21d0cd0848d5","Type":"ContainerStarted","Data":"c788c248e0566517f7988d437735d7d4c95d69f99df38ac44dd733866b6c629c"} Oct 08 22:15:20 crc kubenswrapper[4739]: I1008 22:15:20.448454 4739 generic.go:334] "Generic (PLEG): container finished" podID="3156d56b-ba07-438c-91a5-21d0cd0848d5" containerID="85ed0051b4de951cb04f835d39b07e474dfc25c5fc746bc4b38460905ce0b33d" exitCode=0 Oct 08 22:15:20 crc kubenswrapper[4739]: I1008 22:15:20.448646 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk7w6" event={"ID":"3156d56b-ba07-438c-91a5-21d0cd0848d5","Type":"ContainerDied","Data":"85ed0051b4de951cb04f835d39b07e474dfc25c5fc746bc4b38460905ce0b33d"} Oct 08 22:15:20 crc kubenswrapper[4739]: I1008 22:15:20.452282 4739 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 22:15:21 crc kubenswrapper[4739]: I1008 22:15:21.766002 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:15:21 crc kubenswrapper[4739]: I1008 22:15:21.767244 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:15:22 crc kubenswrapper[4739]: I1008 22:15:22.483441 4739 generic.go:334] "Generic (PLEG): container finished" podID="3156d56b-ba07-438c-91a5-21d0cd0848d5" containerID="b2792b21b6cd78855d132e499299370cdd6fcac60b8b43ff6cb2af6affe03fe3" exitCode=0 Oct 08 22:15:22 crc kubenswrapper[4739]: I1008 22:15:22.483598 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk7w6" event={"ID":"3156d56b-ba07-438c-91a5-21d0cd0848d5","Type":"ContainerDied","Data":"b2792b21b6cd78855d132e499299370cdd6fcac60b8b43ff6cb2af6affe03fe3"} Oct 08 22:15:24 crc kubenswrapper[4739]: I1008 22:15:24.510863 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk7w6" event={"ID":"3156d56b-ba07-438c-91a5-21d0cd0848d5","Type":"ContainerStarted","Data":"8e2aea17560fff6350f694e7dfb33b45141db8493dd4445c9c1f4c96ec618786"} Oct 08 22:15:24 crc kubenswrapper[4739]: I1008 22:15:24.555333 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kk7w6" podStartSLOduration=3.524471353 podStartE2EDuration="6.555292082s" podCreationTimestamp="2025-10-08 22:15:18 +0000 UTC" firstStartedPulling="2025-10-08 22:15:20.451842052 +0000 UTC m=+1620.277227802" lastFinishedPulling="2025-10-08 22:15:23.482662771 +0000 UTC m=+1623.308048531" observedRunningTime="2025-10-08 22:15:24.536428588 +0000 UTC m=+1624.361814328" watchObservedRunningTime="2025-10-08 22:15:24.555292082 +0000 UTC m=+1624.380677862" Oct 08 22:15:28 crc kubenswrapper[4739]: I1008 22:15:28.784554 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kk7w6" Oct 08 22:15:28 crc kubenswrapper[4739]: I1008 22:15:28.785225 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-kk7w6" Oct 08 22:15:28 crc kubenswrapper[4739]: I1008 22:15:28.871669 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kk7w6" Oct 08 22:15:29 crc kubenswrapper[4739]: I1008 22:15:29.663251 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kk7w6" Oct 08 22:15:29 crc kubenswrapper[4739]: I1008 22:15:29.732839 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kk7w6"] Oct 08 22:15:31 crc kubenswrapper[4739]: I1008 22:15:31.600363 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kk7w6" podUID="3156d56b-ba07-438c-91a5-21d0cd0848d5" containerName="registry-server" containerID="cri-o://8e2aea17560fff6350f694e7dfb33b45141db8493dd4445c9c1f4c96ec618786" gracePeriod=2 Oct 08 22:15:32 crc kubenswrapper[4739]: I1008 22:15:32.115100 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kk7w6" Oct 08 22:15:32 crc kubenswrapper[4739]: I1008 22:15:32.189706 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3156d56b-ba07-438c-91a5-21d0cd0848d5-utilities\") pod \"3156d56b-ba07-438c-91a5-21d0cd0848d5\" (UID: \"3156d56b-ba07-438c-91a5-21d0cd0848d5\") " Oct 08 22:15:32 crc kubenswrapper[4739]: I1008 22:15:32.189784 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7jcd\" (UniqueName: \"kubernetes.io/projected/3156d56b-ba07-438c-91a5-21d0cd0848d5-kube-api-access-m7jcd\") pod \"3156d56b-ba07-438c-91a5-21d0cd0848d5\" (UID: \"3156d56b-ba07-438c-91a5-21d0cd0848d5\") " Oct 08 22:15:32 crc kubenswrapper[4739]: I1008 22:15:32.189985 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3156d56b-ba07-438c-91a5-21d0cd0848d5-catalog-content\") pod \"3156d56b-ba07-438c-91a5-21d0cd0848d5\" (UID: \"3156d56b-ba07-438c-91a5-21d0cd0848d5\") " Oct 08 22:15:32 crc kubenswrapper[4739]: I1008 22:15:32.190546 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3156d56b-ba07-438c-91a5-21d0cd0848d5-utilities" (OuterVolumeSpecName: "utilities") pod "3156d56b-ba07-438c-91a5-21d0cd0848d5" (UID: "3156d56b-ba07-438c-91a5-21d0cd0848d5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:15:32 crc kubenswrapper[4739]: I1008 22:15:32.200384 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3156d56b-ba07-438c-91a5-21d0cd0848d5-kube-api-access-m7jcd" (OuterVolumeSpecName: "kube-api-access-m7jcd") pod "3156d56b-ba07-438c-91a5-21d0cd0848d5" (UID: "3156d56b-ba07-438c-91a5-21d0cd0848d5"). InnerVolumeSpecName "kube-api-access-m7jcd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:15:32 crc kubenswrapper[4739]: I1008 22:15:32.239975 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3156d56b-ba07-438c-91a5-21d0cd0848d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3156d56b-ba07-438c-91a5-21d0cd0848d5" (UID: "3156d56b-ba07-438c-91a5-21d0cd0848d5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:15:32 crc kubenswrapper[4739]: I1008 22:15:32.292412 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7jcd\" (UniqueName: \"kubernetes.io/projected/3156d56b-ba07-438c-91a5-21d0cd0848d5-kube-api-access-m7jcd\") on node \"crc\" DevicePath \"\"" Oct 08 22:15:32 crc kubenswrapper[4739]: I1008 22:15:32.292733 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3156d56b-ba07-438c-91a5-21d0cd0848d5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:15:32 crc kubenswrapper[4739]: I1008 22:15:32.292743 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3156d56b-ba07-438c-91a5-21d0cd0848d5-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:15:32 crc kubenswrapper[4739]: I1008 22:15:32.617486 4739 generic.go:334] "Generic (PLEG): container finished" podID="3156d56b-ba07-438c-91a5-21d0cd0848d5" containerID="8e2aea17560fff6350f694e7dfb33b45141db8493dd4445c9c1f4c96ec618786" exitCode=0 Oct 08 22:15:32 crc kubenswrapper[4739]: I1008 22:15:32.617539 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk7w6" event={"ID":"3156d56b-ba07-438c-91a5-21d0cd0848d5","Type":"ContainerDied","Data":"8e2aea17560fff6350f694e7dfb33b45141db8493dd4445c9c1f4c96ec618786"} Oct 08 22:15:32 crc kubenswrapper[4739]: I1008 22:15:32.617570 4739 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-kk7w6" Oct 08 22:15:32 crc kubenswrapper[4739]: I1008 22:15:32.617611 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk7w6" event={"ID":"3156d56b-ba07-438c-91a5-21d0cd0848d5","Type":"ContainerDied","Data":"c788c248e0566517f7988d437735d7d4c95d69f99df38ac44dd733866b6c629c"} Oct 08 22:15:32 crc kubenswrapper[4739]: I1008 22:15:32.617636 4739 scope.go:117] "RemoveContainer" containerID="8e2aea17560fff6350f694e7dfb33b45141db8493dd4445c9c1f4c96ec618786" Oct 08 22:15:32 crc kubenswrapper[4739]: I1008 22:15:32.643557 4739 scope.go:117] "RemoveContainer" containerID="b2792b21b6cd78855d132e499299370cdd6fcac60b8b43ff6cb2af6affe03fe3" Oct 08 22:15:32 crc kubenswrapper[4739]: I1008 22:15:32.662827 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kk7w6"] Oct 08 22:15:32 crc kubenswrapper[4739]: I1008 22:15:32.672810 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kk7w6"] Oct 08 22:15:32 crc kubenswrapper[4739]: I1008 22:15:32.682933 4739 scope.go:117] "RemoveContainer" containerID="85ed0051b4de951cb04f835d39b07e474dfc25c5fc746bc4b38460905ce0b33d" Oct 08 22:15:32 crc kubenswrapper[4739]: I1008 22:15:32.750988 4739 scope.go:117] "RemoveContainer" containerID="8e2aea17560fff6350f694e7dfb33b45141db8493dd4445c9c1f4c96ec618786" Oct 08 22:15:32 crc kubenswrapper[4739]: E1008 22:15:32.751407 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e2aea17560fff6350f694e7dfb33b45141db8493dd4445c9c1f4c96ec618786\": container with ID starting with 8e2aea17560fff6350f694e7dfb33b45141db8493dd4445c9c1f4c96ec618786 not found: ID does not exist" containerID="8e2aea17560fff6350f694e7dfb33b45141db8493dd4445c9c1f4c96ec618786" Oct 08 22:15:32 crc kubenswrapper[4739]: I1008 22:15:32.751446 
4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e2aea17560fff6350f694e7dfb33b45141db8493dd4445c9c1f4c96ec618786"} err="failed to get container status \"8e2aea17560fff6350f694e7dfb33b45141db8493dd4445c9c1f4c96ec618786\": rpc error: code = NotFound desc = could not find container \"8e2aea17560fff6350f694e7dfb33b45141db8493dd4445c9c1f4c96ec618786\": container with ID starting with 8e2aea17560fff6350f694e7dfb33b45141db8493dd4445c9c1f4c96ec618786 not found: ID does not exist" Oct 08 22:15:32 crc kubenswrapper[4739]: I1008 22:15:32.751482 4739 scope.go:117] "RemoveContainer" containerID="b2792b21b6cd78855d132e499299370cdd6fcac60b8b43ff6cb2af6affe03fe3" Oct 08 22:15:32 crc kubenswrapper[4739]: E1008 22:15:32.751793 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2792b21b6cd78855d132e499299370cdd6fcac60b8b43ff6cb2af6affe03fe3\": container with ID starting with b2792b21b6cd78855d132e499299370cdd6fcac60b8b43ff6cb2af6affe03fe3 not found: ID does not exist" containerID="b2792b21b6cd78855d132e499299370cdd6fcac60b8b43ff6cb2af6affe03fe3" Oct 08 22:15:32 crc kubenswrapper[4739]: I1008 22:15:32.751852 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2792b21b6cd78855d132e499299370cdd6fcac60b8b43ff6cb2af6affe03fe3"} err="failed to get container status \"b2792b21b6cd78855d132e499299370cdd6fcac60b8b43ff6cb2af6affe03fe3\": rpc error: code = NotFound desc = could not find container \"b2792b21b6cd78855d132e499299370cdd6fcac60b8b43ff6cb2af6affe03fe3\": container with ID starting with b2792b21b6cd78855d132e499299370cdd6fcac60b8b43ff6cb2af6affe03fe3 not found: ID does not exist" Oct 08 22:15:32 crc kubenswrapper[4739]: I1008 22:15:32.751898 4739 scope.go:117] "RemoveContainer" containerID="85ed0051b4de951cb04f835d39b07e474dfc25c5fc746bc4b38460905ce0b33d" Oct 08 22:15:32 crc kubenswrapper[4739]: E1008 
22:15:32.752884 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85ed0051b4de951cb04f835d39b07e474dfc25c5fc746bc4b38460905ce0b33d\": container with ID starting with 85ed0051b4de951cb04f835d39b07e474dfc25c5fc746bc4b38460905ce0b33d not found: ID does not exist" containerID="85ed0051b4de951cb04f835d39b07e474dfc25c5fc746bc4b38460905ce0b33d" Oct 08 22:15:32 crc kubenswrapper[4739]: I1008 22:15:32.752920 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85ed0051b4de951cb04f835d39b07e474dfc25c5fc746bc4b38460905ce0b33d"} err="failed to get container status \"85ed0051b4de951cb04f835d39b07e474dfc25c5fc746bc4b38460905ce0b33d\": rpc error: code = NotFound desc = could not find container \"85ed0051b4de951cb04f835d39b07e474dfc25c5fc746bc4b38460905ce0b33d\": container with ID starting with 85ed0051b4de951cb04f835d39b07e474dfc25c5fc746bc4b38460905ce0b33d not found: ID does not exist" Oct 08 22:15:33 crc kubenswrapper[4739]: I1008 22:15:33.849017 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3156d56b-ba07-438c-91a5-21d0cd0848d5" path="/var/lib/kubelet/pods/3156d56b-ba07-438c-91a5-21d0cd0848d5/volumes" Oct 08 22:15:51 crc kubenswrapper[4739]: I1008 22:15:51.766028 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:15:51 crc kubenswrapper[4739]: I1008 22:15:51.766905 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 08 22:16:21 crc kubenswrapper[4739]: I1008 22:16:21.766219 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:16:21 crc kubenswrapper[4739]: I1008 22:16:21.768337 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:16:21 crc kubenswrapper[4739]: I1008 22:16:21.768454 4739 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" Oct 08 22:16:21 crc kubenswrapper[4739]: I1008 22:16:21.769364 4739 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cd237d888e8c93206be5f4a7fb5248e050c6dfe54b10244705d72109bf176a06"} pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 22:16:21 crc kubenswrapper[4739]: I1008 22:16:21.769508 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" containerID="cri-o://cd237d888e8c93206be5f4a7fb5248e050c6dfe54b10244705d72109bf176a06" gracePeriod=600 Oct 08 22:16:22 crc kubenswrapper[4739]: I1008 22:16:22.331517 4739 generic.go:334] "Generic (PLEG): container finished" podID="9707b708-016c-4e06-86db-0332e2ca37db" 
containerID="cd237d888e8c93206be5f4a7fb5248e050c6dfe54b10244705d72109bf176a06" exitCode=0 Oct 08 22:16:22 crc kubenswrapper[4739]: I1008 22:16:22.331595 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerDied","Data":"cd237d888e8c93206be5f4a7fb5248e050c6dfe54b10244705d72109bf176a06"} Oct 08 22:16:22 crc kubenswrapper[4739]: I1008 22:16:22.332055 4739 scope.go:117] "RemoveContainer" containerID="f50fee8537e3d72c7912a6fb5efc59ba4c94366883a0b151d7314411b277cabf" Oct 08 22:16:22 crc kubenswrapper[4739]: E1008 22:16:22.520865 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:16:23 crc kubenswrapper[4739]: I1008 22:16:23.347559 4739 scope.go:117] "RemoveContainer" containerID="cd237d888e8c93206be5f4a7fb5248e050c6dfe54b10244705d72109bf176a06" Oct 08 22:16:23 crc kubenswrapper[4739]: E1008 22:16:23.349807 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:16:37 crc kubenswrapper[4739]: I1008 22:16:37.825335 4739 scope.go:117] "RemoveContainer" containerID="cd237d888e8c93206be5f4a7fb5248e050c6dfe54b10244705d72109bf176a06" Oct 08 22:16:37 crc kubenswrapper[4739]: E1008 
22:16:37.826728 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:16:48 crc kubenswrapper[4739]: I1008 22:16:48.822752 4739 scope.go:117] "RemoveContainer" containerID="cd237d888e8c93206be5f4a7fb5248e050c6dfe54b10244705d72109bf176a06" Oct 08 22:16:48 crc kubenswrapper[4739]: E1008 22:16:48.823843 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:16:58 crc kubenswrapper[4739]: I1008 22:16:58.064880 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-x9jwq"] Oct 08 22:16:58 crc kubenswrapper[4739]: I1008 22:16:58.083483 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-wwm4z"] Oct 08 22:16:58 crc kubenswrapper[4739]: I1008 22:16:58.093382 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-lvj7x"] Oct 08 22:16:58 crc kubenswrapper[4739]: I1008 22:16:58.102745 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-wwm4z"] Oct 08 22:16:58 crc kubenswrapper[4739]: I1008 22:16:58.113186 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-lvj7x"] Oct 08 22:16:58 crc kubenswrapper[4739]: I1008 22:16:58.121774 4739 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-x9jwq"] Oct 08 22:16:59 crc kubenswrapper[4739]: I1008 22:16:59.844094 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="742f0e7b-0ccd-4f1e-83ae-027d75053522" path="/var/lib/kubelet/pods/742f0e7b-0ccd-4f1e-83ae-027d75053522/volumes" Oct 08 22:16:59 crc kubenswrapper[4739]: I1008 22:16:59.846315 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88f9b47a-3b20-488b-93c6-ca8ca9beb2eb" path="/var/lib/kubelet/pods/88f9b47a-3b20-488b-93c6-ca8ca9beb2eb/volumes" Oct 08 22:16:59 crc kubenswrapper[4739]: I1008 22:16:59.847056 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6294838-8381-41e1-9384-0084edf1dac0" path="/var/lib/kubelet/pods/f6294838-8381-41e1-9384-0084edf1dac0/volumes" Oct 08 22:16:59 crc kubenswrapper[4739]: I1008 22:16:59.865752 4739 generic.go:334] "Generic (PLEG): container finished" podID="bd1f9e00-5ba4-4aa0-b38c-8610f396af0b" containerID="c685c74601c88d67752f7157b318779d89753e38f08eb7aee814c1c3b0c8bc65" exitCode=0 Oct 08 22:16:59 crc kubenswrapper[4739]: I1008 22:16:59.865869 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7" event={"ID":"bd1f9e00-5ba4-4aa0-b38c-8610f396af0b","Type":"ContainerDied","Data":"c685c74601c88d67752f7157b318779d89753e38f08eb7aee814c1c3b0c8bc65"} Oct 08 22:17:01 crc kubenswrapper[4739]: I1008 22:17:01.524747 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7" Oct 08 22:17:01 crc kubenswrapper[4739]: I1008 22:17:01.668893 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpdsv\" (UniqueName: \"kubernetes.io/projected/bd1f9e00-5ba4-4aa0-b38c-8610f396af0b-kube-api-access-kpdsv\") pod \"bd1f9e00-5ba4-4aa0-b38c-8610f396af0b\" (UID: \"bd1f9e00-5ba4-4aa0-b38c-8610f396af0b\") " Oct 08 22:17:01 crc kubenswrapper[4739]: I1008 22:17:01.669145 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd1f9e00-5ba4-4aa0-b38c-8610f396af0b-ssh-key\") pod \"bd1f9e00-5ba4-4aa0-b38c-8610f396af0b\" (UID: \"bd1f9e00-5ba4-4aa0-b38c-8610f396af0b\") " Oct 08 22:17:01 crc kubenswrapper[4739]: I1008 22:17:01.669466 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1f9e00-5ba4-4aa0-b38c-8610f396af0b-bootstrap-combined-ca-bundle\") pod \"bd1f9e00-5ba4-4aa0-b38c-8610f396af0b\" (UID: \"bd1f9e00-5ba4-4aa0-b38c-8610f396af0b\") " Oct 08 22:17:01 crc kubenswrapper[4739]: I1008 22:17:01.669601 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd1f9e00-5ba4-4aa0-b38c-8610f396af0b-inventory\") pod \"bd1f9e00-5ba4-4aa0-b38c-8610f396af0b\" (UID: \"bd1f9e00-5ba4-4aa0-b38c-8610f396af0b\") " Oct 08 22:17:01 crc kubenswrapper[4739]: I1008 22:17:01.677230 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd1f9e00-5ba4-4aa0-b38c-8610f396af0b-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "bd1f9e00-5ba4-4aa0-b38c-8610f396af0b" (UID: "bd1f9e00-5ba4-4aa0-b38c-8610f396af0b"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:17:01 crc kubenswrapper[4739]: I1008 22:17:01.677622 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd1f9e00-5ba4-4aa0-b38c-8610f396af0b-kube-api-access-kpdsv" (OuterVolumeSpecName: "kube-api-access-kpdsv") pod "bd1f9e00-5ba4-4aa0-b38c-8610f396af0b" (UID: "bd1f9e00-5ba4-4aa0-b38c-8610f396af0b"). InnerVolumeSpecName "kube-api-access-kpdsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:17:01 crc kubenswrapper[4739]: I1008 22:17:01.700305 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd1f9e00-5ba4-4aa0-b38c-8610f396af0b-inventory" (OuterVolumeSpecName: "inventory") pod "bd1f9e00-5ba4-4aa0-b38c-8610f396af0b" (UID: "bd1f9e00-5ba4-4aa0-b38c-8610f396af0b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:17:01 crc kubenswrapper[4739]: I1008 22:17:01.711693 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd1f9e00-5ba4-4aa0-b38c-8610f396af0b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bd1f9e00-5ba4-4aa0-b38c-8610f396af0b" (UID: "bd1f9e00-5ba4-4aa0-b38c-8610f396af0b"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:17:01 crc kubenswrapper[4739]: I1008 22:17:01.773809 4739 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd1f9e00-5ba4-4aa0-b38c-8610f396af0b-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 22:17:01 crc kubenswrapper[4739]: I1008 22:17:01.773868 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpdsv\" (UniqueName: \"kubernetes.io/projected/bd1f9e00-5ba4-4aa0-b38c-8610f396af0b-kube-api-access-kpdsv\") on node \"crc\" DevicePath \"\"" Oct 08 22:17:01 crc kubenswrapper[4739]: I1008 22:17:01.773880 4739 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd1f9e00-5ba4-4aa0-b38c-8610f396af0b-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 22:17:01 crc kubenswrapper[4739]: I1008 22:17:01.773894 4739 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1f9e00-5ba4-4aa0-b38c-8610f396af0b-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:17:01 crc kubenswrapper[4739]: I1008 22:17:01.919454 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7" event={"ID":"bd1f9e00-5ba4-4aa0-b38c-8610f396af0b","Type":"ContainerDied","Data":"8a871f9d2b48f4290d7393ad920e68b34ab49cca9c8659058e2cdd9ba9e15ed8"} Oct 08 22:17:01 crc kubenswrapper[4739]: I1008 22:17:01.919526 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a871f9d2b48f4290d7393ad920e68b34ab49cca9c8659058e2cdd9ba9e15ed8" Oct 08 22:17:01 crc kubenswrapper[4739]: I1008 22:17:01.919666 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7" Oct 08 22:17:02 crc kubenswrapper[4739]: I1008 22:17:02.003470 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9"] Oct 08 22:17:02 crc kubenswrapper[4739]: E1008 22:17:02.004371 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd1f9e00-5ba4-4aa0-b38c-8610f396af0b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 08 22:17:02 crc kubenswrapper[4739]: I1008 22:17:02.004397 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd1f9e00-5ba4-4aa0-b38c-8610f396af0b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 08 22:17:02 crc kubenswrapper[4739]: E1008 22:17:02.004410 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3156d56b-ba07-438c-91a5-21d0cd0848d5" containerName="extract-content" Oct 08 22:17:02 crc kubenswrapper[4739]: I1008 22:17:02.004419 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="3156d56b-ba07-438c-91a5-21d0cd0848d5" containerName="extract-content" Oct 08 22:17:02 crc kubenswrapper[4739]: E1008 22:17:02.004455 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3156d56b-ba07-438c-91a5-21d0cd0848d5" containerName="registry-server" Oct 08 22:17:02 crc kubenswrapper[4739]: I1008 22:17:02.004464 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="3156d56b-ba07-438c-91a5-21d0cd0848d5" containerName="registry-server" Oct 08 22:17:02 crc kubenswrapper[4739]: E1008 22:17:02.004487 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3156d56b-ba07-438c-91a5-21d0cd0848d5" containerName="extract-utilities" Oct 08 22:17:02 crc kubenswrapper[4739]: I1008 22:17:02.004495 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="3156d56b-ba07-438c-91a5-21d0cd0848d5" containerName="extract-utilities" Oct 08 22:17:02 crc kubenswrapper[4739]: I1008 22:17:02.004752 
4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd1f9e00-5ba4-4aa0-b38c-8610f396af0b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 08 22:17:02 crc kubenswrapper[4739]: I1008 22:17:02.004854 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="3156d56b-ba07-438c-91a5-21d0cd0848d5" containerName="registry-server" Oct 08 22:17:02 crc kubenswrapper[4739]: I1008 22:17:02.006007 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9" Oct 08 22:17:02 crc kubenswrapper[4739]: I1008 22:17:02.010104 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 22:17:02 crc kubenswrapper[4739]: I1008 22:17:02.012059 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 22:17:02 crc kubenswrapper[4739]: I1008 22:17:02.012074 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 22:17:02 crc kubenswrapper[4739]: I1008 22:17:02.012976 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9"] Oct 08 22:17:02 crc kubenswrapper[4739]: I1008 22:17:02.015319 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl5kv" Oct 08 22:17:02 crc kubenswrapper[4739]: I1008 22:17:02.083538 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdng2\" (UniqueName: \"kubernetes.io/projected/b62c229c-107a-42de-8501-b52ae4c47f9f-kube-api-access-mdng2\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9\" (UID: \"b62c229c-107a-42de-8501-b52ae4c47f9f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9" Oct 08 22:17:02 crc 
kubenswrapper[4739]: I1008 22:17:02.083599 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b62c229c-107a-42de-8501-b52ae4c47f9f-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9\" (UID: \"b62c229c-107a-42de-8501-b52ae4c47f9f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9" Oct 08 22:17:02 crc kubenswrapper[4739]: I1008 22:17:02.083678 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b62c229c-107a-42de-8501-b52ae4c47f9f-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9\" (UID: \"b62c229c-107a-42de-8501-b52ae4c47f9f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9" Oct 08 22:17:02 crc kubenswrapper[4739]: I1008 22:17:02.185958 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdng2\" (UniqueName: \"kubernetes.io/projected/b62c229c-107a-42de-8501-b52ae4c47f9f-kube-api-access-mdng2\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9\" (UID: \"b62c229c-107a-42de-8501-b52ae4c47f9f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9" Oct 08 22:17:02 crc kubenswrapper[4739]: I1008 22:17:02.186036 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b62c229c-107a-42de-8501-b52ae4c47f9f-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9\" (UID: \"b62c229c-107a-42de-8501-b52ae4c47f9f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9" Oct 08 22:17:02 crc kubenswrapper[4739]: I1008 22:17:02.186140 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/b62c229c-107a-42de-8501-b52ae4c47f9f-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9\" (UID: \"b62c229c-107a-42de-8501-b52ae4c47f9f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9" Oct 08 22:17:02 crc kubenswrapper[4739]: I1008 22:17:02.192419 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b62c229c-107a-42de-8501-b52ae4c47f9f-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9\" (UID: \"b62c229c-107a-42de-8501-b52ae4c47f9f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9" Oct 08 22:17:02 crc kubenswrapper[4739]: I1008 22:17:02.193525 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b62c229c-107a-42de-8501-b52ae4c47f9f-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9\" (UID: \"b62c229c-107a-42de-8501-b52ae4c47f9f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9" Oct 08 22:17:02 crc kubenswrapper[4739]: I1008 22:17:02.226916 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdng2\" (UniqueName: \"kubernetes.io/projected/b62c229c-107a-42de-8501-b52ae4c47f9f-kube-api-access-mdng2\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9\" (UID: \"b62c229c-107a-42de-8501-b52ae4c47f9f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9" Oct 08 22:17:02 crc kubenswrapper[4739]: I1008 22:17:02.333292 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9" Oct 08 22:17:02 crc kubenswrapper[4739]: I1008 22:17:02.821499 4739 scope.go:117] "RemoveContainer" containerID="cd237d888e8c93206be5f4a7fb5248e050c6dfe54b10244705d72109bf176a06" Oct 08 22:17:02 crc kubenswrapper[4739]: E1008 22:17:02.821986 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:17:02 crc kubenswrapper[4739]: I1008 22:17:02.933146 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9"] Oct 08 22:17:03 crc kubenswrapper[4739]: I1008 22:17:03.947674 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9" event={"ID":"b62c229c-107a-42de-8501-b52ae4c47f9f","Type":"ContainerStarted","Data":"49e31e1274aab1ca3cabaa37502d0274c964b61995c84aa22ca2c68eb27e872f"} Oct 08 22:17:03 crc kubenswrapper[4739]: I1008 22:17:03.948135 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9" event={"ID":"b62c229c-107a-42de-8501-b52ae4c47f9f","Type":"ContainerStarted","Data":"de69190929d2c466465b5eab42e1a3caeb8708ac83dc5e2eb9bc096800674902"} Oct 08 22:17:03 crc kubenswrapper[4739]: I1008 22:17:03.968038 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9" podStartSLOduration=2.364734367 podStartE2EDuration="2.968011582s" podCreationTimestamp="2025-10-08 22:17:01 +0000 UTC" 
firstStartedPulling="2025-10-08 22:17:02.94329886 +0000 UTC m=+1722.768684610" lastFinishedPulling="2025-10-08 22:17:03.546576055 +0000 UTC m=+1723.371961825" observedRunningTime="2025-10-08 22:17:03.963929261 +0000 UTC m=+1723.789315011" watchObservedRunningTime="2025-10-08 22:17:03.968011582 +0000 UTC m=+1723.793397332" Oct 08 22:17:07 crc kubenswrapper[4739]: I1008 22:17:07.049839 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-8be7-account-create-q4j26"] Oct 08 22:17:07 crc kubenswrapper[4739]: I1008 22:17:07.068230 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-8be7-account-create-q4j26"] Oct 08 22:17:07 crc kubenswrapper[4739]: I1008 22:17:07.844310 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4e2fed8-546a-4d0f-bb94-381bb1d4bd18" path="/var/lib/kubelet/pods/d4e2fed8-546a-4d0f-bb94-381bb1d4bd18/volumes" Oct 08 22:17:13 crc kubenswrapper[4739]: I1008 22:17:13.822689 4739 scope.go:117] "RemoveContainer" containerID="cd237d888e8c93206be5f4a7fb5248e050c6dfe54b10244705d72109bf176a06" Oct 08 22:17:13 crc kubenswrapper[4739]: E1008 22:17:13.824286 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:17:17 crc kubenswrapper[4739]: I1008 22:17:17.070020 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-1026-account-create-p8z6l"] Oct 08 22:17:17 crc kubenswrapper[4739]: I1008 22:17:17.088745 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-1026-account-create-p8z6l"] Oct 08 22:17:17 crc kubenswrapper[4739]: I1008 22:17:17.851318 4739 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ec4887c-3f0f-4ad2-a187-ecf79caa824f" path="/var/lib/kubelet/pods/1ec4887c-3f0f-4ad2-a187-ecf79caa824f/volumes" Oct 08 22:17:24 crc kubenswrapper[4739]: I1008 22:17:24.042305 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-1291-account-create-zsqws"] Oct 08 22:17:24 crc kubenswrapper[4739]: I1008 22:17:24.053750 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-1291-account-create-zsqws"] Oct 08 22:17:25 crc kubenswrapper[4739]: I1008 22:17:25.841532 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f84a6b5-5c1a-45a4-9b6b-5216f4fe5a6d" path="/var/lib/kubelet/pods/7f84a6b5-5c1a-45a4-9b6b-5216f4fe5a6d/volumes" Oct 08 22:17:28 crc kubenswrapper[4739]: I1008 22:17:28.822306 4739 scope.go:117] "RemoveContainer" containerID="cd237d888e8c93206be5f4a7fb5248e050c6dfe54b10244705d72109bf176a06" Oct 08 22:17:28 crc kubenswrapper[4739]: E1008 22:17:28.824683 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:17:30 crc kubenswrapper[4739]: I1008 22:17:30.033383 4739 scope.go:117] "RemoveContainer" containerID="c2edb8a23dd8e4a6d4d868aede065fdaf60e67f07924a6c44c27fdfd63ce97c1" Oct 08 22:17:30 crc kubenswrapper[4739]: I1008 22:17:30.081827 4739 scope.go:117] "RemoveContainer" containerID="a6c66b3dec97437d2e5a4f0be4dac9e3c4bb3e7627cc41666afaa825b2768bad" Oct 08 22:17:30 crc kubenswrapper[4739]: I1008 22:17:30.143857 4739 scope.go:117] "RemoveContainer" containerID="49d4dfd8262ce588a05f8cab369e73f96045ff2b6e128dc4fb7342a6486fc134" Oct 08 22:17:30 
crc kubenswrapper[4739]: I1008 22:17:30.202533 4739 scope.go:117] "RemoveContainer" containerID="a79b6943eb60b6351feb24e91006cd341881c19d90644416102de4b20d764ffb" Oct 08 22:17:30 crc kubenswrapper[4739]: I1008 22:17:30.288364 4739 scope.go:117] "RemoveContainer" containerID="8e040d0fe1f33fac07a8d169396c70327f6a0603999eb0e13bdeb7e61e1240a6" Oct 08 22:17:30 crc kubenswrapper[4739]: I1008 22:17:30.319628 4739 scope.go:117] "RemoveContainer" containerID="3e85e74fdcce96ea3da43342900d7c79964611a77ff3ec1634dfe655fad9615c" Oct 08 22:17:33 crc kubenswrapper[4739]: I1008 22:17:33.047757 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-t45jh"] Oct 08 22:17:33 crc kubenswrapper[4739]: I1008 22:17:33.056851 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-t45jh"] Oct 08 22:17:33 crc kubenswrapper[4739]: I1008 22:17:33.835111 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06663718-278d-4ac8-b6a5-9a6141dc0f78" path="/var/lib/kubelet/pods/06663718-278d-4ac8-b6a5-9a6141dc0f78/volumes" Oct 08 22:17:34 crc kubenswrapper[4739]: I1008 22:17:34.029879 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-h86jm"] Oct 08 22:17:34 crc kubenswrapper[4739]: I1008 22:17:34.038714 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-lsg8k"] Oct 08 22:17:34 crc kubenswrapper[4739]: I1008 22:17:34.051129 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-lsg8k"] Oct 08 22:17:34 crc kubenswrapper[4739]: I1008 22:17:34.063157 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-h86jm"] Oct 08 22:17:35 crc kubenswrapper[4739]: I1008 22:17:35.834044 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25582fa4-e567-498f-ae5c-b05ebb260645" path="/var/lib/kubelet/pods/25582fa4-e567-498f-ae5c-b05ebb260645/volumes" Oct 08 22:17:35 crc 
kubenswrapper[4739]: I1008 22:17:35.835598 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33840e62-c56d-41a7-918d-377ce6e86ffe" path="/var/lib/kubelet/pods/33840e62-c56d-41a7-918d-377ce6e86ffe/volumes" Oct 08 22:17:38 crc kubenswrapper[4739]: I1008 22:17:38.044246 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-254hj"] Oct 08 22:17:38 crc kubenswrapper[4739]: I1008 22:17:38.051708 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-254hj"] Oct 08 22:17:39 crc kubenswrapper[4739]: I1008 22:17:39.821619 4739 scope.go:117] "RemoveContainer" containerID="cd237d888e8c93206be5f4a7fb5248e050c6dfe54b10244705d72109bf176a06" Oct 08 22:17:39 crc kubenswrapper[4739]: E1008 22:17:39.822340 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:17:39 crc kubenswrapper[4739]: I1008 22:17:39.834538 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8f94af6-8eda-46ff-be13-dc11b2f52790" path="/var/lib/kubelet/pods/b8f94af6-8eda-46ff-be13-dc11b2f52790/volumes" Oct 08 22:17:43 crc kubenswrapper[4739]: I1008 22:17:43.042901 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-xnxrs"] Oct 08 22:17:43 crc kubenswrapper[4739]: I1008 22:17:43.053600 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-xnxrs"] Oct 08 22:17:43 crc kubenswrapper[4739]: I1008 22:17:43.839759 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30b37b73-be70-445c-9515-c3c467ce9bf9" 
path="/var/lib/kubelet/pods/30b37b73-be70-445c-9515-c3c467ce9bf9/volumes" Oct 08 22:17:44 crc kubenswrapper[4739]: I1008 22:17:44.036114 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-1624-account-create-qr8gs"] Oct 08 22:17:44 crc kubenswrapper[4739]: I1008 22:17:44.047614 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-ad94-account-create-2wctp"] Oct 08 22:17:44 crc kubenswrapper[4739]: I1008 22:17:44.059270 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-ad94-account-create-2wctp"] Oct 08 22:17:44 crc kubenswrapper[4739]: I1008 22:17:44.069720 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-1624-account-create-qr8gs"] Oct 08 22:17:45 crc kubenswrapper[4739]: I1008 22:17:45.054220 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5994-account-create-cf2zj"] Oct 08 22:17:45 crc kubenswrapper[4739]: I1008 22:17:45.059572 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5994-account-create-cf2zj"] Oct 08 22:17:45 crc kubenswrapper[4739]: I1008 22:17:45.839654 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55547779-fb79-423f-8af3-09c41da8b357" path="/var/lib/kubelet/pods/55547779-fb79-423f-8af3-09c41da8b357/volumes" Oct 08 22:17:45 crc kubenswrapper[4739]: I1008 22:17:45.843070 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5562904-ea97-4bdb-89e6-46d2c316f29c" path="/var/lib/kubelet/pods/a5562904-ea97-4bdb-89e6-46d2c316f29c/volumes" Oct 08 22:17:45 crc kubenswrapper[4739]: I1008 22:17:45.844906 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d74a95fb-685b-46c3-8727-2b87d78607a5" path="/var/lib/kubelet/pods/d74a95fb-685b-46c3-8727-2b87d78607a5/volumes" Oct 08 22:17:52 crc kubenswrapper[4739]: I1008 22:17:52.822391 4739 scope.go:117] "RemoveContainer" 
containerID="cd237d888e8c93206be5f4a7fb5248e050c6dfe54b10244705d72109bf176a06" Oct 08 22:17:52 crc kubenswrapper[4739]: E1008 22:17:52.823679 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:18:03 crc kubenswrapper[4739]: I1008 22:18:03.822373 4739 scope.go:117] "RemoveContainer" containerID="cd237d888e8c93206be5f4a7fb5248e050c6dfe54b10244705d72109bf176a06" Oct 08 22:18:03 crc kubenswrapper[4739]: E1008 22:18:03.823806 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:18:16 crc kubenswrapper[4739]: I1008 22:18:16.071194 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-674m2"] Oct 08 22:18:16 crc kubenswrapper[4739]: I1008 22:18:16.082757 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-674m2"] Oct 08 22:18:17 crc kubenswrapper[4739]: I1008 22:18:17.068085 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-dtlhn"] Oct 08 22:18:17 crc kubenswrapper[4739]: I1008 22:18:17.083751 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-dtlhn"] Oct 08 22:18:17 crc kubenswrapper[4739]: I1008 22:18:17.822783 4739 scope.go:117] "RemoveContainer" 
containerID="cd237d888e8c93206be5f4a7fb5248e050c6dfe54b10244705d72109bf176a06" Oct 08 22:18:17 crc kubenswrapper[4739]: E1008 22:18:17.824807 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:18:17 crc kubenswrapper[4739]: I1008 22:18:17.851780 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6fd1196-cd2f-4951-ad50-5dc17dac4aac" path="/var/lib/kubelet/pods/b6fd1196-cd2f-4951-ad50-5dc17dac4aac/volumes" Oct 08 22:18:17 crc kubenswrapper[4739]: I1008 22:18:17.852963 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe3031d0-ec15-4a2e-b635-a067472da71d" path="/var/lib/kubelet/pods/fe3031d0-ec15-4a2e-b635-a067472da71d/volumes" Oct 08 22:18:20 crc kubenswrapper[4739]: I1008 22:18:20.047530 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-b64hw"] Oct 08 22:18:20 crc kubenswrapper[4739]: I1008 22:18:20.060076 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-b64hw"] Oct 08 22:18:21 crc kubenswrapper[4739]: I1008 22:18:21.850940 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1381344-e404-4d04-bd00-667cfc882bcc" path="/var/lib/kubelet/pods/b1381344-e404-4d04-bd00-667cfc882bcc/volumes" Oct 08 22:18:30 crc kubenswrapper[4739]: I1008 22:18:30.050370 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-q5vhf"] Oct 08 22:18:30 crc kubenswrapper[4739]: I1008 22:18:30.068261 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-q5vhf"] Oct 08 22:18:30 crc kubenswrapper[4739]: I1008 
22:18:30.532628 4739 scope.go:117] "RemoveContainer" containerID="4b4879a81be993503d200383d3ec9c930dfac787a06552dcd7bb7c0c2df16dba" Oct 08 22:18:30 crc kubenswrapper[4739]: I1008 22:18:30.574503 4739 scope.go:117] "RemoveContainer" containerID="87d921c5a5e28362e5f9afe03e6e2b9602216697772fca22e8114178ec2bd7e4" Oct 08 22:18:30 crc kubenswrapper[4739]: I1008 22:18:30.660011 4739 scope.go:117] "RemoveContainer" containerID="6a1da5aac8382f24cd67defbce27b21b4bb7f3a8a050ab044b7776203829f4a4" Oct 08 22:18:30 crc kubenswrapper[4739]: I1008 22:18:30.720775 4739 scope.go:117] "RemoveContainer" containerID="ecaea620f32062a4add85f8e9e2142342a4fa7694e7647baabab0ac192b03702" Oct 08 22:18:30 crc kubenswrapper[4739]: I1008 22:18:30.808946 4739 scope.go:117] "RemoveContainer" containerID="61bb7913c178e8bcb7aaefaeb4c1d09cb5216243f9c2fd38a870b3d36c7b4076" Oct 08 22:18:30 crc kubenswrapper[4739]: I1008 22:18:30.834697 4739 scope.go:117] "RemoveContainer" containerID="e0caa62d8003ed5f6c166b455a89ac239b00aee43ad29b2ef54ce2ab075d3b0d" Oct 08 22:18:30 crc kubenswrapper[4739]: I1008 22:18:30.913251 4739 scope.go:117] "RemoveContainer" containerID="06fea3b454341d8f5561cabc5f21fff6dacdb61ed4c3f7b9dd60fb90eb29fc0b" Oct 08 22:18:30 crc kubenswrapper[4739]: I1008 22:18:30.949211 4739 scope.go:117] "RemoveContainer" containerID="31d526f78f7e9f5f32b87e4175245eff5b19970c45fdb4ba3de34243dab95054" Oct 08 22:18:30 crc kubenswrapper[4739]: I1008 22:18:30.985549 4739 scope.go:117] "RemoveContainer" containerID="66b4a62c9c64a2bc503f2dbb75bbcf99a260bd0f616fd1709f2984c414b52c6c" Oct 08 22:18:31 crc kubenswrapper[4739]: I1008 22:18:31.020794 4739 scope.go:117] "RemoveContainer" containerID="71923097a6d8ba915352bd825bb242038db20087c1bde2ca7d54b389e0be526d" Oct 08 22:18:31 crc kubenswrapper[4739]: I1008 22:18:31.046241 4739 scope.go:117] "RemoveContainer" containerID="4fab8ceeef55c6fea097a95f4f116c2edba112b3163e825235f450e332f97f26" Oct 08 22:18:31 crc kubenswrapper[4739]: I1008 22:18:31.845697 4739 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="836f20c4-8401-4a21-a541-0dbc92430484" path="/var/lib/kubelet/pods/836f20c4-8401-4a21-a541-0dbc92430484/volumes" Oct 08 22:18:32 crc kubenswrapper[4739]: I1008 22:18:32.821944 4739 scope.go:117] "RemoveContainer" containerID="cd237d888e8c93206be5f4a7fb5248e050c6dfe54b10244705d72109bf176a06" Oct 08 22:18:32 crc kubenswrapper[4739]: E1008 22:18:32.822544 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:18:44 crc kubenswrapper[4739]: I1008 22:18:44.822697 4739 scope.go:117] "RemoveContainer" containerID="cd237d888e8c93206be5f4a7fb5248e050c6dfe54b10244705d72109bf176a06" Oct 08 22:18:44 crc kubenswrapper[4739]: E1008 22:18:44.824045 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:18:56 crc kubenswrapper[4739]: I1008 22:18:56.053391 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-vvmhx"] Oct 08 22:18:56 crc kubenswrapper[4739]: I1008 22:18:56.064383 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-vvmhx"] Oct 08 22:18:57 crc kubenswrapper[4739]: I1008 22:18:57.838415 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4da3e49a-b4ae-4375-893f-47d64b4eb0b5" path="/var/lib/kubelet/pods/4da3e49a-b4ae-4375-893f-47d64b4eb0b5/volumes" Oct 08 22:18:59 crc kubenswrapper[4739]: I1008 22:18:59.830596 4739 scope.go:117] "RemoveContainer" containerID="cd237d888e8c93206be5f4a7fb5248e050c6dfe54b10244705d72109bf176a06" Oct 08 22:18:59 crc kubenswrapper[4739]: E1008 22:18:59.832881 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:19:04 crc kubenswrapper[4739]: I1008 22:19:04.708058 4739 generic.go:334] "Generic (PLEG): container finished" podID="b62c229c-107a-42de-8501-b52ae4c47f9f" containerID="49e31e1274aab1ca3cabaa37502d0274c964b61995c84aa22ca2c68eb27e872f" exitCode=0 Oct 08 22:19:04 crc kubenswrapper[4739]: I1008 22:19:04.708163 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9" event={"ID":"b62c229c-107a-42de-8501-b52ae4c47f9f","Type":"ContainerDied","Data":"49e31e1274aab1ca3cabaa37502d0274c964b61995c84aa22ca2c68eb27e872f"} Oct 08 22:19:06 crc kubenswrapper[4739]: I1008 22:19:06.266874 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9" Oct 08 22:19:06 crc kubenswrapper[4739]: I1008 22:19:06.375004 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b62c229c-107a-42de-8501-b52ae4c47f9f-ssh-key\") pod \"b62c229c-107a-42de-8501-b52ae4c47f9f\" (UID: \"b62c229c-107a-42de-8501-b52ae4c47f9f\") " Oct 08 22:19:06 crc kubenswrapper[4739]: I1008 22:19:06.375128 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdng2\" (UniqueName: \"kubernetes.io/projected/b62c229c-107a-42de-8501-b52ae4c47f9f-kube-api-access-mdng2\") pod \"b62c229c-107a-42de-8501-b52ae4c47f9f\" (UID: \"b62c229c-107a-42de-8501-b52ae4c47f9f\") " Oct 08 22:19:06 crc kubenswrapper[4739]: I1008 22:19:06.375289 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b62c229c-107a-42de-8501-b52ae4c47f9f-inventory\") pod \"b62c229c-107a-42de-8501-b52ae4c47f9f\" (UID: \"b62c229c-107a-42de-8501-b52ae4c47f9f\") " Oct 08 22:19:06 crc kubenswrapper[4739]: I1008 22:19:06.387503 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b62c229c-107a-42de-8501-b52ae4c47f9f-kube-api-access-mdng2" (OuterVolumeSpecName: "kube-api-access-mdng2") pod "b62c229c-107a-42de-8501-b52ae4c47f9f" (UID: "b62c229c-107a-42de-8501-b52ae4c47f9f"). InnerVolumeSpecName "kube-api-access-mdng2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:19:06 crc kubenswrapper[4739]: I1008 22:19:06.415492 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b62c229c-107a-42de-8501-b52ae4c47f9f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b62c229c-107a-42de-8501-b52ae4c47f9f" (UID: "b62c229c-107a-42de-8501-b52ae4c47f9f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:19:06 crc kubenswrapper[4739]: I1008 22:19:06.440110 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b62c229c-107a-42de-8501-b52ae4c47f9f-inventory" (OuterVolumeSpecName: "inventory") pod "b62c229c-107a-42de-8501-b52ae4c47f9f" (UID: "b62c229c-107a-42de-8501-b52ae4c47f9f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:19:06 crc kubenswrapper[4739]: I1008 22:19:06.478180 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdng2\" (UniqueName: \"kubernetes.io/projected/b62c229c-107a-42de-8501-b52ae4c47f9f-kube-api-access-mdng2\") on node \"crc\" DevicePath \"\"" Oct 08 22:19:06 crc kubenswrapper[4739]: I1008 22:19:06.478233 4739 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b62c229c-107a-42de-8501-b52ae4c47f9f-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 22:19:06 crc kubenswrapper[4739]: I1008 22:19:06.478252 4739 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b62c229c-107a-42de-8501-b52ae4c47f9f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 22:19:06 crc kubenswrapper[4739]: I1008 22:19:06.729512 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9" event={"ID":"b62c229c-107a-42de-8501-b52ae4c47f9f","Type":"ContainerDied","Data":"de69190929d2c466465b5eab42e1a3caeb8708ac83dc5e2eb9bc096800674902"} Oct 08 22:19:06 crc kubenswrapper[4739]: I1008 22:19:06.729885 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de69190929d2c466465b5eab42e1a3caeb8708ac83dc5e2eb9bc096800674902" Oct 08 22:19:06 crc kubenswrapper[4739]: I1008 22:19:06.729573 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9" Oct 08 22:19:06 crc kubenswrapper[4739]: I1008 22:19:06.833580 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg"] Oct 08 22:19:06 crc kubenswrapper[4739]: E1008 22:19:06.834064 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b62c229c-107a-42de-8501-b52ae4c47f9f" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 08 22:19:06 crc kubenswrapper[4739]: I1008 22:19:06.834085 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="b62c229c-107a-42de-8501-b52ae4c47f9f" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 08 22:19:06 crc kubenswrapper[4739]: I1008 22:19:06.834308 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="b62c229c-107a-42de-8501-b52ae4c47f9f" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 08 22:19:06 crc kubenswrapper[4739]: I1008 22:19:06.835110 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg" Oct 08 22:19:06 crc kubenswrapper[4739]: I1008 22:19:06.837825 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 22:19:06 crc kubenswrapper[4739]: I1008 22:19:06.838078 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl5kv" Oct 08 22:19:06 crc kubenswrapper[4739]: I1008 22:19:06.838187 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 22:19:06 crc kubenswrapper[4739]: I1008 22:19:06.838370 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 22:19:06 crc kubenswrapper[4739]: I1008 22:19:06.847108 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg"] Oct 08 22:19:06 crc kubenswrapper[4739]: I1008 22:19:06.886319 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b87a15bb-7744-4904-91b9-9f8052912033-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg\" (UID: \"b87a15bb-7744-4904-91b9-9f8052912033\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg" Oct 08 22:19:06 crc kubenswrapper[4739]: I1008 22:19:06.886395 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b87a15bb-7744-4904-91b9-9f8052912033-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg\" (UID: \"b87a15bb-7744-4904-91b9-9f8052912033\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg" Oct 08 22:19:06 crc kubenswrapper[4739]: I1008 22:19:06.886728 4739 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x29l\" (UniqueName: \"kubernetes.io/projected/b87a15bb-7744-4904-91b9-9f8052912033-kube-api-access-7x29l\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg\" (UID: \"b87a15bb-7744-4904-91b9-9f8052912033\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg" Oct 08 22:19:06 crc kubenswrapper[4739]: I1008 22:19:06.989218 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x29l\" (UniqueName: \"kubernetes.io/projected/b87a15bb-7744-4904-91b9-9f8052912033-kube-api-access-7x29l\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg\" (UID: \"b87a15bb-7744-4904-91b9-9f8052912033\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg" Oct 08 22:19:06 crc kubenswrapper[4739]: I1008 22:19:06.989428 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b87a15bb-7744-4904-91b9-9f8052912033-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg\" (UID: \"b87a15bb-7744-4904-91b9-9f8052912033\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg" Oct 08 22:19:06 crc kubenswrapper[4739]: I1008 22:19:06.989485 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b87a15bb-7744-4904-91b9-9f8052912033-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg\" (UID: \"b87a15bb-7744-4904-91b9-9f8052912033\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg" Oct 08 22:19:06 crc kubenswrapper[4739]: I1008 22:19:06.994210 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b87a15bb-7744-4904-91b9-9f8052912033-ssh-key\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg\" (UID: \"b87a15bb-7744-4904-91b9-9f8052912033\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg" Oct 08 22:19:06 crc kubenswrapper[4739]: I1008 22:19:06.996294 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b87a15bb-7744-4904-91b9-9f8052912033-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg\" (UID: \"b87a15bb-7744-4904-91b9-9f8052912033\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg" Oct 08 22:19:07 crc kubenswrapper[4739]: I1008 22:19:07.008374 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x29l\" (UniqueName: \"kubernetes.io/projected/b87a15bb-7744-4904-91b9-9f8052912033-kube-api-access-7x29l\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg\" (UID: \"b87a15bb-7744-4904-91b9-9f8052912033\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg" Oct 08 22:19:07 crc kubenswrapper[4739]: I1008 22:19:07.152898 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg" Oct 08 22:19:07 crc kubenswrapper[4739]: I1008 22:19:07.711440 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg"] Oct 08 22:19:07 crc kubenswrapper[4739]: I1008 22:19:07.740400 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg" event={"ID":"b87a15bb-7744-4904-91b9-9f8052912033","Type":"ContainerStarted","Data":"28503a284f585f0938caa0fc3b8705523d6695feab581ffab7832b8b7f6356f8"} Oct 08 22:19:09 crc kubenswrapper[4739]: I1008 22:19:09.785819 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg" event={"ID":"b87a15bb-7744-4904-91b9-9f8052912033","Type":"ContainerStarted","Data":"57ff6cb9c24a1520befbd020f686a3651a30d7bc06f661e712c5c7021f85ae76"} Oct 08 22:19:09 crc kubenswrapper[4739]: I1008 22:19:09.815095 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg" podStartSLOduration=2.854070257 podStartE2EDuration="3.815068054s" podCreationTimestamp="2025-10-08 22:19:06 +0000 UTC" firstStartedPulling="2025-10-08 22:19:07.715524605 +0000 UTC m=+1847.540910365" lastFinishedPulling="2025-10-08 22:19:08.676522402 +0000 UTC m=+1848.501908162" observedRunningTime="2025-10-08 22:19:09.806796321 +0000 UTC m=+1849.632182101" watchObservedRunningTime="2025-10-08 22:19:09.815068054 +0000 UTC m=+1849.640453844" Oct 08 22:19:12 crc kubenswrapper[4739]: I1008 22:19:12.821557 4739 scope.go:117] "RemoveContainer" containerID="cd237d888e8c93206be5f4a7fb5248e050c6dfe54b10244705d72109bf176a06" Oct 08 22:19:12 crc kubenswrapper[4739]: E1008 22:19:12.822147 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:19:16 crc kubenswrapper[4739]: I1008 22:19:16.065067 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-5pbsk"] Oct 08 22:19:16 crc kubenswrapper[4739]: I1008 22:19:16.077713 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-89kb6"] Oct 08 22:19:16 crc kubenswrapper[4739]: I1008 22:19:16.091454 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-vx5hx"] Oct 08 22:19:16 crc kubenswrapper[4739]: I1008 22:19:16.114606 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-89kb6"] Oct 08 22:19:16 crc kubenswrapper[4739]: I1008 22:19:16.114690 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-5pbsk"] Oct 08 22:19:16 crc kubenswrapper[4739]: I1008 22:19:16.119528 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-vx5hx"] Oct 08 22:19:17 crc kubenswrapper[4739]: I1008 22:19:17.866580 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="442932da-356f-4a98-97a8-59ee1418fd24" path="/var/lib/kubelet/pods/442932da-356f-4a98-97a8-59ee1418fd24/volumes" Oct 08 22:19:17 crc kubenswrapper[4739]: I1008 22:19:17.868473 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="705882a0-d0a4-490f-9377-d9b379c5a9ea" path="/var/lib/kubelet/pods/705882a0-d0a4-490f-9377-d9b379c5a9ea/volumes" Oct 08 22:19:17 crc kubenswrapper[4739]: I1008 22:19:17.868966 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d67308b3-9336-448a-9399-7db66f43b5aa" 
path="/var/lib/kubelet/pods/d67308b3-9336-448a-9399-7db66f43b5aa/volumes" Oct 08 22:19:24 crc kubenswrapper[4739]: I1008 22:19:24.821870 4739 scope.go:117] "RemoveContainer" containerID="cd237d888e8c93206be5f4a7fb5248e050c6dfe54b10244705d72109bf176a06" Oct 08 22:19:24 crc kubenswrapper[4739]: E1008 22:19:24.822951 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:19:31 crc kubenswrapper[4739]: I1008 22:19:31.387078 4739 scope.go:117] "RemoveContainer" containerID="3cf36dda52dda7ca4486cb1db258872c2e392941ea164018080edbb6c6580e16" Oct 08 22:19:31 crc kubenswrapper[4739]: I1008 22:19:31.461423 4739 scope.go:117] "RemoveContainer" containerID="0c00db920a38489ac2087c86988347b6e291b2ea19110d1b86678d226ffe3a1e" Oct 08 22:19:31 crc kubenswrapper[4739]: I1008 22:19:31.506761 4739 scope.go:117] "RemoveContainer" containerID="b19346be300cfdbed834b66c2d55b6e0c3ab28352d9bf5313453e5fef1fbc2d1" Oct 08 22:19:31 crc kubenswrapper[4739]: I1008 22:19:31.566282 4739 scope.go:117] "RemoveContainer" containerID="9ad019e7acc4f9d8d37fd4b8c1e574900a10d7353202e4edf792b27dd2d49818" Oct 08 22:19:31 crc kubenswrapper[4739]: I1008 22:19:31.610921 4739 scope.go:117] "RemoveContainer" containerID="cfaea9716ef442a82d7d1442e5293c3a41541f51fe6a306462de76aad7abb370" Oct 08 22:19:35 crc kubenswrapper[4739]: I1008 22:19:35.822026 4739 scope.go:117] "RemoveContainer" containerID="cd237d888e8c93206be5f4a7fb5248e050c6dfe54b10244705d72109bf176a06" Oct 08 22:19:35 crc kubenswrapper[4739]: E1008 22:19:35.823441 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:19:36 crc kubenswrapper[4739]: I1008 22:19:36.058344 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-24d3-account-create-t6jbj"] Oct 08 22:19:36 crc kubenswrapper[4739]: I1008 22:19:36.070112 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-76e4-account-create-w4hrz"] Oct 08 22:19:36 crc kubenswrapper[4739]: I1008 22:19:36.080102 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-ec7f-account-create-xrvtt"] Oct 08 22:19:36 crc kubenswrapper[4739]: I1008 22:19:36.092510 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-76e4-account-create-w4hrz"] Oct 08 22:19:36 crc kubenswrapper[4739]: I1008 22:19:36.101059 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-ec7f-account-create-xrvtt"] Oct 08 22:19:36 crc kubenswrapper[4739]: I1008 22:19:36.108909 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-24d3-account-create-t6jbj"] Oct 08 22:19:37 crc kubenswrapper[4739]: I1008 22:19:37.839543 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="050eba89-d876-415d-a4bf-fa42c414ec27" path="/var/lib/kubelet/pods/050eba89-d876-415d-a4bf-fa42c414ec27/volumes" Oct 08 22:19:37 crc kubenswrapper[4739]: I1008 22:19:37.841003 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96a2a1ce-b4cb-4dfd-8571-cd87bb386fbf" path="/var/lib/kubelet/pods/96a2a1ce-b4cb-4dfd-8571-cd87bb386fbf/volumes" Oct 08 22:19:37 crc kubenswrapper[4739]: I1008 22:19:37.842075 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="fd658aa2-8067-4957-b4c1-3d3be9f6496b" path="/var/lib/kubelet/pods/fd658aa2-8067-4957-b4c1-3d3be9f6496b/volumes" Oct 08 22:19:50 crc kubenswrapper[4739]: I1008 22:19:50.821674 4739 scope.go:117] "RemoveContainer" containerID="cd237d888e8c93206be5f4a7fb5248e050c6dfe54b10244705d72109bf176a06" Oct 08 22:19:50 crc kubenswrapper[4739]: E1008 22:19:50.822526 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:20:03 crc kubenswrapper[4739]: I1008 22:20:03.823475 4739 scope.go:117] "RemoveContainer" containerID="cd237d888e8c93206be5f4a7fb5248e050c6dfe54b10244705d72109bf176a06" Oct 08 22:20:03 crc kubenswrapper[4739]: E1008 22:20:03.825051 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:20:09 crc kubenswrapper[4739]: I1008 22:20:09.051205 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-nwcm7"] Oct 08 22:20:09 crc kubenswrapper[4739]: I1008 22:20:09.058871 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-nwcm7"] Oct 08 22:20:09 crc kubenswrapper[4739]: I1008 22:20:09.835433 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd8cd516-b57f-4dc7-913f-fca9eac68452" 
path="/var/lib/kubelet/pods/cd8cd516-b57f-4dc7-913f-fca9eac68452/volumes" Oct 08 22:20:16 crc kubenswrapper[4739]: I1008 22:20:16.822584 4739 scope.go:117] "RemoveContainer" containerID="cd237d888e8c93206be5f4a7fb5248e050c6dfe54b10244705d72109bf176a06" Oct 08 22:20:16 crc kubenswrapper[4739]: E1008 22:20:16.823783 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:20:27 crc kubenswrapper[4739]: I1008 22:20:27.822962 4739 scope.go:117] "RemoveContainer" containerID="cd237d888e8c93206be5f4a7fb5248e050c6dfe54b10244705d72109bf176a06" Oct 08 22:20:27 crc kubenswrapper[4739]: E1008 22:20:27.824808 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:20:31 crc kubenswrapper[4739]: E1008 22:20:31.455216 4739 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb87a15bb_7744_4904_91b9_9f8052912033.slice/crio-conmon-57ff6cb9c24a1520befbd020f686a3651a30d7bc06f661e712c5c7021f85ae76.scope\": RecentStats: unable to find data in memory cache]" Oct 08 22:20:31 crc kubenswrapper[4739]: I1008 22:20:31.868507 4739 scope.go:117] "RemoveContainer" 
containerID="3117d288a49c0224c3098e216ddd7692185e694c28f0f6824d569cadf09ce0b3" Oct 08 22:20:31 crc kubenswrapper[4739]: I1008 22:20:31.905286 4739 generic.go:334] "Generic (PLEG): container finished" podID="b87a15bb-7744-4904-91b9-9f8052912033" containerID="57ff6cb9c24a1520befbd020f686a3651a30d7bc06f661e712c5c7021f85ae76" exitCode=0 Oct 08 22:20:31 crc kubenswrapper[4739]: I1008 22:20:31.905374 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg" event={"ID":"b87a15bb-7744-4904-91b9-9f8052912033","Type":"ContainerDied","Data":"57ff6cb9c24a1520befbd020f686a3651a30d7bc06f661e712c5c7021f85ae76"} Oct 08 22:20:31 crc kubenswrapper[4739]: I1008 22:20:31.920784 4739 scope.go:117] "RemoveContainer" containerID="0872ab24d84212478d8f517f197aac3df2b001a3a2bd7f8c3dff426124a0a309" Oct 08 22:20:32 crc kubenswrapper[4739]: I1008 22:20:31.998791 4739 scope.go:117] "RemoveContainer" containerID="2a6a4c7fda63a61e43323201d4ec7a829e757e5bbe9d0a1e0a52359cdd793563" Oct 08 22:20:32 crc kubenswrapper[4739]: I1008 22:20:32.110990 4739 scope.go:117] "RemoveContainer" containerID="d7274f341767e00ac52c996486b9c9ef091cc12452f25d07fbe10fb0334914fc" Oct 08 22:20:33 crc kubenswrapper[4739]: I1008 22:20:33.094523 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-8m5qj"] Oct 08 22:20:33 crc kubenswrapper[4739]: I1008 22:20:33.110199 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-8m5qj"] Oct 08 22:20:33 crc kubenswrapper[4739]: I1008 22:20:33.433991 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg" Oct 08 22:20:33 crc kubenswrapper[4739]: I1008 22:20:33.535856 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b87a15bb-7744-4904-91b9-9f8052912033-ssh-key\") pod \"b87a15bb-7744-4904-91b9-9f8052912033\" (UID: \"b87a15bb-7744-4904-91b9-9f8052912033\") " Oct 08 22:20:33 crc kubenswrapper[4739]: I1008 22:20:33.535997 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b87a15bb-7744-4904-91b9-9f8052912033-inventory\") pod \"b87a15bb-7744-4904-91b9-9f8052912033\" (UID: \"b87a15bb-7744-4904-91b9-9f8052912033\") " Oct 08 22:20:33 crc kubenswrapper[4739]: I1008 22:20:33.536098 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x29l\" (UniqueName: \"kubernetes.io/projected/b87a15bb-7744-4904-91b9-9f8052912033-kube-api-access-7x29l\") pod \"b87a15bb-7744-4904-91b9-9f8052912033\" (UID: \"b87a15bb-7744-4904-91b9-9f8052912033\") " Oct 08 22:20:33 crc kubenswrapper[4739]: I1008 22:20:33.559639 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b87a15bb-7744-4904-91b9-9f8052912033-kube-api-access-7x29l" (OuterVolumeSpecName: "kube-api-access-7x29l") pod "b87a15bb-7744-4904-91b9-9f8052912033" (UID: "b87a15bb-7744-4904-91b9-9f8052912033"). InnerVolumeSpecName "kube-api-access-7x29l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:20:33 crc kubenswrapper[4739]: I1008 22:20:33.588701 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b87a15bb-7744-4904-91b9-9f8052912033-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b87a15bb-7744-4904-91b9-9f8052912033" (UID: "b87a15bb-7744-4904-91b9-9f8052912033"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:20:33 crc kubenswrapper[4739]: I1008 22:20:33.591157 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b87a15bb-7744-4904-91b9-9f8052912033-inventory" (OuterVolumeSpecName: "inventory") pod "b87a15bb-7744-4904-91b9-9f8052912033" (UID: "b87a15bb-7744-4904-91b9-9f8052912033"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:20:33 crc kubenswrapper[4739]: I1008 22:20:33.638460 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7x29l\" (UniqueName: \"kubernetes.io/projected/b87a15bb-7744-4904-91b9-9f8052912033-kube-api-access-7x29l\") on node \"crc\" DevicePath \"\"" Oct 08 22:20:33 crc kubenswrapper[4739]: I1008 22:20:33.638582 4739 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b87a15bb-7744-4904-91b9-9f8052912033-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 22:20:33 crc kubenswrapper[4739]: I1008 22:20:33.638602 4739 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b87a15bb-7744-4904-91b9-9f8052912033-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 22:20:33 crc kubenswrapper[4739]: I1008 22:20:33.833441 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="546d5b3a-342d-44f7-a179-12724fc783d0" path="/var/lib/kubelet/pods/546d5b3a-342d-44f7-a179-12724fc783d0/volumes" Oct 08 22:20:33 crc kubenswrapper[4739]: I1008 22:20:33.938669 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg" event={"ID":"b87a15bb-7744-4904-91b9-9f8052912033","Type":"ContainerDied","Data":"28503a284f585f0938caa0fc3b8705523d6695feab581ffab7832b8b7f6356f8"} Oct 08 22:20:33 crc kubenswrapper[4739]: I1008 22:20:33.938984 4739 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="28503a284f585f0938caa0fc3b8705523d6695feab581ffab7832b8b7f6356f8" Oct 08 22:20:33 crc kubenswrapper[4739]: I1008 22:20:33.938723 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg" Oct 08 22:20:34 crc kubenswrapper[4739]: I1008 22:20:34.038702 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb"] Oct 08 22:20:34 crc kubenswrapper[4739]: E1008 22:20:34.039440 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b87a15bb-7744-4904-91b9-9f8052912033" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 08 22:20:34 crc kubenswrapper[4739]: I1008 22:20:34.039519 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="b87a15bb-7744-4904-91b9-9f8052912033" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 08 22:20:34 crc kubenswrapper[4739]: I1008 22:20:34.039777 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="b87a15bb-7744-4904-91b9-9f8052912033" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 08 22:20:34 crc kubenswrapper[4739]: I1008 22:20:34.040865 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb" Oct 08 22:20:34 crc kubenswrapper[4739]: I1008 22:20:34.047616 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 22:20:34 crc kubenswrapper[4739]: I1008 22:20:34.047633 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 22:20:34 crc kubenswrapper[4739]: I1008 22:20:34.048193 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb"] Oct 08 22:20:34 crc kubenswrapper[4739]: I1008 22:20:34.049349 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 22:20:34 crc kubenswrapper[4739]: I1008 22:20:34.051618 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl5kv" Oct 08 22:20:34 crc kubenswrapper[4739]: I1008 22:20:34.170924 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b344b99-c3a5-4d79-ad85-b8589d6489b0-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb\" (UID: \"4b344b99-c3a5-4d79-ad85-b8589d6489b0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb" Oct 08 22:20:34 crc kubenswrapper[4739]: I1008 22:20:34.170994 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56rch\" (UniqueName: \"kubernetes.io/projected/4b344b99-c3a5-4d79-ad85-b8589d6489b0-kube-api-access-56rch\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb\" (UID: \"4b344b99-c3a5-4d79-ad85-b8589d6489b0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb" Oct 08 22:20:34 crc kubenswrapper[4739]: I1008 
22:20:34.171046 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b344b99-c3a5-4d79-ad85-b8589d6489b0-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb\" (UID: \"4b344b99-c3a5-4d79-ad85-b8589d6489b0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb" Oct 08 22:20:34 crc kubenswrapper[4739]: I1008 22:20:34.273622 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b344b99-c3a5-4d79-ad85-b8589d6489b0-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb\" (UID: \"4b344b99-c3a5-4d79-ad85-b8589d6489b0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb" Oct 08 22:20:34 crc kubenswrapper[4739]: I1008 22:20:34.273814 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b344b99-c3a5-4d79-ad85-b8589d6489b0-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb\" (UID: \"4b344b99-c3a5-4d79-ad85-b8589d6489b0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb" Oct 08 22:20:34 crc kubenswrapper[4739]: I1008 22:20:34.273855 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56rch\" (UniqueName: \"kubernetes.io/projected/4b344b99-c3a5-4d79-ad85-b8589d6489b0-kube-api-access-56rch\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb\" (UID: \"4b344b99-c3a5-4d79-ad85-b8589d6489b0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb" Oct 08 22:20:34 crc kubenswrapper[4739]: I1008 22:20:34.283552 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b344b99-c3a5-4d79-ad85-b8589d6489b0-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb\" (UID: \"4b344b99-c3a5-4d79-ad85-b8589d6489b0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb" Oct 08 22:20:34 crc kubenswrapper[4739]: I1008 22:20:34.288894 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b344b99-c3a5-4d79-ad85-b8589d6489b0-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb\" (UID: \"4b344b99-c3a5-4d79-ad85-b8589d6489b0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb" Oct 08 22:20:34 crc kubenswrapper[4739]: I1008 22:20:34.299706 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56rch\" (UniqueName: \"kubernetes.io/projected/4b344b99-c3a5-4d79-ad85-b8589d6489b0-kube-api-access-56rch\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb\" (UID: \"4b344b99-c3a5-4d79-ad85-b8589d6489b0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb" Oct 08 22:20:34 crc kubenswrapper[4739]: I1008 22:20:34.368499 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb" Oct 08 22:20:34 crc kubenswrapper[4739]: I1008 22:20:34.735450 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb"] Oct 08 22:20:34 crc kubenswrapper[4739]: I1008 22:20:34.745177 4739 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 22:20:34 crc kubenswrapper[4739]: I1008 22:20:34.954395 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb" event={"ID":"4b344b99-c3a5-4d79-ad85-b8589d6489b0","Type":"ContainerStarted","Data":"067b8516d00701c4e3813d500a08be784dd72a3b024b59cad4e3245465a30b24"} Oct 08 22:20:36 crc kubenswrapper[4739]: I1008 22:20:36.978579 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb" event={"ID":"4b344b99-c3a5-4d79-ad85-b8589d6489b0","Type":"ContainerStarted","Data":"3644d6f08ab25d989ca047295d555c96942cdb74cb85ebbad34efd7b95764a9b"} Oct 08 22:20:37 crc kubenswrapper[4739]: I1008 22:20:37.011330 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb" podStartSLOduration=1.9893143819999999 podStartE2EDuration="3.011300524s" podCreationTimestamp="2025-10-08 22:20:34 +0000 UTC" firstStartedPulling="2025-10-08 22:20:34.744766279 +0000 UTC m=+1934.570152069" lastFinishedPulling="2025-10-08 22:20:35.766752441 +0000 UTC m=+1935.592138211" observedRunningTime="2025-10-08 22:20:36.99973003 +0000 UTC m=+1936.825115810" watchObservedRunningTime="2025-10-08 22:20:37.011300524 +0000 UTC m=+1936.836686294" Oct 08 22:20:38 crc kubenswrapper[4739]: I1008 22:20:38.823771 4739 scope.go:117] "RemoveContainer" containerID="cd237d888e8c93206be5f4a7fb5248e050c6dfe54b10244705d72109bf176a06" Oct 08 
22:20:38 crc kubenswrapper[4739]: E1008 22:20:38.824646 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:20:42 crc kubenswrapper[4739]: I1008 22:20:42.046874 4739 generic.go:334] "Generic (PLEG): container finished" podID="4b344b99-c3a5-4d79-ad85-b8589d6489b0" containerID="3644d6f08ab25d989ca047295d555c96942cdb74cb85ebbad34efd7b95764a9b" exitCode=0 Oct 08 22:20:42 crc kubenswrapper[4739]: I1008 22:20:42.046940 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb" event={"ID":"4b344b99-c3a5-4d79-ad85-b8589d6489b0","Type":"ContainerDied","Data":"3644d6f08ab25d989ca047295d555c96942cdb74cb85ebbad34efd7b95764a9b"} Oct 08 22:20:43 crc kubenswrapper[4739]: I1008 22:20:43.482847 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb" Oct 08 22:20:43 crc kubenswrapper[4739]: I1008 22:20:43.595499 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b344b99-c3a5-4d79-ad85-b8589d6489b0-ssh-key\") pod \"4b344b99-c3a5-4d79-ad85-b8589d6489b0\" (UID: \"4b344b99-c3a5-4d79-ad85-b8589d6489b0\") " Oct 08 22:20:43 crc kubenswrapper[4739]: I1008 22:20:43.595871 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56rch\" (UniqueName: \"kubernetes.io/projected/4b344b99-c3a5-4d79-ad85-b8589d6489b0-kube-api-access-56rch\") pod \"4b344b99-c3a5-4d79-ad85-b8589d6489b0\" (UID: \"4b344b99-c3a5-4d79-ad85-b8589d6489b0\") " Oct 08 22:20:43 crc kubenswrapper[4739]: I1008 22:20:43.595946 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b344b99-c3a5-4d79-ad85-b8589d6489b0-inventory\") pod \"4b344b99-c3a5-4d79-ad85-b8589d6489b0\" (UID: \"4b344b99-c3a5-4d79-ad85-b8589d6489b0\") " Oct 08 22:20:43 crc kubenswrapper[4739]: I1008 22:20:43.605115 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b344b99-c3a5-4d79-ad85-b8589d6489b0-kube-api-access-56rch" (OuterVolumeSpecName: "kube-api-access-56rch") pod "4b344b99-c3a5-4d79-ad85-b8589d6489b0" (UID: "4b344b99-c3a5-4d79-ad85-b8589d6489b0"). InnerVolumeSpecName "kube-api-access-56rch". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:20:43 crc kubenswrapper[4739]: I1008 22:20:43.630285 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b344b99-c3a5-4d79-ad85-b8589d6489b0-inventory" (OuterVolumeSpecName: "inventory") pod "4b344b99-c3a5-4d79-ad85-b8589d6489b0" (UID: "4b344b99-c3a5-4d79-ad85-b8589d6489b0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:20:43 crc kubenswrapper[4739]: I1008 22:20:43.633809 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b344b99-c3a5-4d79-ad85-b8589d6489b0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4b344b99-c3a5-4d79-ad85-b8589d6489b0" (UID: "4b344b99-c3a5-4d79-ad85-b8589d6489b0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:20:43 crc kubenswrapper[4739]: I1008 22:20:43.700309 4739 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b344b99-c3a5-4d79-ad85-b8589d6489b0-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 22:20:43 crc kubenswrapper[4739]: I1008 22:20:43.700350 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56rch\" (UniqueName: \"kubernetes.io/projected/4b344b99-c3a5-4d79-ad85-b8589d6489b0-kube-api-access-56rch\") on node \"crc\" DevicePath \"\"" Oct 08 22:20:43 crc kubenswrapper[4739]: I1008 22:20:43.700363 4739 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b344b99-c3a5-4d79-ad85-b8589d6489b0-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 22:20:44 crc kubenswrapper[4739]: I1008 22:20:44.089192 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb" event={"ID":"4b344b99-c3a5-4d79-ad85-b8589d6489b0","Type":"ContainerDied","Data":"067b8516d00701c4e3813d500a08be784dd72a3b024b59cad4e3245465a30b24"} Oct 08 22:20:44 crc kubenswrapper[4739]: I1008 22:20:44.089614 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="067b8516d00701c4e3813d500a08be784dd72a3b024b59cad4e3245465a30b24" Oct 08 22:20:44 crc kubenswrapper[4739]: I1008 22:20:44.089719 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb" Oct 08 22:20:44 crc kubenswrapper[4739]: I1008 22:20:44.161647 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kldb"] Oct 08 22:20:44 crc kubenswrapper[4739]: E1008 22:20:44.162072 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b344b99-c3a5-4d79-ad85-b8589d6489b0" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 08 22:20:44 crc kubenswrapper[4739]: I1008 22:20:44.162092 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b344b99-c3a5-4d79-ad85-b8589d6489b0" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 08 22:20:44 crc kubenswrapper[4739]: I1008 22:20:44.162346 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b344b99-c3a5-4d79-ad85-b8589d6489b0" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 08 22:20:44 crc kubenswrapper[4739]: I1008 22:20:44.163082 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kldb" Oct 08 22:20:44 crc kubenswrapper[4739]: I1008 22:20:44.166918 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 22:20:44 crc kubenswrapper[4739]: I1008 22:20:44.167001 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl5kv" Oct 08 22:20:44 crc kubenswrapper[4739]: I1008 22:20:44.167089 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 22:20:44 crc kubenswrapper[4739]: I1008 22:20:44.172022 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 22:20:44 crc kubenswrapper[4739]: I1008 22:20:44.179079 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kldb"] Oct 08 22:20:44 crc kubenswrapper[4739]: I1008 22:20:44.212013 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckj8n\" (UniqueName: \"kubernetes.io/projected/9b1fcab8-e84d-433d-ac57-62a00dc6f557-kube-api-access-ckj8n\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4kldb\" (UID: \"9b1fcab8-e84d-433d-ac57-62a00dc6f557\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kldb" Oct 08 22:20:44 crc kubenswrapper[4739]: I1008 22:20:44.212322 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b1fcab8-e84d-433d-ac57-62a00dc6f557-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4kldb\" (UID: \"9b1fcab8-e84d-433d-ac57-62a00dc6f557\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kldb" Oct 08 22:20:44 crc kubenswrapper[4739]: I1008 22:20:44.212388 4739 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b1fcab8-e84d-433d-ac57-62a00dc6f557-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4kldb\" (UID: \"9b1fcab8-e84d-433d-ac57-62a00dc6f557\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kldb" Oct 08 22:20:44 crc kubenswrapper[4739]: I1008 22:20:44.313714 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckj8n\" (UniqueName: \"kubernetes.io/projected/9b1fcab8-e84d-433d-ac57-62a00dc6f557-kube-api-access-ckj8n\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4kldb\" (UID: \"9b1fcab8-e84d-433d-ac57-62a00dc6f557\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kldb" Oct 08 22:20:44 crc kubenswrapper[4739]: I1008 22:20:44.314061 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b1fcab8-e84d-433d-ac57-62a00dc6f557-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4kldb\" (UID: \"9b1fcab8-e84d-433d-ac57-62a00dc6f557\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kldb" Oct 08 22:20:44 crc kubenswrapper[4739]: I1008 22:20:44.314286 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b1fcab8-e84d-433d-ac57-62a00dc6f557-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4kldb\" (UID: \"9b1fcab8-e84d-433d-ac57-62a00dc6f557\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kldb" Oct 08 22:20:44 crc kubenswrapper[4739]: I1008 22:20:44.321047 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b1fcab8-e84d-433d-ac57-62a00dc6f557-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4kldb\" (UID: 
\"9b1fcab8-e84d-433d-ac57-62a00dc6f557\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kldb" Oct 08 22:20:44 crc kubenswrapper[4739]: I1008 22:20:44.321119 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b1fcab8-e84d-433d-ac57-62a00dc6f557-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4kldb\" (UID: \"9b1fcab8-e84d-433d-ac57-62a00dc6f557\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kldb" Oct 08 22:20:44 crc kubenswrapper[4739]: I1008 22:20:44.341372 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckj8n\" (UniqueName: \"kubernetes.io/projected/9b1fcab8-e84d-433d-ac57-62a00dc6f557-kube-api-access-ckj8n\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4kldb\" (UID: \"9b1fcab8-e84d-433d-ac57-62a00dc6f557\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kldb" Oct 08 22:20:44 crc kubenswrapper[4739]: I1008 22:20:44.480095 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kldb" Oct 08 22:20:45 crc kubenswrapper[4739]: I1008 22:20:45.098369 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kldb"] Oct 08 22:20:46 crc kubenswrapper[4739]: I1008 22:20:46.126986 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kldb" event={"ID":"9b1fcab8-e84d-433d-ac57-62a00dc6f557","Type":"ContainerStarted","Data":"894c95bcf931c65bfa0c797e99ae9a0acd38701e7ca3dbc254e9fa56b85cebef"} Oct 08 22:20:46 crc kubenswrapper[4739]: I1008 22:20:46.128185 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kldb" event={"ID":"9b1fcab8-e84d-433d-ac57-62a00dc6f557","Type":"ContainerStarted","Data":"b612928d1127caf472a6ba5a1b435cd035eee788a80d27dc3bbccfea4e927996"} Oct 08 22:20:46 crc kubenswrapper[4739]: I1008 22:20:46.156415 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kldb" podStartSLOduration=1.728919435 podStartE2EDuration="2.156388242s" podCreationTimestamp="2025-10-08 22:20:44 +0000 UTC" firstStartedPulling="2025-10-08 22:20:45.110125544 +0000 UTC m=+1944.935511304" lastFinishedPulling="2025-10-08 22:20:45.537594321 +0000 UTC m=+1945.362980111" observedRunningTime="2025-10-08 22:20:46.14650912 +0000 UTC m=+1945.971894870" watchObservedRunningTime="2025-10-08 22:20:46.156388242 +0000 UTC m=+1945.981774002" Oct 08 22:20:49 crc kubenswrapper[4739]: I1008 22:20:49.056666 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5zvg9"] Oct 08 22:20:49 crc kubenswrapper[4739]: I1008 22:20:49.074466 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5zvg9"] Oct 08 22:20:49 crc kubenswrapper[4739]: I1008 
22:20:49.838045 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4a90847-544d-45f9-b1c1-862b13309b66" path="/var/lib/kubelet/pods/b4a90847-544d-45f9-b1c1-862b13309b66/volumes" Oct 08 22:20:52 crc kubenswrapper[4739]: I1008 22:20:52.822698 4739 scope.go:117] "RemoveContainer" containerID="cd237d888e8c93206be5f4a7fb5248e050c6dfe54b10244705d72109bf176a06" Oct 08 22:20:52 crc kubenswrapper[4739]: E1008 22:20:52.825391 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:21:07 crc kubenswrapper[4739]: I1008 22:21:07.822072 4739 scope.go:117] "RemoveContainer" containerID="cd237d888e8c93206be5f4a7fb5248e050c6dfe54b10244705d72109bf176a06" Oct 08 22:21:07 crc kubenswrapper[4739]: E1008 22:21:07.823720 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:21:18 crc kubenswrapper[4739]: I1008 22:21:18.822368 4739 scope.go:117] "RemoveContainer" containerID="cd237d888e8c93206be5f4a7fb5248e050c6dfe54b10244705d72109bf176a06" Oct 08 22:21:18 crc kubenswrapper[4739]: E1008 22:21:18.823763 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:21:23 crc kubenswrapper[4739]: I1008 22:21:23.061434 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-fjj6g"] Oct 08 22:21:23 crc kubenswrapper[4739]: I1008 22:21:23.073014 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-fjj6g"] Oct 08 22:21:23 crc kubenswrapper[4739]: I1008 22:21:23.839163 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e27fb5e7-8d1e-4afb-8a82-86b4619bf330" path="/var/lib/kubelet/pods/e27fb5e7-8d1e-4afb-8a82-86b4619bf330/volumes" Oct 08 22:21:26 crc kubenswrapper[4739]: I1008 22:21:26.630681 4739 generic.go:334] "Generic (PLEG): container finished" podID="9b1fcab8-e84d-433d-ac57-62a00dc6f557" containerID="894c95bcf931c65bfa0c797e99ae9a0acd38701e7ca3dbc254e9fa56b85cebef" exitCode=0 Oct 08 22:21:26 crc kubenswrapper[4739]: I1008 22:21:26.630772 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kldb" event={"ID":"9b1fcab8-e84d-433d-ac57-62a00dc6f557","Type":"ContainerDied","Data":"894c95bcf931c65bfa0c797e99ae9a0acd38701e7ca3dbc254e9fa56b85cebef"} Oct 08 22:21:28 crc kubenswrapper[4739]: I1008 22:21:28.210085 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kldb" Oct 08 22:21:28 crc kubenswrapper[4739]: I1008 22:21:28.358369 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b1fcab8-e84d-433d-ac57-62a00dc6f557-ssh-key\") pod \"9b1fcab8-e84d-433d-ac57-62a00dc6f557\" (UID: \"9b1fcab8-e84d-433d-ac57-62a00dc6f557\") " Oct 08 22:21:28 crc kubenswrapper[4739]: I1008 22:21:28.358616 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b1fcab8-e84d-433d-ac57-62a00dc6f557-inventory\") pod \"9b1fcab8-e84d-433d-ac57-62a00dc6f557\" (UID: \"9b1fcab8-e84d-433d-ac57-62a00dc6f557\") " Oct 08 22:21:28 crc kubenswrapper[4739]: I1008 22:21:28.358755 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckj8n\" (UniqueName: \"kubernetes.io/projected/9b1fcab8-e84d-433d-ac57-62a00dc6f557-kube-api-access-ckj8n\") pod \"9b1fcab8-e84d-433d-ac57-62a00dc6f557\" (UID: \"9b1fcab8-e84d-433d-ac57-62a00dc6f557\") " Oct 08 22:21:28 crc kubenswrapper[4739]: I1008 22:21:28.367616 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b1fcab8-e84d-433d-ac57-62a00dc6f557-kube-api-access-ckj8n" (OuterVolumeSpecName: "kube-api-access-ckj8n") pod "9b1fcab8-e84d-433d-ac57-62a00dc6f557" (UID: "9b1fcab8-e84d-433d-ac57-62a00dc6f557"). InnerVolumeSpecName "kube-api-access-ckj8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:21:28 crc kubenswrapper[4739]: I1008 22:21:28.405685 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b1fcab8-e84d-433d-ac57-62a00dc6f557-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9b1fcab8-e84d-433d-ac57-62a00dc6f557" (UID: "9b1fcab8-e84d-433d-ac57-62a00dc6f557"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:21:28 crc kubenswrapper[4739]: I1008 22:21:28.417458 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b1fcab8-e84d-433d-ac57-62a00dc6f557-inventory" (OuterVolumeSpecName: "inventory") pod "9b1fcab8-e84d-433d-ac57-62a00dc6f557" (UID: "9b1fcab8-e84d-433d-ac57-62a00dc6f557"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:21:28 crc kubenswrapper[4739]: I1008 22:21:28.462553 4739 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b1fcab8-e84d-433d-ac57-62a00dc6f557-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 22:21:28 crc kubenswrapper[4739]: I1008 22:21:28.462641 4739 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b1fcab8-e84d-433d-ac57-62a00dc6f557-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 22:21:28 crc kubenswrapper[4739]: I1008 22:21:28.462663 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckj8n\" (UniqueName: \"kubernetes.io/projected/9b1fcab8-e84d-433d-ac57-62a00dc6f557-kube-api-access-ckj8n\") on node \"crc\" DevicePath \"\"" Oct 08 22:21:28 crc kubenswrapper[4739]: I1008 22:21:28.662252 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kldb" event={"ID":"9b1fcab8-e84d-433d-ac57-62a00dc6f557","Type":"ContainerDied","Data":"b612928d1127caf472a6ba5a1b435cd035eee788a80d27dc3bbccfea4e927996"} Oct 08 22:21:28 crc kubenswrapper[4739]: I1008 22:21:28.662354 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b612928d1127caf472a6ba5a1b435cd035eee788a80d27dc3bbccfea4e927996" Oct 08 22:21:28 crc kubenswrapper[4739]: I1008 22:21:28.662383 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kldb" Oct 08 22:21:28 crc kubenswrapper[4739]: I1008 22:21:28.763927 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-82kg2"] Oct 08 22:21:28 crc kubenswrapper[4739]: E1008 22:21:28.764411 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b1fcab8-e84d-433d-ac57-62a00dc6f557" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 08 22:21:28 crc kubenswrapper[4739]: I1008 22:21:28.764434 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b1fcab8-e84d-433d-ac57-62a00dc6f557" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 08 22:21:28 crc kubenswrapper[4739]: I1008 22:21:28.764812 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b1fcab8-e84d-433d-ac57-62a00dc6f557" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 08 22:21:28 crc kubenswrapper[4739]: I1008 22:21:28.765715 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-82kg2" Oct 08 22:21:28 crc kubenswrapper[4739]: I1008 22:21:28.768352 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl5kv" Oct 08 22:21:28 crc kubenswrapper[4739]: I1008 22:21:28.769657 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 22:21:28 crc kubenswrapper[4739]: I1008 22:21:28.770221 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 22:21:28 crc kubenswrapper[4739]: I1008 22:21:28.770498 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 22:21:28 crc kubenswrapper[4739]: I1008 22:21:28.783981 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-82kg2"] Oct 08 22:21:28 crc kubenswrapper[4739]: I1008 22:21:28.870961 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-82kg2\" (UID: \"23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-82kg2" Oct 08 22:21:28 crc kubenswrapper[4739]: I1008 22:21:28.871053 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96cqj\" (UniqueName: \"kubernetes.io/projected/23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f-kube-api-access-96cqj\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-82kg2\" (UID: \"23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-82kg2" Oct 08 22:21:28 crc kubenswrapper[4739]: I1008 22:21:28.871122 4739 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-82kg2\" (UID: \"23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-82kg2" Oct 08 22:21:28 crc kubenswrapper[4739]: I1008 22:21:28.973982 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-82kg2\" (UID: \"23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-82kg2" Oct 08 22:21:28 crc kubenswrapper[4739]: I1008 22:21:28.974203 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96cqj\" (UniqueName: \"kubernetes.io/projected/23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f-kube-api-access-96cqj\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-82kg2\" (UID: \"23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-82kg2" Oct 08 22:21:28 crc kubenswrapper[4739]: I1008 22:21:28.974412 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-82kg2\" (UID: \"23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-82kg2" Oct 08 22:21:28 crc kubenswrapper[4739]: I1008 22:21:28.980289 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-82kg2\" (UID: 
\"23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-82kg2" Oct 08 22:21:28 crc kubenswrapper[4739]: I1008 22:21:28.981052 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-82kg2\" (UID: \"23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-82kg2" Oct 08 22:21:29 crc kubenswrapper[4739]: I1008 22:21:29.003957 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96cqj\" (UniqueName: \"kubernetes.io/projected/23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f-kube-api-access-96cqj\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-82kg2\" (UID: \"23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-82kg2" Oct 08 22:21:29 crc kubenswrapper[4739]: I1008 22:21:29.136702 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-82kg2" Oct 08 22:21:29 crc kubenswrapper[4739]: I1008 22:21:29.574393 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-82kg2"] Oct 08 22:21:29 crc kubenswrapper[4739]: I1008 22:21:29.680765 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-82kg2" event={"ID":"23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f","Type":"ContainerStarted","Data":"27106d0fb3d343ecee6612eb3135f1806be311d7e01940753fe45895a454c64a"} Oct 08 22:21:29 crc kubenswrapper[4739]: I1008 22:21:29.823175 4739 scope.go:117] "RemoveContainer" containerID="cd237d888e8c93206be5f4a7fb5248e050c6dfe54b10244705d72109bf176a06" Oct 08 22:21:30 crc kubenswrapper[4739]: I1008 22:21:30.706821 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerStarted","Data":"c69d7e306979356b55250a2871dae7b00a44dffff4dc7c74269008f52fd183a9"} Oct 08 22:21:30 crc kubenswrapper[4739]: I1008 22:21:30.717642 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-82kg2" event={"ID":"23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f","Type":"ContainerStarted","Data":"6b9be949690f0066931b08d0f378973f54149deb0d47878c77eae32f903912ca"} Oct 08 22:21:30 crc kubenswrapper[4739]: I1008 22:21:30.776992 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-82kg2" podStartSLOduration=2.259340983 podStartE2EDuration="2.776967145s" podCreationTimestamp="2025-10-08 22:21:28 +0000 UTC" firstStartedPulling="2025-10-08 22:21:29.586298832 +0000 UTC m=+1989.411684612" lastFinishedPulling="2025-10-08 22:21:30.103925024 +0000 UTC m=+1989.929310774" 
observedRunningTime="2025-10-08 22:21:30.768965998 +0000 UTC m=+1990.594351758" watchObservedRunningTime="2025-10-08 22:21:30.776967145 +0000 UTC m=+1990.602352915" Oct 08 22:21:32 crc kubenswrapper[4739]: I1008 22:21:32.245634 4739 scope.go:117] "RemoveContainer" containerID="5086c7809928752cf03517b21200024f71f295cfb4e5f17e1ab8a62bcf86a04f" Oct 08 22:21:32 crc kubenswrapper[4739]: I1008 22:21:32.338848 4739 scope.go:117] "RemoveContainer" containerID="b10729a7a20b72d90fb78c97dba4e910de8cb486ab4fbe792405c20d9906207c" Oct 08 22:21:32 crc kubenswrapper[4739]: I1008 22:21:32.385881 4739 scope.go:117] "RemoveContainer" containerID="ac36c2acfc0d70ec251af785a08a99591027cca49a1d0bbc24a22c0fdc93bdff" Oct 08 22:21:49 crc kubenswrapper[4739]: I1008 22:21:49.921285 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-llbzv"] Oct 08 22:21:49 crc kubenswrapper[4739]: I1008 22:21:49.924030 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-llbzv" Oct 08 22:21:49 crc kubenswrapper[4739]: I1008 22:21:49.964341 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-llbzv"] Oct 08 22:21:50 crc kubenswrapper[4739]: I1008 22:21:50.062274 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv58p\" (UniqueName: \"kubernetes.io/projected/f405c872-684c-4033-9a16-8c26e8787df7-kube-api-access-qv58p\") pod \"redhat-operators-llbzv\" (UID: \"f405c872-684c-4033-9a16-8c26e8787df7\") " pod="openshift-marketplace/redhat-operators-llbzv" Oct 08 22:21:50 crc kubenswrapper[4739]: I1008 22:21:50.062879 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f405c872-684c-4033-9a16-8c26e8787df7-utilities\") pod \"redhat-operators-llbzv\" (UID: \"f405c872-684c-4033-9a16-8c26e8787df7\") " 
pod="openshift-marketplace/redhat-operators-llbzv" Oct 08 22:21:50 crc kubenswrapper[4739]: I1008 22:21:50.062961 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f405c872-684c-4033-9a16-8c26e8787df7-catalog-content\") pod \"redhat-operators-llbzv\" (UID: \"f405c872-684c-4033-9a16-8c26e8787df7\") " pod="openshift-marketplace/redhat-operators-llbzv" Oct 08 22:21:50 crc kubenswrapper[4739]: I1008 22:21:50.165449 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f405c872-684c-4033-9a16-8c26e8787df7-utilities\") pod \"redhat-operators-llbzv\" (UID: \"f405c872-684c-4033-9a16-8c26e8787df7\") " pod="openshift-marketplace/redhat-operators-llbzv" Oct 08 22:21:50 crc kubenswrapper[4739]: I1008 22:21:50.165521 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f405c872-684c-4033-9a16-8c26e8787df7-catalog-content\") pod \"redhat-operators-llbzv\" (UID: \"f405c872-684c-4033-9a16-8c26e8787df7\") " pod="openshift-marketplace/redhat-operators-llbzv" Oct 08 22:21:50 crc kubenswrapper[4739]: I1008 22:21:50.165619 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv58p\" (UniqueName: \"kubernetes.io/projected/f405c872-684c-4033-9a16-8c26e8787df7-kube-api-access-qv58p\") pod \"redhat-operators-llbzv\" (UID: \"f405c872-684c-4033-9a16-8c26e8787df7\") " pod="openshift-marketplace/redhat-operators-llbzv" Oct 08 22:21:50 crc kubenswrapper[4739]: I1008 22:21:50.166238 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f405c872-684c-4033-9a16-8c26e8787df7-utilities\") pod \"redhat-operators-llbzv\" (UID: \"f405c872-684c-4033-9a16-8c26e8787df7\") " 
pod="openshift-marketplace/redhat-operators-llbzv" Oct 08 22:21:50 crc kubenswrapper[4739]: I1008 22:21:50.166293 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f405c872-684c-4033-9a16-8c26e8787df7-catalog-content\") pod \"redhat-operators-llbzv\" (UID: \"f405c872-684c-4033-9a16-8c26e8787df7\") " pod="openshift-marketplace/redhat-operators-llbzv" Oct 08 22:21:50 crc kubenswrapper[4739]: I1008 22:21:50.192037 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv58p\" (UniqueName: \"kubernetes.io/projected/f405c872-684c-4033-9a16-8c26e8787df7-kube-api-access-qv58p\") pod \"redhat-operators-llbzv\" (UID: \"f405c872-684c-4033-9a16-8c26e8787df7\") " pod="openshift-marketplace/redhat-operators-llbzv" Oct 08 22:21:50 crc kubenswrapper[4739]: I1008 22:21:50.252770 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-llbzv" Oct 08 22:21:50 crc kubenswrapper[4739]: I1008 22:21:50.756983 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-llbzv"] Oct 08 22:21:50 crc kubenswrapper[4739]: W1008 22:21:50.768312 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf405c872_684c_4033_9a16_8c26e8787df7.slice/crio-9ccffdfc78165f9b1212d3f37f8421a36682a1d75a3320532a701f657a7177cc WatchSource:0}: Error finding container 9ccffdfc78165f9b1212d3f37f8421a36682a1d75a3320532a701f657a7177cc: Status 404 returned error can't find the container with id 9ccffdfc78165f9b1212d3f37f8421a36682a1d75a3320532a701f657a7177cc Oct 08 22:21:50 crc kubenswrapper[4739]: I1008 22:21:50.965255 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llbzv" 
event={"ID":"f405c872-684c-4033-9a16-8c26e8787df7","Type":"ContainerStarted","Data":"9ccffdfc78165f9b1212d3f37f8421a36682a1d75a3320532a701f657a7177cc"} Oct 08 22:21:51 crc kubenswrapper[4739]: I1008 22:21:51.983593 4739 generic.go:334] "Generic (PLEG): container finished" podID="f405c872-684c-4033-9a16-8c26e8787df7" containerID="98d776b189b04a26ef2474197455b29c521ee605dbec23c97c7334f48cc60954" exitCode=0 Oct 08 22:21:51 crc kubenswrapper[4739]: I1008 22:21:51.983672 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llbzv" event={"ID":"f405c872-684c-4033-9a16-8c26e8787df7","Type":"ContainerDied","Data":"98d776b189b04a26ef2474197455b29c521ee605dbec23c97c7334f48cc60954"} Oct 08 22:21:53 crc kubenswrapper[4739]: I1008 22:21:53.003067 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llbzv" event={"ID":"f405c872-684c-4033-9a16-8c26e8787df7","Type":"ContainerStarted","Data":"f39f9e29111ada718712532cb49c9ad8522b214e49df59773f5eb7887983c5c2"} Oct 08 22:21:55 crc kubenswrapper[4739]: I1008 22:21:55.035638 4739 generic.go:334] "Generic (PLEG): container finished" podID="f405c872-684c-4033-9a16-8c26e8787df7" containerID="f39f9e29111ada718712532cb49c9ad8522b214e49df59773f5eb7887983c5c2" exitCode=0 Oct 08 22:21:55 crc kubenswrapper[4739]: I1008 22:21:55.036400 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llbzv" event={"ID":"f405c872-684c-4033-9a16-8c26e8787df7","Type":"ContainerDied","Data":"f39f9e29111ada718712532cb49c9ad8522b214e49df59773f5eb7887983c5c2"} Oct 08 22:21:57 crc kubenswrapper[4739]: I1008 22:21:57.068705 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llbzv" event={"ID":"f405c872-684c-4033-9a16-8c26e8787df7","Type":"ContainerStarted","Data":"a1b670b0ebcca5b42443101e1aef384382edd674681231b03f3bf82063beb1f7"} Oct 08 22:21:57 crc kubenswrapper[4739]: I1008 
22:21:57.123211 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-llbzv" podStartSLOduration=4.149116432 podStartE2EDuration="8.123133774s" podCreationTimestamp="2025-10-08 22:21:49 +0000 UTC" firstStartedPulling="2025-10-08 22:21:51.986777817 +0000 UTC m=+2011.812163607" lastFinishedPulling="2025-10-08 22:21:55.960795159 +0000 UTC m=+2015.786180949" observedRunningTime="2025-10-08 22:21:57.101266725 +0000 UTC m=+2016.926652505" watchObservedRunningTime="2025-10-08 22:21:57.123133774 +0000 UTC m=+2016.948519574" Oct 08 22:22:00 crc kubenswrapper[4739]: I1008 22:22:00.253062 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-llbzv" Oct 08 22:22:00 crc kubenswrapper[4739]: I1008 22:22:00.253849 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-llbzv" Oct 08 22:22:01 crc kubenswrapper[4739]: I1008 22:22:01.330082 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-llbzv" podUID="f405c872-684c-4033-9a16-8c26e8787df7" containerName="registry-server" probeResult="failure" output=< Oct 08 22:22:01 crc kubenswrapper[4739]: timeout: failed to connect service ":50051" within 1s Oct 08 22:22:01 crc kubenswrapper[4739]: > Oct 08 22:22:10 crc kubenswrapper[4739]: I1008 22:22:10.324294 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-llbzv" Oct 08 22:22:10 crc kubenswrapper[4739]: I1008 22:22:10.419716 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-llbzv" Oct 08 22:22:10 crc kubenswrapper[4739]: I1008 22:22:10.578126 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-llbzv"] Oct 08 22:22:12 crc kubenswrapper[4739]: I1008 22:22:12.271787 4739 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-llbzv" podUID="f405c872-684c-4033-9a16-8c26e8787df7" containerName="registry-server" containerID="cri-o://a1b670b0ebcca5b42443101e1aef384382edd674681231b03f3bf82063beb1f7" gracePeriod=2 Oct 08 22:22:12 crc kubenswrapper[4739]: I1008 22:22:12.866261 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-llbzv" Oct 08 22:22:12 crc kubenswrapper[4739]: I1008 22:22:12.894958 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f405c872-684c-4033-9a16-8c26e8787df7-catalog-content\") pod \"f405c872-684c-4033-9a16-8c26e8787df7\" (UID: \"f405c872-684c-4033-9a16-8c26e8787df7\") " Oct 08 22:22:12 crc kubenswrapper[4739]: I1008 22:22:12.895075 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f405c872-684c-4033-9a16-8c26e8787df7-utilities\") pod \"f405c872-684c-4033-9a16-8c26e8787df7\" (UID: \"f405c872-684c-4033-9a16-8c26e8787df7\") " Oct 08 22:22:12 crc kubenswrapper[4739]: I1008 22:22:12.895271 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv58p\" (UniqueName: \"kubernetes.io/projected/f405c872-684c-4033-9a16-8c26e8787df7-kube-api-access-qv58p\") pod \"f405c872-684c-4033-9a16-8c26e8787df7\" (UID: \"f405c872-684c-4033-9a16-8c26e8787df7\") " Oct 08 22:22:12 crc kubenswrapper[4739]: I1008 22:22:12.949773 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f405c872-684c-4033-9a16-8c26e8787df7-utilities" (OuterVolumeSpecName: "utilities") pod "f405c872-684c-4033-9a16-8c26e8787df7" (UID: "f405c872-684c-4033-9a16-8c26e8787df7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:22:12 crc kubenswrapper[4739]: I1008 22:22:12.954728 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f405c872-684c-4033-9a16-8c26e8787df7-kube-api-access-qv58p" (OuterVolumeSpecName: "kube-api-access-qv58p") pod "f405c872-684c-4033-9a16-8c26e8787df7" (UID: "f405c872-684c-4033-9a16-8c26e8787df7"). InnerVolumeSpecName "kube-api-access-qv58p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:22:12 crc kubenswrapper[4739]: I1008 22:22:12.997936 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv58p\" (UniqueName: \"kubernetes.io/projected/f405c872-684c-4033-9a16-8c26e8787df7-kube-api-access-qv58p\") on node \"crc\" DevicePath \"\"" Oct 08 22:22:12 crc kubenswrapper[4739]: I1008 22:22:12.997975 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f405c872-684c-4033-9a16-8c26e8787df7-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:22:13 crc kubenswrapper[4739]: I1008 22:22:13.054071 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f405c872-684c-4033-9a16-8c26e8787df7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f405c872-684c-4033-9a16-8c26e8787df7" (UID: "f405c872-684c-4033-9a16-8c26e8787df7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:22:13 crc kubenswrapper[4739]: I1008 22:22:13.100672 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f405c872-684c-4033-9a16-8c26e8787df7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:22:13 crc kubenswrapper[4739]: I1008 22:22:13.288427 4739 generic.go:334] "Generic (PLEG): container finished" podID="f405c872-684c-4033-9a16-8c26e8787df7" containerID="a1b670b0ebcca5b42443101e1aef384382edd674681231b03f3bf82063beb1f7" exitCode=0 Oct 08 22:22:13 crc kubenswrapper[4739]: I1008 22:22:13.288547 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llbzv" event={"ID":"f405c872-684c-4033-9a16-8c26e8787df7","Type":"ContainerDied","Data":"a1b670b0ebcca5b42443101e1aef384382edd674681231b03f3bf82063beb1f7"} Oct 08 22:22:13 crc kubenswrapper[4739]: I1008 22:22:13.288632 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-llbzv" Oct 08 22:22:13 crc kubenswrapper[4739]: I1008 22:22:13.289693 4739 scope.go:117] "RemoveContainer" containerID="a1b670b0ebcca5b42443101e1aef384382edd674681231b03f3bf82063beb1f7" Oct 08 22:22:13 crc kubenswrapper[4739]: I1008 22:22:13.289578 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llbzv" event={"ID":"f405c872-684c-4033-9a16-8c26e8787df7","Type":"ContainerDied","Data":"9ccffdfc78165f9b1212d3f37f8421a36682a1d75a3320532a701f657a7177cc"} Oct 08 22:22:13 crc kubenswrapper[4739]: I1008 22:22:13.322505 4739 scope.go:117] "RemoveContainer" containerID="f39f9e29111ada718712532cb49c9ad8522b214e49df59773f5eb7887983c5c2" Oct 08 22:22:13 crc kubenswrapper[4739]: I1008 22:22:13.353099 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-llbzv"] Oct 08 22:22:13 crc kubenswrapper[4739]: I1008 22:22:13.361004 4739 scope.go:117] "RemoveContainer" containerID="98d776b189b04a26ef2474197455b29c521ee605dbec23c97c7334f48cc60954" Oct 08 22:22:13 crc kubenswrapper[4739]: I1008 22:22:13.363608 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-llbzv"] Oct 08 22:22:13 crc kubenswrapper[4739]: I1008 22:22:13.388116 4739 scope.go:117] "RemoveContainer" containerID="a1b670b0ebcca5b42443101e1aef384382edd674681231b03f3bf82063beb1f7" Oct 08 22:22:13 crc kubenswrapper[4739]: E1008 22:22:13.388751 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1b670b0ebcca5b42443101e1aef384382edd674681231b03f3bf82063beb1f7\": container with ID starting with a1b670b0ebcca5b42443101e1aef384382edd674681231b03f3bf82063beb1f7 not found: ID does not exist" containerID="a1b670b0ebcca5b42443101e1aef384382edd674681231b03f3bf82063beb1f7" Oct 08 22:22:13 crc kubenswrapper[4739]: I1008 22:22:13.388832 4739 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1b670b0ebcca5b42443101e1aef384382edd674681231b03f3bf82063beb1f7"} err="failed to get container status \"a1b670b0ebcca5b42443101e1aef384382edd674681231b03f3bf82063beb1f7\": rpc error: code = NotFound desc = could not find container \"a1b670b0ebcca5b42443101e1aef384382edd674681231b03f3bf82063beb1f7\": container with ID starting with a1b670b0ebcca5b42443101e1aef384382edd674681231b03f3bf82063beb1f7 not found: ID does not exist" Oct 08 22:22:13 crc kubenswrapper[4739]: I1008 22:22:13.388892 4739 scope.go:117] "RemoveContainer" containerID="f39f9e29111ada718712532cb49c9ad8522b214e49df59773f5eb7887983c5c2" Oct 08 22:22:13 crc kubenswrapper[4739]: E1008 22:22:13.389229 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f39f9e29111ada718712532cb49c9ad8522b214e49df59773f5eb7887983c5c2\": container with ID starting with f39f9e29111ada718712532cb49c9ad8522b214e49df59773f5eb7887983c5c2 not found: ID does not exist" containerID="f39f9e29111ada718712532cb49c9ad8522b214e49df59773f5eb7887983c5c2" Oct 08 22:22:13 crc kubenswrapper[4739]: I1008 22:22:13.389276 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f39f9e29111ada718712532cb49c9ad8522b214e49df59773f5eb7887983c5c2"} err="failed to get container status \"f39f9e29111ada718712532cb49c9ad8522b214e49df59773f5eb7887983c5c2\": rpc error: code = NotFound desc = could not find container \"f39f9e29111ada718712532cb49c9ad8522b214e49df59773f5eb7887983c5c2\": container with ID starting with f39f9e29111ada718712532cb49c9ad8522b214e49df59773f5eb7887983c5c2 not found: ID does not exist" Oct 08 22:22:13 crc kubenswrapper[4739]: I1008 22:22:13.389308 4739 scope.go:117] "RemoveContainer" containerID="98d776b189b04a26ef2474197455b29c521ee605dbec23c97c7334f48cc60954" Oct 08 22:22:13 crc kubenswrapper[4739]: E1008 
22:22:13.389793 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98d776b189b04a26ef2474197455b29c521ee605dbec23c97c7334f48cc60954\": container with ID starting with 98d776b189b04a26ef2474197455b29c521ee605dbec23c97c7334f48cc60954 not found: ID does not exist" containerID="98d776b189b04a26ef2474197455b29c521ee605dbec23c97c7334f48cc60954" Oct 08 22:22:13 crc kubenswrapper[4739]: I1008 22:22:13.389822 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98d776b189b04a26ef2474197455b29c521ee605dbec23c97c7334f48cc60954"} err="failed to get container status \"98d776b189b04a26ef2474197455b29c521ee605dbec23c97c7334f48cc60954\": rpc error: code = NotFound desc = could not find container \"98d776b189b04a26ef2474197455b29c521ee605dbec23c97c7334f48cc60954\": container with ID starting with 98d776b189b04a26ef2474197455b29c521ee605dbec23c97c7334f48cc60954 not found: ID does not exist" Oct 08 22:22:13 crc kubenswrapper[4739]: I1008 22:22:13.835790 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f405c872-684c-4033-9a16-8c26e8787df7" path="/var/lib/kubelet/pods/f405c872-684c-4033-9a16-8c26e8787df7/volumes" Oct 08 22:22:22 crc kubenswrapper[4739]: I1008 22:22:22.565669 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fznhr"] Oct 08 22:22:22 crc kubenswrapper[4739]: E1008 22:22:22.566950 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f405c872-684c-4033-9a16-8c26e8787df7" containerName="extract-content" Oct 08 22:22:22 crc kubenswrapper[4739]: I1008 22:22:22.566974 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="f405c872-684c-4033-9a16-8c26e8787df7" containerName="extract-content" Oct 08 22:22:22 crc kubenswrapper[4739]: E1008 22:22:22.567009 4739 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f405c872-684c-4033-9a16-8c26e8787df7" containerName="registry-server" Oct 08 22:22:22 crc kubenswrapper[4739]: I1008 22:22:22.567020 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="f405c872-684c-4033-9a16-8c26e8787df7" containerName="registry-server" Oct 08 22:22:22 crc kubenswrapper[4739]: E1008 22:22:22.567066 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f405c872-684c-4033-9a16-8c26e8787df7" containerName="extract-utilities" Oct 08 22:22:22 crc kubenswrapper[4739]: I1008 22:22:22.567079 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="f405c872-684c-4033-9a16-8c26e8787df7" containerName="extract-utilities" Oct 08 22:22:22 crc kubenswrapper[4739]: I1008 22:22:22.567361 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="f405c872-684c-4033-9a16-8c26e8787df7" containerName="registry-server" Oct 08 22:22:22 crc kubenswrapper[4739]: I1008 22:22:22.569444 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fznhr" Oct 08 22:22:22 crc kubenswrapper[4739]: I1008 22:22:22.587194 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fznhr"] Oct 08 22:22:22 crc kubenswrapper[4739]: I1008 22:22:22.638361 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/344bac1d-e709-4a96-8937-c4d8ab11b00e-catalog-content\") pod \"certified-operators-fznhr\" (UID: \"344bac1d-e709-4a96-8937-c4d8ab11b00e\") " pod="openshift-marketplace/certified-operators-fznhr" Oct 08 22:22:22 crc kubenswrapper[4739]: I1008 22:22:22.638433 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p84j6\" (UniqueName: \"kubernetes.io/projected/344bac1d-e709-4a96-8937-c4d8ab11b00e-kube-api-access-p84j6\") pod \"certified-operators-fznhr\" (UID: 
\"344bac1d-e709-4a96-8937-c4d8ab11b00e\") " pod="openshift-marketplace/certified-operators-fznhr" Oct 08 22:22:22 crc kubenswrapper[4739]: I1008 22:22:22.638664 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/344bac1d-e709-4a96-8937-c4d8ab11b00e-utilities\") pod \"certified-operators-fznhr\" (UID: \"344bac1d-e709-4a96-8937-c4d8ab11b00e\") " pod="openshift-marketplace/certified-operators-fznhr" Oct 08 22:22:22 crc kubenswrapper[4739]: I1008 22:22:22.740512 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/344bac1d-e709-4a96-8937-c4d8ab11b00e-catalog-content\") pod \"certified-operators-fznhr\" (UID: \"344bac1d-e709-4a96-8937-c4d8ab11b00e\") " pod="openshift-marketplace/certified-operators-fznhr" Oct 08 22:22:22 crc kubenswrapper[4739]: I1008 22:22:22.740596 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p84j6\" (UniqueName: \"kubernetes.io/projected/344bac1d-e709-4a96-8937-c4d8ab11b00e-kube-api-access-p84j6\") pod \"certified-operators-fznhr\" (UID: \"344bac1d-e709-4a96-8937-c4d8ab11b00e\") " pod="openshift-marketplace/certified-operators-fznhr" Oct 08 22:22:22 crc kubenswrapper[4739]: I1008 22:22:22.740740 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/344bac1d-e709-4a96-8937-c4d8ab11b00e-utilities\") pod \"certified-operators-fznhr\" (UID: \"344bac1d-e709-4a96-8937-c4d8ab11b00e\") " pod="openshift-marketplace/certified-operators-fznhr" Oct 08 22:22:22 crc kubenswrapper[4739]: I1008 22:22:22.741396 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/344bac1d-e709-4a96-8937-c4d8ab11b00e-utilities\") pod \"certified-operators-fznhr\" (UID: 
\"344bac1d-e709-4a96-8937-c4d8ab11b00e\") " pod="openshift-marketplace/certified-operators-fznhr" Oct 08 22:22:22 crc kubenswrapper[4739]: I1008 22:22:22.741406 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/344bac1d-e709-4a96-8937-c4d8ab11b00e-catalog-content\") pod \"certified-operators-fznhr\" (UID: \"344bac1d-e709-4a96-8937-c4d8ab11b00e\") " pod="openshift-marketplace/certified-operators-fznhr" Oct 08 22:22:22 crc kubenswrapper[4739]: I1008 22:22:22.779062 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p84j6\" (UniqueName: \"kubernetes.io/projected/344bac1d-e709-4a96-8937-c4d8ab11b00e-kube-api-access-p84j6\") pod \"certified-operators-fznhr\" (UID: \"344bac1d-e709-4a96-8937-c4d8ab11b00e\") " pod="openshift-marketplace/certified-operators-fznhr" Oct 08 22:22:22 crc kubenswrapper[4739]: I1008 22:22:22.899080 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fznhr" Oct 08 22:22:23 crc kubenswrapper[4739]: I1008 22:22:23.523647 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fznhr"] Oct 08 22:22:24 crc kubenswrapper[4739]: I1008 22:22:24.429257 4739 generic.go:334] "Generic (PLEG): container finished" podID="344bac1d-e709-4a96-8937-c4d8ab11b00e" containerID="a724a0b1510c634458594b0bcc5d73ca08b87a7de328e8527bef5dd2d325fcbf" exitCode=0 Oct 08 22:22:24 crc kubenswrapper[4739]: I1008 22:22:24.429327 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fznhr" event={"ID":"344bac1d-e709-4a96-8937-c4d8ab11b00e","Type":"ContainerDied","Data":"a724a0b1510c634458594b0bcc5d73ca08b87a7de328e8527bef5dd2d325fcbf"} Oct 08 22:22:24 crc kubenswrapper[4739]: I1008 22:22:24.429623 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fznhr" event={"ID":"344bac1d-e709-4a96-8937-c4d8ab11b00e","Type":"ContainerStarted","Data":"829993bb51bb34a843383716d3b322cc1e0879fd8609ca29188a00465639a53b"} Oct 08 22:22:26 crc kubenswrapper[4739]: I1008 22:22:26.462572 4739 generic.go:334] "Generic (PLEG): container finished" podID="344bac1d-e709-4a96-8937-c4d8ab11b00e" containerID="fe0d4ba16237bca556598cc4a44d29039e88218448c09c8b3f08061cf772182f" exitCode=0 Oct 08 22:22:26 crc kubenswrapper[4739]: I1008 22:22:26.462689 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fznhr" event={"ID":"344bac1d-e709-4a96-8937-c4d8ab11b00e","Type":"ContainerDied","Data":"fe0d4ba16237bca556598cc4a44d29039e88218448c09c8b3f08061cf772182f"} Oct 08 22:22:27 crc kubenswrapper[4739]: I1008 22:22:27.479130 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fznhr" 
event={"ID":"344bac1d-e709-4a96-8937-c4d8ab11b00e","Type":"ContainerStarted","Data":"becbf3f241f030855e6984270921741e9e7ef51f300fe47e1da3e8219baff099"} Oct 08 22:22:27 crc kubenswrapper[4739]: I1008 22:22:27.514903 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fznhr" podStartSLOduration=3.04692404 podStartE2EDuration="5.514880609s" podCreationTimestamp="2025-10-08 22:22:22 +0000 UTC" firstStartedPulling="2025-10-08 22:22:24.432062903 +0000 UTC m=+2044.257448643" lastFinishedPulling="2025-10-08 22:22:26.900019452 +0000 UTC m=+2046.725405212" observedRunningTime="2025-10-08 22:22:27.505737784 +0000 UTC m=+2047.331123554" watchObservedRunningTime="2025-10-08 22:22:27.514880609 +0000 UTC m=+2047.340266369" Oct 08 22:22:32 crc kubenswrapper[4739]: I1008 22:22:32.534949 4739 generic.go:334] "Generic (PLEG): container finished" podID="23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f" containerID="6b9be949690f0066931b08d0f378973f54149deb0d47878c77eae32f903912ca" exitCode=2 Oct 08 22:22:32 crc kubenswrapper[4739]: I1008 22:22:32.535103 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-82kg2" event={"ID":"23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f","Type":"ContainerDied","Data":"6b9be949690f0066931b08d0f378973f54149deb0d47878c77eae32f903912ca"} Oct 08 22:22:32 crc kubenswrapper[4739]: I1008 22:22:32.900031 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fznhr" Oct 08 22:22:32 crc kubenswrapper[4739]: I1008 22:22:32.900791 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fznhr" Oct 08 22:22:32 crc kubenswrapper[4739]: I1008 22:22:32.964496 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fznhr" Oct 08 22:22:33 crc kubenswrapper[4739]: I1008 
22:22:33.627545 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fznhr" Oct 08 22:22:33 crc kubenswrapper[4739]: I1008 22:22:33.696576 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fznhr"] Oct 08 22:22:34 crc kubenswrapper[4739]: I1008 22:22:34.072096 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-82kg2" Oct 08 22:22:34 crc kubenswrapper[4739]: I1008 22:22:34.222389 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f-ssh-key\") pod \"23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f\" (UID: \"23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f\") " Oct 08 22:22:34 crc kubenswrapper[4739]: I1008 22:22:34.222485 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f-inventory\") pod \"23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f\" (UID: \"23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f\") " Oct 08 22:22:34 crc kubenswrapper[4739]: I1008 22:22:34.222770 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96cqj\" (UniqueName: \"kubernetes.io/projected/23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f-kube-api-access-96cqj\") pod \"23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f\" (UID: \"23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f\") " Oct 08 22:22:34 crc kubenswrapper[4739]: I1008 22:22:34.230321 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f-kube-api-access-96cqj" (OuterVolumeSpecName: "kube-api-access-96cqj") pod "23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f" (UID: "23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f"). InnerVolumeSpecName "kube-api-access-96cqj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:22:34 crc kubenswrapper[4739]: I1008 22:22:34.260206 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f-inventory" (OuterVolumeSpecName: "inventory") pod "23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f" (UID: "23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:22:34 crc kubenswrapper[4739]: I1008 22:22:34.265304 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f" (UID: "23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:22:34 crc kubenswrapper[4739]: I1008 22:22:34.325233 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96cqj\" (UniqueName: \"kubernetes.io/projected/23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f-kube-api-access-96cqj\") on node \"crc\" DevicePath \"\"" Oct 08 22:22:34 crc kubenswrapper[4739]: I1008 22:22:34.325272 4739 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 22:22:34 crc kubenswrapper[4739]: I1008 22:22:34.325282 4739 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 22:22:34 crc kubenswrapper[4739]: I1008 22:22:34.563480 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-82kg2" 
event={"ID":"23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f","Type":"ContainerDied","Data":"27106d0fb3d343ecee6612eb3135f1806be311d7e01940753fe45895a454c64a"} Oct 08 22:22:34 crc kubenswrapper[4739]: I1008 22:22:34.563548 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27106d0fb3d343ecee6612eb3135f1806be311d7e01940753fe45895a454c64a" Oct 08 22:22:34 crc kubenswrapper[4739]: I1008 22:22:34.564207 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-82kg2" Oct 08 22:22:35 crc kubenswrapper[4739]: I1008 22:22:35.576566 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fznhr" podUID="344bac1d-e709-4a96-8937-c4d8ab11b00e" containerName="registry-server" containerID="cri-o://becbf3f241f030855e6984270921741e9e7ef51f300fe47e1da3e8219baff099" gracePeriod=2 Oct 08 22:22:36 crc kubenswrapper[4739]: I1008 22:22:36.045736 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fznhr" Oct 08 22:22:36 crc kubenswrapper[4739]: I1008 22:22:36.174306 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/344bac1d-e709-4a96-8937-c4d8ab11b00e-utilities\") pod \"344bac1d-e709-4a96-8937-c4d8ab11b00e\" (UID: \"344bac1d-e709-4a96-8937-c4d8ab11b00e\") " Oct 08 22:22:36 crc kubenswrapper[4739]: I1008 22:22:36.174434 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/344bac1d-e709-4a96-8937-c4d8ab11b00e-catalog-content\") pod \"344bac1d-e709-4a96-8937-c4d8ab11b00e\" (UID: \"344bac1d-e709-4a96-8937-c4d8ab11b00e\") " Oct 08 22:22:36 crc kubenswrapper[4739]: I1008 22:22:36.174687 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p84j6\" (UniqueName: \"kubernetes.io/projected/344bac1d-e709-4a96-8937-c4d8ab11b00e-kube-api-access-p84j6\") pod \"344bac1d-e709-4a96-8937-c4d8ab11b00e\" (UID: \"344bac1d-e709-4a96-8937-c4d8ab11b00e\") " Oct 08 22:22:36 crc kubenswrapper[4739]: I1008 22:22:36.175830 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/344bac1d-e709-4a96-8937-c4d8ab11b00e-utilities" (OuterVolumeSpecName: "utilities") pod "344bac1d-e709-4a96-8937-c4d8ab11b00e" (UID: "344bac1d-e709-4a96-8937-c4d8ab11b00e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:22:36 crc kubenswrapper[4739]: I1008 22:22:36.183865 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/344bac1d-e709-4a96-8937-c4d8ab11b00e-kube-api-access-p84j6" (OuterVolumeSpecName: "kube-api-access-p84j6") pod "344bac1d-e709-4a96-8937-c4d8ab11b00e" (UID: "344bac1d-e709-4a96-8937-c4d8ab11b00e"). InnerVolumeSpecName "kube-api-access-p84j6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:22:36 crc kubenswrapper[4739]: I1008 22:22:36.277258 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/344bac1d-e709-4a96-8937-c4d8ab11b00e-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:22:36 crc kubenswrapper[4739]: I1008 22:22:36.277300 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p84j6\" (UniqueName: \"kubernetes.io/projected/344bac1d-e709-4a96-8937-c4d8ab11b00e-kube-api-access-p84j6\") on node \"crc\" DevicePath \"\"" Oct 08 22:22:36 crc kubenswrapper[4739]: I1008 22:22:36.380648 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/344bac1d-e709-4a96-8937-c4d8ab11b00e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "344bac1d-e709-4a96-8937-c4d8ab11b00e" (UID: "344bac1d-e709-4a96-8937-c4d8ab11b00e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:22:36 crc kubenswrapper[4739]: I1008 22:22:36.482702 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/344bac1d-e709-4a96-8937-c4d8ab11b00e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:22:36 crc kubenswrapper[4739]: I1008 22:22:36.586654 4739 generic.go:334] "Generic (PLEG): container finished" podID="344bac1d-e709-4a96-8937-c4d8ab11b00e" containerID="becbf3f241f030855e6984270921741e9e7ef51f300fe47e1da3e8219baff099" exitCode=0 Oct 08 22:22:36 crc kubenswrapper[4739]: I1008 22:22:36.586731 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fznhr" Oct 08 22:22:36 crc kubenswrapper[4739]: I1008 22:22:36.586735 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fznhr" event={"ID":"344bac1d-e709-4a96-8937-c4d8ab11b00e","Type":"ContainerDied","Data":"becbf3f241f030855e6984270921741e9e7ef51f300fe47e1da3e8219baff099"} Oct 08 22:22:36 crc kubenswrapper[4739]: I1008 22:22:36.586881 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fznhr" event={"ID":"344bac1d-e709-4a96-8937-c4d8ab11b00e","Type":"ContainerDied","Data":"829993bb51bb34a843383716d3b322cc1e0879fd8609ca29188a00465639a53b"} Oct 08 22:22:36 crc kubenswrapper[4739]: I1008 22:22:36.586919 4739 scope.go:117] "RemoveContainer" containerID="becbf3f241f030855e6984270921741e9e7ef51f300fe47e1da3e8219baff099" Oct 08 22:22:36 crc kubenswrapper[4739]: I1008 22:22:36.611446 4739 scope.go:117] "RemoveContainer" containerID="fe0d4ba16237bca556598cc4a44d29039e88218448c09c8b3f08061cf772182f" Oct 08 22:22:36 crc kubenswrapper[4739]: I1008 22:22:36.634606 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fznhr"] Oct 08 22:22:36 crc kubenswrapper[4739]: I1008 22:22:36.642804 4739 scope.go:117] "RemoveContainer" containerID="a724a0b1510c634458594b0bcc5d73ca08b87a7de328e8527bef5dd2d325fcbf" Oct 08 22:22:36 crc kubenswrapper[4739]: I1008 22:22:36.648049 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fznhr"] Oct 08 22:22:36 crc kubenswrapper[4739]: I1008 22:22:36.684450 4739 scope.go:117] "RemoveContainer" containerID="becbf3f241f030855e6984270921741e9e7ef51f300fe47e1da3e8219baff099" Oct 08 22:22:36 crc kubenswrapper[4739]: E1008 22:22:36.685501 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"becbf3f241f030855e6984270921741e9e7ef51f300fe47e1da3e8219baff099\": container with ID starting with becbf3f241f030855e6984270921741e9e7ef51f300fe47e1da3e8219baff099 not found: ID does not exist" containerID="becbf3f241f030855e6984270921741e9e7ef51f300fe47e1da3e8219baff099" Oct 08 22:22:36 crc kubenswrapper[4739]: I1008 22:22:36.685550 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"becbf3f241f030855e6984270921741e9e7ef51f300fe47e1da3e8219baff099"} err="failed to get container status \"becbf3f241f030855e6984270921741e9e7ef51f300fe47e1da3e8219baff099\": rpc error: code = NotFound desc = could not find container \"becbf3f241f030855e6984270921741e9e7ef51f300fe47e1da3e8219baff099\": container with ID starting with becbf3f241f030855e6984270921741e9e7ef51f300fe47e1da3e8219baff099 not found: ID does not exist" Oct 08 22:22:36 crc kubenswrapper[4739]: I1008 22:22:36.685581 4739 scope.go:117] "RemoveContainer" containerID="fe0d4ba16237bca556598cc4a44d29039e88218448c09c8b3f08061cf772182f" Oct 08 22:22:36 crc kubenswrapper[4739]: E1008 22:22:36.685941 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe0d4ba16237bca556598cc4a44d29039e88218448c09c8b3f08061cf772182f\": container with ID starting with fe0d4ba16237bca556598cc4a44d29039e88218448c09c8b3f08061cf772182f not found: ID does not exist" containerID="fe0d4ba16237bca556598cc4a44d29039e88218448c09c8b3f08061cf772182f" Oct 08 22:22:36 crc kubenswrapper[4739]: I1008 22:22:36.685974 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe0d4ba16237bca556598cc4a44d29039e88218448c09c8b3f08061cf772182f"} err="failed to get container status \"fe0d4ba16237bca556598cc4a44d29039e88218448c09c8b3f08061cf772182f\": rpc error: code = NotFound desc = could not find container \"fe0d4ba16237bca556598cc4a44d29039e88218448c09c8b3f08061cf772182f\": container with ID 
starting with fe0d4ba16237bca556598cc4a44d29039e88218448c09c8b3f08061cf772182f not found: ID does not exist" Oct 08 22:22:36 crc kubenswrapper[4739]: I1008 22:22:36.685990 4739 scope.go:117] "RemoveContainer" containerID="a724a0b1510c634458594b0bcc5d73ca08b87a7de328e8527bef5dd2d325fcbf" Oct 08 22:22:36 crc kubenswrapper[4739]: E1008 22:22:36.686228 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a724a0b1510c634458594b0bcc5d73ca08b87a7de328e8527bef5dd2d325fcbf\": container with ID starting with a724a0b1510c634458594b0bcc5d73ca08b87a7de328e8527bef5dd2d325fcbf not found: ID does not exist" containerID="a724a0b1510c634458594b0bcc5d73ca08b87a7de328e8527bef5dd2d325fcbf" Oct 08 22:22:36 crc kubenswrapper[4739]: I1008 22:22:36.686248 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a724a0b1510c634458594b0bcc5d73ca08b87a7de328e8527bef5dd2d325fcbf"} err="failed to get container status \"a724a0b1510c634458594b0bcc5d73ca08b87a7de328e8527bef5dd2d325fcbf\": rpc error: code = NotFound desc = could not find container \"a724a0b1510c634458594b0bcc5d73ca08b87a7de328e8527bef5dd2d325fcbf\": container with ID starting with a724a0b1510c634458594b0bcc5d73ca08b87a7de328e8527bef5dd2d325fcbf not found: ID does not exist" Oct 08 22:22:37 crc kubenswrapper[4739]: I1008 22:22:37.842290 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="344bac1d-e709-4a96-8937-c4d8ab11b00e" path="/var/lib/kubelet/pods/344bac1d-e709-4a96-8937-c4d8ab11b00e/volumes" Oct 08 22:22:41 crc kubenswrapper[4739]: I1008 22:22:41.037200 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlzd"] Oct 08 22:22:41 crc kubenswrapper[4739]: E1008 22:22:41.039619 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="344bac1d-e709-4a96-8937-c4d8ab11b00e" containerName="extract-utilities" Oct 08 
22:22:41 crc kubenswrapper[4739]: I1008 22:22:41.039754 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="344bac1d-e709-4a96-8937-c4d8ab11b00e" containerName="extract-utilities" Oct 08 22:22:41 crc kubenswrapper[4739]: E1008 22:22:41.039851 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="344bac1d-e709-4a96-8937-c4d8ab11b00e" containerName="extract-content" Oct 08 22:22:41 crc kubenswrapper[4739]: I1008 22:22:41.039956 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="344bac1d-e709-4a96-8937-c4d8ab11b00e" containerName="extract-content" Oct 08 22:22:41 crc kubenswrapper[4739]: E1008 22:22:41.040050 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="344bac1d-e709-4a96-8937-c4d8ab11b00e" containerName="registry-server" Oct 08 22:22:41 crc kubenswrapper[4739]: I1008 22:22:41.040128 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="344bac1d-e709-4a96-8937-c4d8ab11b00e" containerName="registry-server" Oct 08 22:22:41 crc kubenswrapper[4739]: E1008 22:22:41.040723 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 08 22:22:41 crc kubenswrapper[4739]: I1008 22:22:41.040844 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 08 22:22:41 crc kubenswrapper[4739]: I1008 22:22:41.041208 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="344bac1d-e709-4a96-8937-c4d8ab11b00e" containerName="registry-server" Oct 08 22:22:41 crc kubenswrapper[4739]: I1008 22:22:41.041392 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 08 22:22:41 crc kubenswrapper[4739]: I1008 22:22:41.042594 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlzd" Oct 08 22:22:41 crc kubenswrapper[4739]: I1008 22:22:41.046009 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl5kv" Oct 08 22:22:41 crc kubenswrapper[4739]: I1008 22:22:41.050290 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 22:22:41 crc kubenswrapper[4739]: I1008 22:22:41.050629 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 22:22:41 crc kubenswrapper[4739]: I1008 22:22:41.050966 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 22:22:41 crc kubenswrapper[4739]: I1008 22:22:41.069074 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlzd"] Oct 08 22:22:41 crc kubenswrapper[4739]: I1008 22:22:41.185006 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a3f415ab-75ff-469e-84f4-5d2e9f4053e2-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-grlzd\" (UID: \"a3f415ab-75ff-469e-84f4-5d2e9f4053e2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlzd" Oct 08 22:22:41 crc kubenswrapper[4739]: I1008 22:22:41.185062 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a3f415ab-75ff-469e-84f4-5d2e9f4053e2-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-grlzd\" (UID: \"a3f415ab-75ff-469e-84f4-5d2e9f4053e2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlzd" Oct 08 22:22:41 crc kubenswrapper[4739]: I1008 22:22:41.185475 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7b6d\" (UniqueName: \"kubernetes.io/projected/a3f415ab-75ff-469e-84f4-5d2e9f4053e2-kube-api-access-q7b6d\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-grlzd\" (UID: \"a3f415ab-75ff-469e-84f4-5d2e9f4053e2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlzd" Oct 08 22:22:41 crc kubenswrapper[4739]: I1008 22:22:41.287975 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a3f415ab-75ff-469e-84f4-5d2e9f4053e2-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-grlzd\" (UID: \"a3f415ab-75ff-469e-84f4-5d2e9f4053e2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlzd" Oct 08 22:22:41 crc kubenswrapper[4739]: I1008 22:22:41.288087 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a3f415ab-75ff-469e-84f4-5d2e9f4053e2-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-grlzd\" (UID: \"a3f415ab-75ff-469e-84f4-5d2e9f4053e2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlzd" Oct 08 22:22:41 crc kubenswrapper[4739]: I1008 22:22:41.288241 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7b6d\" (UniqueName: \"kubernetes.io/projected/a3f415ab-75ff-469e-84f4-5d2e9f4053e2-kube-api-access-q7b6d\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-grlzd\" (UID: \"a3f415ab-75ff-469e-84f4-5d2e9f4053e2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlzd" Oct 08 22:22:41 crc kubenswrapper[4739]: I1008 22:22:41.294342 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a3f415ab-75ff-469e-84f4-5d2e9f4053e2-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-grlzd\" (UID: 
\"a3f415ab-75ff-469e-84f4-5d2e9f4053e2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlzd" Oct 08 22:22:41 crc kubenswrapper[4739]: I1008 22:22:41.294952 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a3f415ab-75ff-469e-84f4-5d2e9f4053e2-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-grlzd\" (UID: \"a3f415ab-75ff-469e-84f4-5d2e9f4053e2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlzd" Oct 08 22:22:41 crc kubenswrapper[4739]: I1008 22:22:41.317344 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7b6d\" (UniqueName: \"kubernetes.io/projected/a3f415ab-75ff-469e-84f4-5d2e9f4053e2-kube-api-access-q7b6d\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-grlzd\" (UID: \"a3f415ab-75ff-469e-84f4-5d2e9f4053e2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlzd" Oct 08 22:22:41 crc kubenswrapper[4739]: I1008 22:22:41.362926 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlzd" Oct 08 22:22:42 crc kubenswrapper[4739]: I1008 22:22:42.018529 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlzd"] Oct 08 22:22:42 crc kubenswrapper[4739]: I1008 22:22:42.665215 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlzd" event={"ID":"a3f415ab-75ff-469e-84f4-5d2e9f4053e2","Type":"ContainerStarted","Data":"64283d2f0e0518b467dc88a12e07a17f32c338585798e19c08063be99da56cc1"} Oct 08 22:22:43 crc kubenswrapper[4739]: I1008 22:22:43.679787 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlzd" event={"ID":"a3f415ab-75ff-469e-84f4-5d2e9f4053e2","Type":"ContainerStarted","Data":"dd46152531f46474131bcc94ec71e794751f0d4dc3821874447b3cb56f728901"} Oct 08 22:22:43 crc kubenswrapper[4739]: I1008 22:22:43.717039 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlzd" podStartSLOduration=2.219391225 podStartE2EDuration="2.717008995s" podCreationTimestamp="2025-10-08 22:22:41 +0000 UTC" firstStartedPulling="2025-10-08 22:22:41.99289525 +0000 UTC m=+2061.818281000" lastFinishedPulling="2025-10-08 22:22:42.49051298 +0000 UTC m=+2062.315898770" observedRunningTime="2025-10-08 22:22:43.705970183 +0000 UTC m=+2063.531355943" watchObservedRunningTime="2025-10-08 22:22:43.717008995 +0000 UTC m=+2063.542394785" Oct 08 22:23:36 crc kubenswrapper[4739]: I1008 22:23:36.358870 4739 generic.go:334] "Generic (PLEG): container finished" podID="a3f415ab-75ff-469e-84f4-5d2e9f4053e2" containerID="dd46152531f46474131bcc94ec71e794751f0d4dc3821874447b3cb56f728901" exitCode=0 Oct 08 22:23:36 crc kubenswrapper[4739]: I1008 22:23:36.358996 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlzd" event={"ID":"a3f415ab-75ff-469e-84f4-5d2e9f4053e2","Type":"ContainerDied","Data":"dd46152531f46474131bcc94ec71e794751f0d4dc3821874447b3cb56f728901"} Oct 08 22:23:37 crc kubenswrapper[4739]: I1008 22:23:37.897471 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlzd" Oct 08 22:23:38 crc kubenswrapper[4739]: I1008 22:23:38.034050 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a3f415ab-75ff-469e-84f4-5d2e9f4053e2-ssh-key\") pod \"a3f415ab-75ff-469e-84f4-5d2e9f4053e2\" (UID: \"a3f415ab-75ff-469e-84f4-5d2e9f4053e2\") " Oct 08 22:23:38 crc kubenswrapper[4739]: I1008 22:23:38.034751 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7b6d\" (UniqueName: \"kubernetes.io/projected/a3f415ab-75ff-469e-84f4-5d2e9f4053e2-kube-api-access-q7b6d\") pod \"a3f415ab-75ff-469e-84f4-5d2e9f4053e2\" (UID: \"a3f415ab-75ff-469e-84f4-5d2e9f4053e2\") " Oct 08 22:23:38 crc kubenswrapper[4739]: I1008 22:23:38.035160 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a3f415ab-75ff-469e-84f4-5d2e9f4053e2-inventory\") pod \"a3f415ab-75ff-469e-84f4-5d2e9f4053e2\" (UID: \"a3f415ab-75ff-469e-84f4-5d2e9f4053e2\") " Oct 08 22:23:38 crc kubenswrapper[4739]: I1008 22:23:38.042718 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3f415ab-75ff-469e-84f4-5d2e9f4053e2-kube-api-access-q7b6d" (OuterVolumeSpecName: "kube-api-access-q7b6d") pod "a3f415ab-75ff-469e-84f4-5d2e9f4053e2" (UID: "a3f415ab-75ff-469e-84f4-5d2e9f4053e2"). InnerVolumeSpecName "kube-api-access-q7b6d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:23:38 crc kubenswrapper[4739]: I1008 22:23:38.073527 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3f415ab-75ff-469e-84f4-5d2e9f4053e2-inventory" (OuterVolumeSpecName: "inventory") pod "a3f415ab-75ff-469e-84f4-5d2e9f4053e2" (UID: "a3f415ab-75ff-469e-84f4-5d2e9f4053e2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:38 crc kubenswrapper[4739]: I1008 22:23:38.078510 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3f415ab-75ff-469e-84f4-5d2e9f4053e2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a3f415ab-75ff-469e-84f4-5d2e9f4053e2" (UID: "a3f415ab-75ff-469e-84f4-5d2e9f4053e2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:38 crc kubenswrapper[4739]: I1008 22:23:38.137793 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7b6d\" (UniqueName: \"kubernetes.io/projected/a3f415ab-75ff-469e-84f4-5d2e9f4053e2-kube-api-access-q7b6d\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:38 crc kubenswrapper[4739]: I1008 22:23:38.137842 4739 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a3f415ab-75ff-469e-84f4-5d2e9f4053e2-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:38 crc kubenswrapper[4739]: I1008 22:23:38.137852 4739 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a3f415ab-75ff-469e-84f4-5d2e9f4053e2-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:38 crc kubenswrapper[4739]: I1008 22:23:38.383561 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlzd" 
event={"ID":"a3f415ab-75ff-469e-84f4-5d2e9f4053e2","Type":"ContainerDied","Data":"64283d2f0e0518b467dc88a12e07a17f32c338585798e19c08063be99da56cc1"} Oct 08 22:23:38 crc kubenswrapper[4739]: I1008 22:23:38.383612 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64283d2f0e0518b467dc88a12e07a17f32c338585798e19c08063be99da56cc1" Oct 08 22:23:38 crc kubenswrapper[4739]: I1008 22:23:38.383682 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grlzd" Oct 08 22:23:38 crc kubenswrapper[4739]: I1008 22:23:38.555223 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-p2w6r"] Oct 08 22:23:38 crc kubenswrapper[4739]: E1008 22:23:38.557289 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3f415ab-75ff-469e-84f4-5d2e9f4053e2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 08 22:23:38 crc kubenswrapper[4739]: I1008 22:23:38.557321 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3f415ab-75ff-469e-84f4-5d2e9f4053e2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 08 22:23:38 crc kubenswrapper[4739]: I1008 22:23:38.559715 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3f415ab-75ff-469e-84f4-5d2e9f4053e2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 08 22:23:38 crc kubenswrapper[4739]: I1008 22:23:38.561123 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-p2w6r" Oct 08 22:23:38 crc kubenswrapper[4739]: I1008 22:23:38.574298 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 22:23:38 crc kubenswrapper[4739]: I1008 22:23:38.574466 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 22:23:38 crc kubenswrapper[4739]: I1008 22:23:38.574488 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 22:23:38 crc kubenswrapper[4739]: I1008 22:23:38.574645 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl5kv" Oct 08 22:23:38 crc kubenswrapper[4739]: I1008 22:23:38.575771 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-p2w6r"] Oct 08 22:23:38 crc kubenswrapper[4739]: I1008 22:23:38.661548 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3b70364c-a814-4e53-afb8-693faa5063ec-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-p2w6r\" (UID: \"3b70364c-a814-4e53-afb8-693faa5063ec\") " pod="openstack/ssh-known-hosts-edpm-deployment-p2w6r" Oct 08 22:23:38 crc kubenswrapper[4739]: I1008 22:23:38.662355 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjj4s\" (UniqueName: \"kubernetes.io/projected/3b70364c-a814-4e53-afb8-693faa5063ec-kube-api-access-wjj4s\") pod \"ssh-known-hosts-edpm-deployment-p2w6r\" (UID: \"3b70364c-a814-4e53-afb8-693faa5063ec\") " pod="openstack/ssh-known-hosts-edpm-deployment-p2w6r" Oct 08 22:23:38 crc kubenswrapper[4739]: I1008 22:23:38.662482 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3b70364c-a814-4e53-afb8-693faa5063ec-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-p2w6r\" (UID: \"3b70364c-a814-4e53-afb8-693faa5063ec\") " pod="openstack/ssh-known-hosts-edpm-deployment-p2w6r" Oct 08 22:23:38 crc kubenswrapper[4739]: I1008 22:23:38.765230 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3b70364c-a814-4e53-afb8-693faa5063ec-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-p2w6r\" (UID: \"3b70364c-a814-4e53-afb8-693faa5063ec\") " pod="openstack/ssh-known-hosts-edpm-deployment-p2w6r" Oct 08 22:23:38 crc kubenswrapper[4739]: I1008 22:23:38.765581 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjj4s\" (UniqueName: \"kubernetes.io/projected/3b70364c-a814-4e53-afb8-693faa5063ec-kube-api-access-wjj4s\") pod \"ssh-known-hosts-edpm-deployment-p2w6r\" (UID: \"3b70364c-a814-4e53-afb8-693faa5063ec\") " pod="openstack/ssh-known-hosts-edpm-deployment-p2w6r" Oct 08 22:23:38 crc kubenswrapper[4739]: I1008 22:23:38.766308 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3b70364c-a814-4e53-afb8-693faa5063ec-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-p2w6r\" (UID: \"3b70364c-a814-4e53-afb8-693faa5063ec\") " pod="openstack/ssh-known-hosts-edpm-deployment-p2w6r" Oct 08 22:23:38 crc kubenswrapper[4739]: I1008 22:23:38.773462 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3b70364c-a814-4e53-afb8-693faa5063ec-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-p2w6r\" (UID: \"3b70364c-a814-4e53-afb8-693faa5063ec\") " pod="openstack/ssh-known-hosts-edpm-deployment-p2w6r" Oct 08 22:23:38 crc 
kubenswrapper[4739]: I1008 22:23:38.773616 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3b70364c-a814-4e53-afb8-693faa5063ec-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-p2w6r\" (UID: \"3b70364c-a814-4e53-afb8-693faa5063ec\") " pod="openstack/ssh-known-hosts-edpm-deployment-p2w6r" Oct 08 22:23:38 crc kubenswrapper[4739]: I1008 22:23:38.789435 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjj4s\" (UniqueName: \"kubernetes.io/projected/3b70364c-a814-4e53-afb8-693faa5063ec-kube-api-access-wjj4s\") pod \"ssh-known-hosts-edpm-deployment-p2w6r\" (UID: \"3b70364c-a814-4e53-afb8-693faa5063ec\") " pod="openstack/ssh-known-hosts-edpm-deployment-p2w6r" Oct 08 22:23:38 crc kubenswrapper[4739]: I1008 22:23:38.900763 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-p2w6r" Oct 08 22:23:39 crc kubenswrapper[4739]: I1008 22:23:39.480048 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-p2w6r"] Oct 08 22:23:40 crc kubenswrapper[4739]: I1008 22:23:40.404874 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-p2w6r" event={"ID":"3b70364c-a814-4e53-afb8-693faa5063ec","Type":"ContainerStarted","Data":"9d91b2a4878daffde039d347520c6bcbbb60d5f24705806eca2e7bdec5258f8b"} Oct 08 22:23:40 crc kubenswrapper[4739]: I1008 22:23:40.405376 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-p2w6r" event={"ID":"3b70364c-a814-4e53-afb8-693faa5063ec","Type":"ContainerStarted","Data":"0e06ef236d55d20d7efb9d34c34e013d84e782e87233116a90a5df59bcd9e133"} Oct 08 22:23:48 crc kubenswrapper[4739]: I1008 22:23:48.504193 4739 generic.go:334] "Generic (PLEG): container finished" podID="3b70364c-a814-4e53-afb8-693faa5063ec" 
containerID="9d91b2a4878daffde039d347520c6bcbbb60d5f24705806eca2e7bdec5258f8b" exitCode=0 Oct 08 22:23:48 crc kubenswrapper[4739]: I1008 22:23:48.504350 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-p2w6r" event={"ID":"3b70364c-a814-4e53-afb8-693faa5063ec","Type":"ContainerDied","Data":"9d91b2a4878daffde039d347520c6bcbbb60d5f24705806eca2e7bdec5258f8b"} Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.033427 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-p2w6r" Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.135125 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3b70364c-a814-4e53-afb8-693faa5063ec-inventory-0\") pod \"3b70364c-a814-4e53-afb8-693faa5063ec\" (UID: \"3b70364c-a814-4e53-afb8-693faa5063ec\") " Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.135344 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3b70364c-a814-4e53-afb8-693faa5063ec-ssh-key-openstack-edpm-ipam\") pod \"3b70364c-a814-4e53-afb8-693faa5063ec\" (UID: \"3b70364c-a814-4e53-afb8-693faa5063ec\") " Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.135641 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjj4s\" (UniqueName: \"kubernetes.io/projected/3b70364c-a814-4e53-afb8-693faa5063ec-kube-api-access-wjj4s\") pod \"3b70364c-a814-4e53-afb8-693faa5063ec\" (UID: \"3b70364c-a814-4e53-afb8-693faa5063ec\") " Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.143285 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b70364c-a814-4e53-afb8-693faa5063ec-kube-api-access-wjj4s" (OuterVolumeSpecName: "kube-api-access-wjj4s") pod 
"3b70364c-a814-4e53-afb8-693faa5063ec" (UID: "3b70364c-a814-4e53-afb8-693faa5063ec"). InnerVolumeSpecName "kube-api-access-wjj4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.164992 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b70364c-a814-4e53-afb8-693faa5063ec-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "3b70364c-a814-4e53-afb8-693faa5063ec" (UID: "3b70364c-a814-4e53-afb8-693faa5063ec"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.180508 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b70364c-a814-4e53-afb8-693faa5063ec-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3b70364c-a814-4e53-afb8-693faa5063ec" (UID: "3b70364c-a814-4e53-afb8-693faa5063ec"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.238620 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjj4s\" (UniqueName: \"kubernetes.io/projected/3b70364c-a814-4e53-afb8-693faa5063ec-kube-api-access-wjj4s\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.238771 4739 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3b70364c-a814-4e53-afb8-693faa5063ec-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.238858 4739 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3b70364c-a814-4e53-afb8-693faa5063ec-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.530613 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-p2w6r" event={"ID":"3b70364c-a814-4e53-afb8-693faa5063ec","Type":"ContainerDied","Data":"0e06ef236d55d20d7efb9d34c34e013d84e782e87233116a90a5df59bcd9e133"} Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.530670 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e06ef236d55d20d7efb9d34c34e013d84e782e87233116a90a5df59bcd9e133" Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.530745 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-p2w6r" Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.640884 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-8nvgs"] Oct 08 22:23:50 crc kubenswrapper[4739]: E1008 22:23:50.641336 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b70364c-a814-4e53-afb8-693faa5063ec" containerName="ssh-known-hosts-edpm-deployment" Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.641357 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b70364c-a814-4e53-afb8-693faa5063ec" containerName="ssh-known-hosts-edpm-deployment" Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.641588 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b70364c-a814-4e53-afb8-693faa5063ec" containerName="ssh-known-hosts-edpm-deployment" Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.642343 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8nvgs" Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.646853 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.647257 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl5kv" Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.647637 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.648201 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.664918 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-8nvgs"] Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.751391 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/373291e0-4568-47e1-a71f-b2f005e5e557-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8nvgs\" (UID: \"373291e0-4568-47e1-a71f-b2f005e5e557\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8nvgs" Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.751552 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/373291e0-4568-47e1-a71f-b2f005e5e557-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8nvgs\" (UID: \"373291e0-4568-47e1-a71f-b2f005e5e557\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8nvgs" Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.751604 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbvgb\" (UniqueName: \"kubernetes.io/projected/373291e0-4568-47e1-a71f-b2f005e5e557-kube-api-access-dbvgb\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8nvgs\" (UID: \"373291e0-4568-47e1-a71f-b2f005e5e557\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8nvgs" Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.873401 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/373291e0-4568-47e1-a71f-b2f005e5e557-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8nvgs\" (UID: \"373291e0-4568-47e1-a71f-b2f005e5e557\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8nvgs" Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.873887 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbvgb\" (UniqueName: \"kubernetes.io/projected/373291e0-4568-47e1-a71f-b2f005e5e557-kube-api-access-dbvgb\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8nvgs\" (UID: \"373291e0-4568-47e1-a71f-b2f005e5e557\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8nvgs" Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.874024 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/373291e0-4568-47e1-a71f-b2f005e5e557-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8nvgs\" (UID: \"373291e0-4568-47e1-a71f-b2f005e5e557\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8nvgs" Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.879898 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/373291e0-4568-47e1-a71f-b2f005e5e557-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8nvgs\" (UID: \"373291e0-4568-47e1-a71f-b2f005e5e557\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8nvgs" Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.883139 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/373291e0-4568-47e1-a71f-b2f005e5e557-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8nvgs\" (UID: \"373291e0-4568-47e1-a71f-b2f005e5e557\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8nvgs" Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.908632 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbvgb\" (UniqueName: \"kubernetes.io/projected/373291e0-4568-47e1-a71f-b2f005e5e557-kube-api-access-dbvgb\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8nvgs\" (UID: \"373291e0-4568-47e1-a71f-b2f005e5e557\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8nvgs" Oct 08 22:23:50 crc kubenswrapper[4739]: I1008 22:23:50.967034 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8nvgs" Oct 08 22:23:51 crc kubenswrapper[4739]: I1008 22:23:51.481336 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-8nvgs"] Oct 08 22:23:51 crc kubenswrapper[4739]: I1008 22:23:51.541628 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8nvgs" event={"ID":"373291e0-4568-47e1-a71f-b2f005e5e557","Type":"ContainerStarted","Data":"c6ff38a263aa047022079828b4913418b8ca9f0824be332a398790dd61a4a9ce"} Oct 08 22:23:51 crc kubenswrapper[4739]: I1008 22:23:51.767115 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:23:51 crc kubenswrapper[4739]: I1008 22:23:51.767247 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:23:53 crc kubenswrapper[4739]: I1008 22:23:53.578390 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8nvgs" event={"ID":"373291e0-4568-47e1-a71f-b2f005e5e557","Type":"ContainerStarted","Data":"f02773a8f63a78f099056e5f430f11e151093090da9d18fb99996ff1aa9147b8"} Oct 08 22:23:53 crc kubenswrapper[4739]: I1008 22:23:53.619160 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8nvgs" podStartSLOduration=2.436724976 podStartE2EDuration="3.619127125s" podCreationTimestamp="2025-10-08 
22:23:50 +0000 UTC" firstStartedPulling="2025-10-08 22:23:51.48625031 +0000 UTC m=+2131.311636070" lastFinishedPulling="2025-10-08 22:23:52.668652429 +0000 UTC m=+2132.494038219" observedRunningTime="2025-10-08 22:23:53.601682515 +0000 UTC m=+2133.427068285" watchObservedRunningTime="2025-10-08 22:23:53.619127125 +0000 UTC m=+2133.444512875" Oct 08 22:24:01 crc kubenswrapper[4739]: I1008 22:24:01.659125 4739 generic.go:334] "Generic (PLEG): container finished" podID="373291e0-4568-47e1-a71f-b2f005e5e557" containerID="f02773a8f63a78f099056e5f430f11e151093090da9d18fb99996ff1aa9147b8" exitCode=0 Oct 08 22:24:01 crc kubenswrapper[4739]: I1008 22:24:01.659232 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8nvgs" event={"ID":"373291e0-4568-47e1-a71f-b2f005e5e557","Type":"ContainerDied","Data":"f02773a8f63a78f099056e5f430f11e151093090da9d18fb99996ff1aa9147b8"} Oct 08 22:24:03 crc kubenswrapper[4739]: I1008 22:24:03.111359 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8nvgs" Oct 08 22:24:03 crc kubenswrapper[4739]: I1008 22:24:03.251276 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbvgb\" (UniqueName: \"kubernetes.io/projected/373291e0-4568-47e1-a71f-b2f005e5e557-kube-api-access-dbvgb\") pod \"373291e0-4568-47e1-a71f-b2f005e5e557\" (UID: \"373291e0-4568-47e1-a71f-b2f005e5e557\") " Oct 08 22:24:03 crc kubenswrapper[4739]: I1008 22:24:03.251737 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/373291e0-4568-47e1-a71f-b2f005e5e557-inventory\") pod \"373291e0-4568-47e1-a71f-b2f005e5e557\" (UID: \"373291e0-4568-47e1-a71f-b2f005e5e557\") " Oct 08 22:24:03 crc kubenswrapper[4739]: I1008 22:24:03.251839 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/373291e0-4568-47e1-a71f-b2f005e5e557-ssh-key\") pod \"373291e0-4568-47e1-a71f-b2f005e5e557\" (UID: \"373291e0-4568-47e1-a71f-b2f005e5e557\") " Oct 08 22:24:03 crc kubenswrapper[4739]: I1008 22:24:03.263510 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/373291e0-4568-47e1-a71f-b2f005e5e557-kube-api-access-dbvgb" (OuterVolumeSpecName: "kube-api-access-dbvgb") pod "373291e0-4568-47e1-a71f-b2f005e5e557" (UID: "373291e0-4568-47e1-a71f-b2f005e5e557"). InnerVolumeSpecName "kube-api-access-dbvgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:24:03 crc kubenswrapper[4739]: I1008 22:24:03.287287 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/373291e0-4568-47e1-a71f-b2f005e5e557-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "373291e0-4568-47e1-a71f-b2f005e5e557" (UID: "373291e0-4568-47e1-a71f-b2f005e5e557"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:24:03 crc kubenswrapper[4739]: I1008 22:24:03.293175 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/373291e0-4568-47e1-a71f-b2f005e5e557-inventory" (OuterVolumeSpecName: "inventory") pod "373291e0-4568-47e1-a71f-b2f005e5e557" (UID: "373291e0-4568-47e1-a71f-b2f005e5e557"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:24:03 crc kubenswrapper[4739]: I1008 22:24:03.355044 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbvgb\" (UniqueName: \"kubernetes.io/projected/373291e0-4568-47e1-a71f-b2f005e5e557-kube-api-access-dbvgb\") on node \"crc\" DevicePath \"\"" Oct 08 22:24:03 crc kubenswrapper[4739]: I1008 22:24:03.355086 4739 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/373291e0-4568-47e1-a71f-b2f005e5e557-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 22:24:03 crc kubenswrapper[4739]: I1008 22:24:03.355098 4739 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/373291e0-4568-47e1-a71f-b2f005e5e557-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 22:24:03 crc kubenswrapper[4739]: I1008 22:24:03.687595 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8nvgs" event={"ID":"373291e0-4568-47e1-a71f-b2f005e5e557","Type":"ContainerDied","Data":"c6ff38a263aa047022079828b4913418b8ca9f0824be332a398790dd61a4a9ce"} Oct 08 22:24:03 crc kubenswrapper[4739]: I1008 22:24:03.687660 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6ff38a263aa047022079828b4913418b8ca9f0824be332a398790dd61a4a9ce" Oct 08 22:24:03 crc kubenswrapper[4739]: I1008 22:24:03.687965 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8nvgs" Oct 08 22:24:03 crc kubenswrapper[4739]: I1008 22:24:03.761853 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48"] Oct 08 22:24:03 crc kubenswrapper[4739]: E1008 22:24:03.762545 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="373291e0-4568-47e1-a71f-b2f005e5e557" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 08 22:24:03 crc kubenswrapper[4739]: I1008 22:24:03.762572 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="373291e0-4568-47e1-a71f-b2f005e5e557" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 08 22:24:03 crc kubenswrapper[4739]: I1008 22:24:03.762929 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="373291e0-4568-47e1-a71f-b2f005e5e557" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 08 22:24:03 crc kubenswrapper[4739]: I1008 22:24:03.763950 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48" Oct 08 22:24:03 crc kubenswrapper[4739]: I1008 22:24:03.766892 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 22:24:03 crc kubenswrapper[4739]: I1008 22:24:03.767264 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 22:24:03 crc kubenswrapper[4739]: I1008 22:24:03.768110 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl5kv" Oct 08 22:24:03 crc kubenswrapper[4739]: I1008 22:24:03.768345 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 22:24:03 crc kubenswrapper[4739]: I1008 22:24:03.771912 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48"] Oct 08 22:24:03 crc kubenswrapper[4739]: I1008 22:24:03.865884 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8ad88d67-b089-4777-be25-7c61f66c18c7-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48\" (UID: \"8ad88d67-b089-4777-be25-7c61f66c18c7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48" Oct 08 22:24:03 crc kubenswrapper[4739]: I1008 22:24:03.866058 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ad88d67-b089-4777-be25-7c61f66c18c7-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48\" (UID: \"8ad88d67-b089-4777-be25-7c61f66c18c7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48" Oct 08 22:24:03 crc kubenswrapper[4739]: I1008 22:24:03.866183 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgxqw\" (UniqueName: \"kubernetes.io/projected/8ad88d67-b089-4777-be25-7c61f66c18c7-kube-api-access-jgxqw\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48\" (UID: \"8ad88d67-b089-4777-be25-7c61f66c18c7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48" Oct 08 22:24:03 crc kubenswrapper[4739]: I1008 22:24:03.969438 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8ad88d67-b089-4777-be25-7c61f66c18c7-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48\" (UID: \"8ad88d67-b089-4777-be25-7c61f66c18c7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48" Oct 08 22:24:03 crc kubenswrapper[4739]: I1008 22:24:03.969551 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ad88d67-b089-4777-be25-7c61f66c18c7-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48\" (UID: \"8ad88d67-b089-4777-be25-7c61f66c18c7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48" Oct 08 22:24:03 crc kubenswrapper[4739]: I1008 22:24:03.969656 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgxqw\" (UniqueName: \"kubernetes.io/projected/8ad88d67-b089-4777-be25-7c61f66c18c7-kube-api-access-jgxqw\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48\" (UID: \"8ad88d67-b089-4777-be25-7c61f66c18c7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48" Oct 08 22:24:03 crc kubenswrapper[4739]: I1008 22:24:03.978765 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8ad88d67-b089-4777-be25-7c61f66c18c7-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48\" (UID: 
\"8ad88d67-b089-4777-be25-7c61f66c18c7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48" Oct 08 22:24:03 crc kubenswrapper[4739]: I1008 22:24:03.979313 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ad88d67-b089-4777-be25-7c61f66c18c7-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48\" (UID: \"8ad88d67-b089-4777-be25-7c61f66c18c7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48" Oct 08 22:24:03 crc kubenswrapper[4739]: I1008 22:24:03.991544 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgxqw\" (UniqueName: \"kubernetes.io/projected/8ad88d67-b089-4777-be25-7c61f66c18c7-kube-api-access-jgxqw\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48\" (UID: \"8ad88d67-b089-4777-be25-7c61f66c18c7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48" Oct 08 22:24:04 crc kubenswrapper[4739]: I1008 22:24:04.094124 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48" Oct 08 22:24:04 crc kubenswrapper[4739]: I1008 22:24:04.631631 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48"] Oct 08 22:24:04 crc kubenswrapper[4739]: I1008 22:24:04.697772 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48" event={"ID":"8ad88d67-b089-4777-be25-7c61f66c18c7","Type":"ContainerStarted","Data":"3408613e6083556b8999a19a203c4e48f445c0f1d0d0d2f5537a9785acdb4183"} Oct 08 22:24:05 crc kubenswrapper[4739]: I1008 22:24:05.707669 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48" event={"ID":"8ad88d67-b089-4777-be25-7c61f66c18c7","Type":"ContainerStarted","Data":"cdecbd328b54dac432f2c6e99e8af22a767ffe23d812f822b0a6ca3cc5ff548c"} Oct 08 22:24:05 crc kubenswrapper[4739]: I1008 22:24:05.733011 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48" podStartSLOduration=2.27295388 podStartE2EDuration="2.732989154s" podCreationTimestamp="2025-10-08 22:24:03 +0000 UTC" firstStartedPulling="2025-10-08 22:24:04.643185656 +0000 UTC m=+2144.468571446" lastFinishedPulling="2025-10-08 22:24:05.10322092 +0000 UTC m=+2144.928606720" observedRunningTime="2025-10-08 22:24:05.724606118 +0000 UTC m=+2145.549991878" watchObservedRunningTime="2025-10-08 22:24:05.732989154 +0000 UTC m=+2145.558374904" Oct 08 22:24:21 crc kubenswrapper[4739]: I1008 22:24:21.766879 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:24:21 crc kubenswrapper[4739]: 
I1008 22:24:21.767411 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:24:23 crc kubenswrapper[4739]: I1008 22:24:23.511541 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mzqjb"] Oct 08 22:24:23 crc kubenswrapper[4739]: I1008 22:24:23.520137 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mzqjb" Oct 08 22:24:23 crc kubenswrapper[4739]: I1008 22:24:23.529761 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mzqjb"] Oct 08 22:24:23 crc kubenswrapper[4739]: I1008 22:24:23.643364 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/889a1840-b560-4536-8f9a-a63a5c218c3a-utilities\") pod \"redhat-marketplace-mzqjb\" (UID: \"889a1840-b560-4536-8f9a-a63a5c218c3a\") " pod="openshift-marketplace/redhat-marketplace-mzqjb" Oct 08 22:24:23 crc kubenswrapper[4739]: I1008 22:24:23.643664 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/889a1840-b560-4536-8f9a-a63a5c218c3a-catalog-content\") pod \"redhat-marketplace-mzqjb\" (UID: \"889a1840-b560-4536-8f9a-a63a5c218c3a\") " pod="openshift-marketplace/redhat-marketplace-mzqjb" Oct 08 22:24:23 crc kubenswrapper[4739]: I1008 22:24:23.643790 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z7jn\" (UniqueName: \"kubernetes.io/projected/889a1840-b560-4536-8f9a-a63a5c218c3a-kube-api-access-9z7jn\") pod 
\"redhat-marketplace-mzqjb\" (UID: \"889a1840-b560-4536-8f9a-a63a5c218c3a\") " pod="openshift-marketplace/redhat-marketplace-mzqjb" Oct 08 22:24:23 crc kubenswrapper[4739]: I1008 22:24:23.745653 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z7jn\" (UniqueName: \"kubernetes.io/projected/889a1840-b560-4536-8f9a-a63a5c218c3a-kube-api-access-9z7jn\") pod \"redhat-marketplace-mzqjb\" (UID: \"889a1840-b560-4536-8f9a-a63a5c218c3a\") " pod="openshift-marketplace/redhat-marketplace-mzqjb" Oct 08 22:24:23 crc kubenswrapper[4739]: I1008 22:24:23.745805 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/889a1840-b560-4536-8f9a-a63a5c218c3a-utilities\") pod \"redhat-marketplace-mzqjb\" (UID: \"889a1840-b560-4536-8f9a-a63a5c218c3a\") " pod="openshift-marketplace/redhat-marketplace-mzqjb" Oct 08 22:24:23 crc kubenswrapper[4739]: I1008 22:24:23.745832 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/889a1840-b560-4536-8f9a-a63a5c218c3a-catalog-content\") pod \"redhat-marketplace-mzqjb\" (UID: \"889a1840-b560-4536-8f9a-a63a5c218c3a\") " pod="openshift-marketplace/redhat-marketplace-mzqjb" Oct 08 22:24:23 crc kubenswrapper[4739]: I1008 22:24:23.746342 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/889a1840-b560-4536-8f9a-a63a5c218c3a-catalog-content\") pod \"redhat-marketplace-mzqjb\" (UID: \"889a1840-b560-4536-8f9a-a63a5c218c3a\") " pod="openshift-marketplace/redhat-marketplace-mzqjb" Oct 08 22:24:23 crc kubenswrapper[4739]: I1008 22:24:23.746924 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/889a1840-b560-4536-8f9a-a63a5c218c3a-utilities\") pod \"redhat-marketplace-mzqjb\" (UID: 
\"889a1840-b560-4536-8f9a-a63a5c218c3a\") " pod="openshift-marketplace/redhat-marketplace-mzqjb" Oct 08 22:24:23 crc kubenswrapper[4739]: I1008 22:24:23.774442 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z7jn\" (UniqueName: \"kubernetes.io/projected/889a1840-b560-4536-8f9a-a63a5c218c3a-kube-api-access-9z7jn\") pod \"redhat-marketplace-mzqjb\" (UID: \"889a1840-b560-4536-8f9a-a63a5c218c3a\") " pod="openshift-marketplace/redhat-marketplace-mzqjb" Oct 08 22:24:23 crc kubenswrapper[4739]: I1008 22:24:23.845814 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mzqjb" Oct 08 22:24:24 crc kubenswrapper[4739]: I1008 22:24:24.391725 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mzqjb"] Oct 08 22:24:24 crc kubenswrapper[4739]: W1008 22:24:24.395774 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod889a1840_b560_4536_8f9a_a63a5c218c3a.slice/crio-e2cfa2e91cdfe9c3b20218486edfc629a2c689a12807a2d351a494287c7eeb78 WatchSource:0}: Error finding container e2cfa2e91cdfe9c3b20218486edfc629a2c689a12807a2d351a494287c7eeb78: Status 404 returned error can't find the container with id e2cfa2e91cdfe9c3b20218486edfc629a2c689a12807a2d351a494287c7eeb78 Oct 08 22:24:24 crc kubenswrapper[4739]: I1008 22:24:24.916476 4739 generic.go:334] "Generic (PLEG): container finished" podID="889a1840-b560-4536-8f9a-a63a5c218c3a" containerID="14e26dedf655b08c6cc2fe36b80fb94495161af643b9d7a5351629b8e582c2cf" exitCode=0 Oct 08 22:24:24 crc kubenswrapper[4739]: I1008 22:24:24.916572 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mzqjb" event={"ID":"889a1840-b560-4536-8f9a-a63a5c218c3a","Type":"ContainerDied","Data":"14e26dedf655b08c6cc2fe36b80fb94495161af643b9d7a5351629b8e582c2cf"} Oct 08 22:24:24 crc 
kubenswrapper[4739]: I1008 22:24:24.916771 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mzqjb" event={"ID":"889a1840-b560-4536-8f9a-a63a5c218c3a","Type":"ContainerStarted","Data":"e2cfa2e91cdfe9c3b20218486edfc629a2c689a12807a2d351a494287c7eeb78"} Oct 08 22:24:26 crc kubenswrapper[4739]: I1008 22:24:26.940454 4739 generic.go:334] "Generic (PLEG): container finished" podID="889a1840-b560-4536-8f9a-a63a5c218c3a" containerID="dd89b0be6881c348d8819690a12b592dd7b24d8229c93492fcc8bce3e7dd6bac" exitCode=0 Oct 08 22:24:26 crc kubenswrapper[4739]: I1008 22:24:26.941065 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mzqjb" event={"ID":"889a1840-b560-4536-8f9a-a63a5c218c3a","Type":"ContainerDied","Data":"dd89b0be6881c348d8819690a12b592dd7b24d8229c93492fcc8bce3e7dd6bac"} Oct 08 22:24:27 crc kubenswrapper[4739]: I1008 22:24:27.951560 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mzqjb" event={"ID":"889a1840-b560-4536-8f9a-a63a5c218c3a","Type":"ContainerStarted","Data":"d5cacffa2f08f79c2ba22d789505a139b04cc2bd2c5ced0cef7fdb1053d67ce6"} Oct 08 22:24:33 crc kubenswrapper[4739]: I1008 22:24:33.846487 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mzqjb" Oct 08 22:24:33 crc kubenswrapper[4739]: I1008 22:24:33.847075 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mzqjb" Oct 08 22:24:33 crc kubenswrapper[4739]: I1008 22:24:33.916011 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mzqjb" Oct 08 22:24:33 crc kubenswrapper[4739]: I1008 22:24:33.940388 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mzqjb" podStartSLOduration=8.485744044 
podStartE2EDuration="10.940370375s" podCreationTimestamp="2025-10-08 22:24:23 +0000 UTC" firstStartedPulling="2025-10-08 22:24:24.918773195 +0000 UTC m=+2164.744158945" lastFinishedPulling="2025-10-08 22:24:27.373399516 +0000 UTC m=+2167.198785276" observedRunningTime="2025-10-08 22:24:27.975928831 +0000 UTC m=+2167.801314591" watchObservedRunningTime="2025-10-08 22:24:33.940370375 +0000 UTC m=+2173.765756125" Oct 08 22:24:34 crc kubenswrapper[4739]: I1008 22:24:34.080079 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mzqjb" Oct 08 22:24:34 crc kubenswrapper[4739]: I1008 22:24:34.153542 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mzqjb"] Oct 08 22:24:36 crc kubenswrapper[4739]: I1008 22:24:36.031457 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mzqjb" podUID="889a1840-b560-4536-8f9a-a63a5c218c3a" containerName="registry-server" containerID="cri-o://d5cacffa2f08f79c2ba22d789505a139b04cc2bd2c5ced0cef7fdb1053d67ce6" gracePeriod=2 Oct 08 22:24:36 crc kubenswrapper[4739]: I1008 22:24:36.481415 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mzqjb" Oct 08 22:24:36 crc kubenswrapper[4739]: I1008 22:24:36.639954 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/889a1840-b560-4536-8f9a-a63a5c218c3a-catalog-content\") pod \"889a1840-b560-4536-8f9a-a63a5c218c3a\" (UID: \"889a1840-b560-4536-8f9a-a63a5c218c3a\") " Oct 08 22:24:36 crc kubenswrapper[4739]: I1008 22:24:36.640058 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/889a1840-b560-4536-8f9a-a63a5c218c3a-utilities\") pod \"889a1840-b560-4536-8f9a-a63a5c218c3a\" (UID: \"889a1840-b560-4536-8f9a-a63a5c218c3a\") " Oct 08 22:24:36 crc kubenswrapper[4739]: I1008 22:24:36.640188 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9z7jn\" (UniqueName: \"kubernetes.io/projected/889a1840-b560-4536-8f9a-a63a5c218c3a-kube-api-access-9z7jn\") pod \"889a1840-b560-4536-8f9a-a63a5c218c3a\" (UID: \"889a1840-b560-4536-8f9a-a63a5c218c3a\") " Oct 08 22:24:36 crc kubenswrapper[4739]: I1008 22:24:36.641098 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/889a1840-b560-4536-8f9a-a63a5c218c3a-utilities" (OuterVolumeSpecName: "utilities") pod "889a1840-b560-4536-8f9a-a63a5c218c3a" (UID: "889a1840-b560-4536-8f9a-a63a5c218c3a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:24:36 crc kubenswrapper[4739]: I1008 22:24:36.646505 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/889a1840-b560-4536-8f9a-a63a5c218c3a-kube-api-access-9z7jn" (OuterVolumeSpecName: "kube-api-access-9z7jn") pod "889a1840-b560-4536-8f9a-a63a5c218c3a" (UID: "889a1840-b560-4536-8f9a-a63a5c218c3a"). InnerVolumeSpecName "kube-api-access-9z7jn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:24:36 crc kubenswrapper[4739]: I1008 22:24:36.653867 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/889a1840-b560-4536-8f9a-a63a5c218c3a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "889a1840-b560-4536-8f9a-a63a5c218c3a" (UID: "889a1840-b560-4536-8f9a-a63a5c218c3a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:24:36 crc kubenswrapper[4739]: I1008 22:24:36.742409 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/889a1840-b560-4536-8f9a-a63a5c218c3a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:24:36 crc kubenswrapper[4739]: I1008 22:24:36.742439 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/889a1840-b560-4536-8f9a-a63a5c218c3a-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:24:36 crc kubenswrapper[4739]: I1008 22:24:36.742450 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9z7jn\" (UniqueName: \"kubernetes.io/projected/889a1840-b560-4536-8f9a-a63a5c218c3a-kube-api-access-9z7jn\") on node \"crc\" DevicePath \"\"" Oct 08 22:24:37 crc kubenswrapper[4739]: I1008 22:24:37.043014 4739 generic.go:334] "Generic (PLEG): container finished" podID="889a1840-b560-4536-8f9a-a63a5c218c3a" containerID="d5cacffa2f08f79c2ba22d789505a139b04cc2bd2c5ced0cef7fdb1053d67ce6" exitCode=0 Oct 08 22:24:37 crc kubenswrapper[4739]: I1008 22:24:37.043095 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mzqjb" event={"ID":"889a1840-b560-4536-8f9a-a63a5c218c3a","Type":"ContainerDied","Data":"d5cacffa2f08f79c2ba22d789505a139b04cc2bd2c5ced0cef7fdb1053d67ce6"} Oct 08 22:24:37 crc kubenswrapper[4739]: I1008 22:24:37.043186 4739 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-mzqjb" event={"ID":"889a1840-b560-4536-8f9a-a63a5c218c3a","Type":"ContainerDied","Data":"e2cfa2e91cdfe9c3b20218486edfc629a2c689a12807a2d351a494287c7eeb78"} Oct 08 22:24:37 crc kubenswrapper[4739]: I1008 22:24:37.043213 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mzqjb" Oct 08 22:24:37 crc kubenswrapper[4739]: I1008 22:24:37.043214 4739 scope.go:117] "RemoveContainer" containerID="d5cacffa2f08f79c2ba22d789505a139b04cc2bd2c5ced0cef7fdb1053d67ce6" Oct 08 22:24:37 crc kubenswrapper[4739]: I1008 22:24:37.071968 4739 scope.go:117] "RemoveContainer" containerID="dd89b0be6881c348d8819690a12b592dd7b24d8229c93492fcc8bce3e7dd6bac" Oct 08 22:24:37 crc kubenswrapper[4739]: I1008 22:24:37.082612 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mzqjb"] Oct 08 22:24:37 crc kubenswrapper[4739]: I1008 22:24:37.100107 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mzqjb"] Oct 08 22:24:37 crc kubenswrapper[4739]: I1008 22:24:37.113503 4739 scope.go:117] "RemoveContainer" containerID="14e26dedf655b08c6cc2fe36b80fb94495161af643b9d7a5351629b8e582c2cf" Oct 08 22:24:37 crc kubenswrapper[4739]: I1008 22:24:37.176510 4739 scope.go:117] "RemoveContainer" containerID="d5cacffa2f08f79c2ba22d789505a139b04cc2bd2c5ced0cef7fdb1053d67ce6" Oct 08 22:24:37 crc kubenswrapper[4739]: E1008 22:24:37.178486 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5cacffa2f08f79c2ba22d789505a139b04cc2bd2c5ced0cef7fdb1053d67ce6\": container with ID starting with d5cacffa2f08f79c2ba22d789505a139b04cc2bd2c5ced0cef7fdb1053d67ce6 not found: ID does not exist" containerID="d5cacffa2f08f79c2ba22d789505a139b04cc2bd2c5ced0cef7fdb1053d67ce6" Oct 08 22:24:37 crc kubenswrapper[4739]: I1008 22:24:37.178546 4739 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5cacffa2f08f79c2ba22d789505a139b04cc2bd2c5ced0cef7fdb1053d67ce6"} err="failed to get container status \"d5cacffa2f08f79c2ba22d789505a139b04cc2bd2c5ced0cef7fdb1053d67ce6\": rpc error: code = NotFound desc = could not find container \"d5cacffa2f08f79c2ba22d789505a139b04cc2bd2c5ced0cef7fdb1053d67ce6\": container with ID starting with d5cacffa2f08f79c2ba22d789505a139b04cc2bd2c5ced0cef7fdb1053d67ce6 not found: ID does not exist" Oct 08 22:24:37 crc kubenswrapper[4739]: I1008 22:24:37.178572 4739 scope.go:117] "RemoveContainer" containerID="dd89b0be6881c348d8819690a12b592dd7b24d8229c93492fcc8bce3e7dd6bac" Oct 08 22:24:37 crc kubenswrapper[4739]: E1008 22:24:37.178871 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd89b0be6881c348d8819690a12b592dd7b24d8229c93492fcc8bce3e7dd6bac\": container with ID starting with dd89b0be6881c348d8819690a12b592dd7b24d8229c93492fcc8bce3e7dd6bac not found: ID does not exist" containerID="dd89b0be6881c348d8819690a12b592dd7b24d8229c93492fcc8bce3e7dd6bac" Oct 08 22:24:37 crc kubenswrapper[4739]: I1008 22:24:37.178895 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd89b0be6881c348d8819690a12b592dd7b24d8229c93492fcc8bce3e7dd6bac"} err="failed to get container status \"dd89b0be6881c348d8819690a12b592dd7b24d8229c93492fcc8bce3e7dd6bac\": rpc error: code = NotFound desc = could not find container \"dd89b0be6881c348d8819690a12b592dd7b24d8229c93492fcc8bce3e7dd6bac\": container with ID starting with dd89b0be6881c348d8819690a12b592dd7b24d8229c93492fcc8bce3e7dd6bac not found: ID does not exist" Oct 08 22:24:37 crc kubenswrapper[4739]: I1008 22:24:37.178914 4739 scope.go:117] "RemoveContainer" containerID="14e26dedf655b08c6cc2fe36b80fb94495161af643b9d7a5351629b8e582c2cf" Oct 08 22:24:37 crc kubenswrapper[4739]: E1008 
22:24:37.179634 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14e26dedf655b08c6cc2fe36b80fb94495161af643b9d7a5351629b8e582c2cf\": container with ID starting with 14e26dedf655b08c6cc2fe36b80fb94495161af643b9d7a5351629b8e582c2cf not found: ID does not exist" containerID="14e26dedf655b08c6cc2fe36b80fb94495161af643b9d7a5351629b8e582c2cf" Oct 08 22:24:37 crc kubenswrapper[4739]: I1008 22:24:37.179664 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14e26dedf655b08c6cc2fe36b80fb94495161af643b9d7a5351629b8e582c2cf"} err="failed to get container status \"14e26dedf655b08c6cc2fe36b80fb94495161af643b9d7a5351629b8e582c2cf\": rpc error: code = NotFound desc = could not find container \"14e26dedf655b08c6cc2fe36b80fb94495161af643b9d7a5351629b8e582c2cf\": container with ID starting with 14e26dedf655b08c6cc2fe36b80fb94495161af643b9d7a5351629b8e582c2cf not found: ID does not exist" Oct 08 22:24:37 crc kubenswrapper[4739]: I1008 22:24:37.834198 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="889a1840-b560-4536-8f9a-a63a5c218c3a" path="/var/lib/kubelet/pods/889a1840-b560-4536-8f9a-a63a5c218c3a/volumes" Oct 08 22:24:51 crc kubenswrapper[4739]: I1008 22:24:51.766993 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:24:51 crc kubenswrapper[4739]: I1008 22:24:51.767611 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 08 22:24:51 crc kubenswrapper[4739]: I1008 22:24:51.767665 4739 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" Oct 08 22:24:51 crc kubenswrapper[4739]: I1008 22:24:51.768489 4739 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c69d7e306979356b55250a2871dae7b00a44dffff4dc7c74269008f52fd183a9"} pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 22:24:51 crc kubenswrapper[4739]: I1008 22:24:51.768547 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" containerID="cri-o://c69d7e306979356b55250a2871dae7b00a44dffff4dc7c74269008f52fd183a9" gracePeriod=600 Oct 08 22:24:52 crc kubenswrapper[4739]: I1008 22:24:52.214606 4739 generic.go:334] "Generic (PLEG): container finished" podID="9707b708-016c-4e06-86db-0332e2ca37db" containerID="c69d7e306979356b55250a2871dae7b00a44dffff4dc7c74269008f52fd183a9" exitCode=0 Oct 08 22:24:52 crc kubenswrapper[4739]: I1008 22:24:52.214671 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerDied","Data":"c69d7e306979356b55250a2871dae7b00a44dffff4dc7c74269008f52fd183a9"} Oct 08 22:24:52 crc kubenswrapper[4739]: I1008 22:24:52.215078 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerStarted","Data":"f5309d0370559d2fbd4ae1669f92b84d80aaef9eaebaff357d1aa14a7ff31a5d"} Oct 08 22:24:52 crc 
kubenswrapper[4739]: I1008 22:24:52.215105 4739 scope.go:117] "RemoveContainer" containerID="cd237d888e8c93206be5f4a7fb5248e050c6dfe54b10244705d72109bf176a06" Oct 08 22:25:20 crc kubenswrapper[4739]: I1008 22:25:20.544550 4739 generic.go:334] "Generic (PLEG): container finished" podID="8ad88d67-b089-4777-be25-7c61f66c18c7" containerID="cdecbd328b54dac432f2c6e99e8af22a767ffe23d812f822b0a6ca3cc5ff548c" exitCode=0 Oct 08 22:25:20 crc kubenswrapper[4739]: I1008 22:25:20.544639 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48" event={"ID":"8ad88d67-b089-4777-be25-7c61f66c18c7","Type":"ContainerDied","Data":"cdecbd328b54dac432f2c6e99e8af22a767ffe23d812f822b0a6ca3cc5ff548c"} Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.021569 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.141030 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgxqw\" (UniqueName: \"kubernetes.io/projected/8ad88d67-b089-4777-be25-7c61f66c18c7-kube-api-access-jgxqw\") pod \"8ad88d67-b089-4777-be25-7c61f66c18c7\" (UID: \"8ad88d67-b089-4777-be25-7c61f66c18c7\") " Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.141378 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ad88d67-b089-4777-be25-7c61f66c18c7-inventory\") pod \"8ad88d67-b089-4777-be25-7c61f66c18c7\" (UID: \"8ad88d67-b089-4777-be25-7c61f66c18c7\") " Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.141506 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8ad88d67-b089-4777-be25-7c61f66c18c7-ssh-key\") pod \"8ad88d67-b089-4777-be25-7c61f66c18c7\" (UID: 
\"8ad88d67-b089-4777-be25-7c61f66c18c7\") " Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.150066 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ad88d67-b089-4777-be25-7c61f66c18c7-kube-api-access-jgxqw" (OuterVolumeSpecName: "kube-api-access-jgxqw") pod "8ad88d67-b089-4777-be25-7c61f66c18c7" (UID: "8ad88d67-b089-4777-be25-7c61f66c18c7"). InnerVolumeSpecName "kube-api-access-jgxqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.182099 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ad88d67-b089-4777-be25-7c61f66c18c7-inventory" (OuterVolumeSpecName: "inventory") pod "8ad88d67-b089-4777-be25-7c61f66c18c7" (UID: "8ad88d67-b089-4777-be25-7c61f66c18c7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.187331 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ad88d67-b089-4777-be25-7c61f66c18c7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8ad88d67-b089-4777-be25-7c61f66c18c7" (UID: "8ad88d67-b089-4777-be25-7c61f66c18c7"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.244780 4739 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ad88d67-b089-4777-be25-7c61f66c18c7-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.244840 4739 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8ad88d67-b089-4777-be25-7c61f66c18c7-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.244866 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgxqw\" (UniqueName: \"kubernetes.io/projected/8ad88d67-b089-4777-be25-7c61f66c18c7-kube-api-access-jgxqw\") on node \"crc\" DevicePath \"\"" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.581531 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48" event={"ID":"8ad88d67-b089-4777-be25-7c61f66c18c7","Type":"ContainerDied","Data":"3408613e6083556b8999a19a203c4e48f445c0f1d0d0d2f5537a9785acdb4183"} Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.581628 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3408613e6083556b8999a19a203c4e48f445c0f1d0d0d2f5537a9785acdb4183" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.581711 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.717849 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s"] Oct 08 22:25:22 crc kubenswrapper[4739]: E1008 22:25:22.718355 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="889a1840-b560-4536-8f9a-a63a5c218c3a" containerName="registry-server" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.718380 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="889a1840-b560-4536-8f9a-a63a5c218c3a" containerName="registry-server" Oct 08 22:25:22 crc kubenswrapper[4739]: E1008 22:25:22.718401 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="889a1840-b560-4536-8f9a-a63a5c218c3a" containerName="extract-content" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.718409 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="889a1840-b560-4536-8f9a-a63a5c218c3a" containerName="extract-content" Oct 08 22:25:22 crc kubenswrapper[4739]: E1008 22:25:22.718438 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ad88d67-b089-4777-be25-7c61f66c18c7" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.718449 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ad88d67-b089-4777-be25-7c61f66c18c7" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 08 22:25:22 crc kubenswrapper[4739]: E1008 22:25:22.718492 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="889a1840-b560-4536-8f9a-a63a5c218c3a" containerName="extract-utilities" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.718500 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="889a1840-b560-4536-8f9a-a63a5c218c3a" containerName="extract-utilities" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.718730 
4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ad88d67-b089-4777-be25-7c61f66c18c7" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.718754 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="889a1840-b560-4536-8f9a-a63a5c218c3a" containerName="registry-server" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.719542 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.722256 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.724498 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.725305 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.725388 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl5kv" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.725539 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.727319 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.727478 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.727605 4739 reflector.go:368] Caches populated for *v1.Secret 
from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.749551 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s"] Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.878964 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.879642 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.879827 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.879924 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-telemetry-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.880024 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f2642ecf-dc6d-4f4e-94e7-2f76db914748-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.880075 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.880719 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.881040 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx5wm\" (UniqueName: \"kubernetes.io/projected/f2642ecf-dc6d-4f4e-94e7-2f76db914748-kube-api-access-gx5wm\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: 
\"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.881221 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f2642ecf-dc6d-4f4e-94e7-2f76db914748-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.881328 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.881477 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f2642ecf-dc6d-4f4e-94e7-2f76db914748-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.881568 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-libvirt-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.881654 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.881784 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f2642ecf-dc6d-4f4e-94e7-2f76db914748-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.984535 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.984611 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-neutron-metadata-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.984687 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f2642ecf-dc6d-4f4e-94e7-2f76db914748-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.984807 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.984852 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.984969 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.985013 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.985059 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f2642ecf-dc6d-4f4e-94e7-2f76db914748-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.985096 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.985133 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.985270 4739 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx5wm\" (UniqueName: \"kubernetes.io/projected/f2642ecf-dc6d-4f4e-94e7-2f76db914748-kube-api-access-gx5wm\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.985329 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f2642ecf-dc6d-4f4e-94e7-2f76db914748-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.985379 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.985452 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f2642ecf-dc6d-4f4e-94e7-2f76db914748-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.993659 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f2642ecf-dc6d-4f4e-94e7-2f76db914748-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.993893 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.995086 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f2642ecf-dc6d-4f4e-94e7-2f76db914748-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.995755 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.996005 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/f2642ecf-dc6d-4f4e-94e7-2f76db914748-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.996047 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.996478 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.997340 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.997490 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: 
\"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.997707 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:22 crc kubenswrapper[4739]: I1008 22:25:22.997794 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f2642ecf-dc6d-4f4e-94e7-2f76db914748-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:23 crc kubenswrapper[4739]: I1008 22:25:23.002700 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:23 crc kubenswrapper[4739]: I1008 22:25:23.003847 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" 
Oct 08 22:25:23 crc kubenswrapper[4739]: I1008 22:25:23.009194 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx5wm\" (UniqueName: \"kubernetes.io/projected/f2642ecf-dc6d-4f4e-94e7-2f76db914748-kube-api-access-gx5wm\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:23 crc kubenswrapper[4739]: I1008 22:25:23.078882 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:25:23 crc kubenswrapper[4739]: I1008 22:25:23.696001 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s"] Oct 08 22:25:24 crc kubenswrapper[4739]: I1008 22:25:24.632786 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" event={"ID":"f2642ecf-dc6d-4f4e-94e7-2f76db914748","Type":"ContainerStarted","Data":"cb5892c49f9b0537e7b66a49813564c8e345f1780e6911f9ccfc44182879a6d9"} Oct 08 22:25:24 crc kubenswrapper[4739]: I1008 22:25:24.633799 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" event={"ID":"f2642ecf-dc6d-4f4e-94e7-2f76db914748","Type":"ContainerStarted","Data":"00714cc8b9d55fae8c3281811d64fdcd894be0d9d9f07675216ad2a08180adcc"} Oct 08 22:25:24 crc kubenswrapper[4739]: I1008 22:25:24.667406 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" podStartSLOduration=2.175680141 podStartE2EDuration="2.667028409s" podCreationTimestamp="2025-10-08 22:25:22 +0000 UTC" firstStartedPulling="2025-10-08 22:25:23.699530627 +0000 UTC m=+2223.524916387" lastFinishedPulling="2025-10-08 22:25:24.190878905 +0000 
UTC m=+2224.016264655" observedRunningTime="2025-10-08 22:25:24.660342335 +0000 UTC m=+2224.485728125" watchObservedRunningTime="2025-10-08 22:25:24.667028409 +0000 UTC m=+2224.492414159" Oct 08 22:26:08 crc kubenswrapper[4739]: I1008 22:26:08.131881 4739 generic.go:334] "Generic (PLEG): container finished" podID="f2642ecf-dc6d-4f4e-94e7-2f76db914748" containerID="cb5892c49f9b0537e7b66a49813564c8e345f1780e6911f9ccfc44182879a6d9" exitCode=0 Oct 08 22:26:08 crc kubenswrapper[4739]: I1008 22:26:08.132525 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" event={"ID":"f2642ecf-dc6d-4f4e-94e7-2f76db914748","Type":"ContainerDied","Data":"cb5892c49f9b0537e7b66a49813564c8e345f1780e6911f9ccfc44182879a6d9"} Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.569819 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.702344 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-ssh-key\") pod \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.702399 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-nova-combined-ca-bundle\") pod \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.702435 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-ovn-combined-ca-bundle\") pod 
\"f2642ecf-dc6d-4f4e-94e7-2f76db914748\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.702458 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-bootstrap-combined-ca-bundle\") pod \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.702510 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f2642ecf-dc6d-4f4e-94e7-2f76db914748-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.702533 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f2642ecf-dc6d-4f4e-94e7-2f76db914748-openstack-edpm-ipam-ovn-default-certs-0\") pod \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.702581 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f2642ecf-dc6d-4f4e-94e7-2f76db914748-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.702654 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-neutron-metadata-combined-ca-bundle\") pod \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.702759 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-telemetry-combined-ca-bundle\") pod \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.702785 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-libvirt-combined-ca-bundle\") pod \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.702871 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f2642ecf-dc6d-4f4e-94e7-2f76db914748-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.702901 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-repo-setup-combined-ca-bundle\") pod \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.702934 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx5wm\" (UniqueName: 
\"kubernetes.io/projected/f2642ecf-dc6d-4f4e-94e7-2f76db914748-kube-api-access-gx5wm\") pod \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.702961 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-inventory\") pod \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\" (UID: \"f2642ecf-dc6d-4f4e-94e7-2f76db914748\") " Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.711449 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "f2642ecf-dc6d-4f4e-94e7-2f76db914748" (UID: "f2642ecf-dc6d-4f4e-94e7-2f76db914748"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.711503 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "f2642ecf-dc6d-4f4e-94e7-2f76db914748" (UID: "f2642ecf-dc6d-4f4e-94e7-2f76db914748"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.711531 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2642ecf-dc6d-4f4e-94e7-2f76db914748-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "f2642ecf-dc6d-4f4e-94e7-2f76db914748" (UID: "f2642ecf-dc6d-4f4e-94e7-2f76db914748"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.712321 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2642ecf-dc6d-4f4e-94e7-2f76db914748-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "f2642ecf-dc6d-4f4e-94e7-2f76db914748" (UID: "f2642ecf-dc6d-4f4e-94e7-2f76db914748"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.712466 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2642ecf-dc6d-4f4e-94e7-2f76db914748-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "f2642ecf-dc6d-4f4e-94e7-2f76db914748" (UID: "f2642ecf-dc6d-4f4e-94e7-2f76db914748"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.713257 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "f2642ecf-dc6d-4f4e-94e7-2f76db914748" (UID: "f2642ecf-dc6d-4f4e-94e7-2f76db914748"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.715327 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "f2642ecf-dc6d-4f4e-94e7-2f76db914748" (UID: "f2642ecf-dc6d-4f4e-94e7-2f76db914748"). 
InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.715354 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2642ecf-dc6d-4f4e-94e7-2f76db914748-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "f2642ecf-dc6d-4f4e-94e7-2f76db914748" (UID: "f2642ecf-dc6d-4f4e-94e7-2f76db914748"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.715379 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "f2642ecf-dc6d-4f4e-94e7-2f76db914748" (UID: "f2642ecf-dc6d-4f4e-94e7-2f76db914748"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.715711 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "f2642ecf-dc6d-4f4e-94e7-2f76db914748" (UID: "f2642ecf-dc6d-4f4e-94e7-2f76db914748"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.716894 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2642ecf-dc6d-4f4e-94e7-2f76db914748-kube-api-access-gx5wm" (OuterVolumeSpecName: "kube-api-access-gx5wm") pod "f2642ecf-dc6d-4f4e-94e7-2f76db914748" (UID: "f2642ecf-dc6d-4f4e-94e7-2f76db914748"). 
InnerVolumeSpecName "kube-api-access-gx5wm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.716997 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "f2642ecf-dc6d-4f4e-94e7-2f76db914748" (UID: "f2642ecf-dc6d-4f4e-94e7-2f76db914748"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.739217 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f2642ecf-dc6d-4f4e-94e7-2f76db914748" (UID: "f2642ecf-dc6d-4f4e-94e7-2f76db914748"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.765540 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-inventory" (OuterVolumeSpecName: "inventory") pod "f2642ecf-dc6d-4f4e-94e7-2f76db914748" (UID: "f2642ecf-dc6d-4f4e-94e7-2f76db914748"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.804939 4739 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f2642ecf-dc6d-4f4e-94e7-2f76db914748-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.804970 4739 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.804982 4739 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.804990 4739 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.805000 4739 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f2642ecf-dc6d-4f4e-94e7-2f76db914748-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.805011 4739 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.805020 4739 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gx5wm\" (UniqueName: \"kubernetes.io/projected/f2642ecf-dc6d-4f4e-94e7-2f76db914748-kube-api-access-gx5wm\") on node \"crc\" DevicePath \"\"" Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.805029 4739 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.805037 4739 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.805044 4739 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.805055 4739 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.805063 4739 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2642ecf-dc6d-4f4e-94e7-2f76db914748-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.805071 4739 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f2642ecf-dc6d-4f4e-94e7-2f76db914748-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 08 22:26:09 crc kubenswrapper[4739]: I1008 22:26:09.805082 4739 reconciler_common.go:293] 
"Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f2642ecf-dc6d-4f4e-94e7-2f76db914748-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.154107 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" event={"ID":"f2642ecf-dc6d-4f4e-94e7-2f76db914748","Type":"ContainerDied","Data":"00714cc8b9d55fae8c3281811d64fdcd894be0d9d9f07675216ad2a08180adcc"} Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.154458 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00714cc8b9d55fae8c3281811d64fdcd894be0d9d9f07675216ad2a08180adcc" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.154378 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.180344 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9vxpb"] Oct 08 22:26:10 crc kubenswrapper[4739]: E1008 22:26:10.180992 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2642ecf-dc6d-4f4e-94e7-2f76db914748" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.181020 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2642ecf-dc6d-4f4e-94e7-2f76db914748" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.181311 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2642ecf-dc6d-4f4e-94e7-2f76db914748" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.183481 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9vxpb" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.187931 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9vxpb"] Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.281191 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-zh4p8"] Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.283170 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zh4p8" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.285102 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.285565 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.285874 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.286195 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl5kv" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.288287 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.316566 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-zh4p8"] Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.316975 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd59484c-ad5e-4b95-9b6d-abbc7f224a17-catalog-content\") pod 
\"community-operators-9vxpb\" (UID: \"bd59484c-ad5e-4b95-9b6d-abbc7f224a17\") " pod="openshift-marketplace/community-operators-9vxpb" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.317234 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd59484c-ad5e-4b95-9b6d-abbc7f224a17-utilities\") pod \"community-operators-9vxpb\" (UID: \"bd59484c-ad5e-4b95-9b6d-abbc7f224a17\") " pod="openshift-marketplace/community-operators-9vxpb" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.317567 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ld8q\" (UniqueName: \"kubernetes.io/projected/bd59484c-ad5e-4b95-9b6d-abbc7f224a17-kube-api-access-7ld8q\") pod \"community-operators-9vxpb\" (UID: \"bd59484c-ad5e-4b95-9b6d-abbc7f224a17\") " pod="openshift-marketplace/community-operators-9vxpb" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.419885 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd59484c-ad5e-4b95-9b6d-abbc7f224a17-catalog-content\") pod \"community-operators-9vxpb\" (UID: \"bd59484c-ad5e-4b95-9b6d-abbc7f224a17\") " pod="openshift-marketplace/community-operators-9vxpb" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.420655 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd59484c-ad5e-4b95-9b6d-abbc7f224a17-catalog-content\") pod \"community-operators-9vxpb\" (UID: \"bd59484c-ad5e-4b95-9b6d-abbc7f224a17\") " pod="openshift-marketplace/community-operators-9vxpb" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.420862 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd59484c-ad5e-4b95-9b6d-abbc7f224a17-utilities\") 
pod \"community-operators-9vxpb\" (UID: \"bd59484c-ad5e-4b95-9b6d-abbc7f224a17\") " pod="openshift-marketplace/community-operators-9vxpb" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.420899 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97d1ee4d-475f-4607-b01d-3d51e6ab179e-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zh4p8\" (UID: \"97d1ee4d-475f-4607-b01d-3d51e6ab179e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zh4p8" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.420925 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/97d1ee4d-475f-4607-b01d-3d51e6ab179e-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zh4p8\" (UID: \"97d1ee4d-475f-4607-b01d-3d51e6ab179e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zh4p8" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.421013 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgl8g\" (UniqueName: \"kubernetes.io/projected/97d1ee4d-475f-4607-b01d-3d51e6ab179e-kube-api-access-cgl8g\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zh4p8\" (UID: \"97d1ee4d-475f-4607-b01d-3d51e6ab179e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zh4p8" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.421304 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/97d1ee4d-475f-4607-b01d-3d51e6ab179e-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zh4p8\" (UID: \"97d1ee4d-475f-4607-b01d-3d51e6ab179e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zh4p8" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.421357 
4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/97d1ee4d-475f-4607-b01d-3d51e6ab179e-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zh4p8\" (UID: \"97d1ee4d-475f-4607-b01d-3d51e6ab179e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zh4p8" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.421428 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd59484c-ad5e-4b95-9b6d-abbc7f224a17-utilities\") pod \"community-operators-9vxpb\" (UID: \"bd59484c-ad5e-4b95-9b6d-abbc7f224a17\") " pod="openshift-marketplace/community-operators-9vxpb" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.421455 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ld8q\" (UniqueName: \"kubernetes.io/projected/bd59484c-ad5e-4b95-9b6d-abbc7f224a17-kube-api-access-7ld8q\") pod \"community-operators-9vxpb\" (UID: \"bd59484c-ad5e-4b95-9b6d-abbc7f224a17\") " pod="openshift-marketplace/community-operators-9vxpb" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.445039 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ld8q\" (UniqueName: \"kubernetes.io/projected/bd59484c-ad5e-4b95-9b6d-abbc7f224a17-kube-api-access-7ld8q\") pod \"community-operators-9vxpb\" (UID: \"bd59484c-ad5e-4b95-9b6d-abbc7f224a17\") " pod="openshift-marketplace/community-operators-9vxpb" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.523187 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97d1ee4d-475f-4607-b01d-3d51e6ab179e-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zh4p8\" (UID: \"97d1ee4d-475f-4607-b01d-3d51e6ab179e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zh4p8" Oct 08 
22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.523507 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/97d1ee4d-475f-4607-b01d-3d51e6ab179e-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zh4p8\" (UID: \"97d1ee4d-475f-4607-b01d-3d51e6ab179e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zh4p8" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.523539 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgl8g\" (UniqueName: \"kubernetes.io/projected/97d1ee4d-475f-4607-b01d-3d51e6ab179e-kube-api-access-cgl8g\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zh4p8\" (UID: \"97d1ee4d-475f-4607-b01d-3d51e6ab179e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zh4p8" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.523592 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/97d1ee4d-475f-4607-b01d-3d51e6ab179e-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zh4p8\" (UID: \"97d1ee4d-475f-4607-b01d-3d51e6ab179e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zh4p8" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.523613 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/97d1ee4d-475f-4607-b01d-3d51e6ab179e-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zh4p8\" (UID: \"97d1ee4d-475f-4607-b01d-3d51e6ab179e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zh4p8" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.524504 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/97d1ee4d-475f-4607-b01d-3d51e6ab179e-ovncontroller-config-0\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-zh4p8\" (UID: \"97d1ee4d-475f-4607-b01d-3d51e6ab179e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zh4p8" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.528212 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/97d1ee4d-475f-4607-b01d-3d51e6ab179e-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zh4p8\" (UID: \"97d1ee4d-475f-4607-b01d-3d51e6ab179e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zh4p8" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.529284 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/97d1ee4d-475f-4607-b01d-3d51e6ab179e-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zh4p8\" (UID: \"97d1ee4d-475f-4607-b01d-3d51e6ab179e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zh4p8" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.530591 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97d1ee4d-475f-4607-b01d-3d51e6ab179e-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zh4p8\" (UID: \"97d1ee4d-475f-4607-b01d-3d51e6ab179e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zh4p8" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.544665 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9vxpb" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.546856 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgl8g\" (UniqueName: \"kubernetes.io/projected/97d1ee4d-475f-4607-b01d-3d51e6ab179e-kube-api-access-cgl8g\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zh4p8\" (UID: \"97d1ee4d-475f-4607-b01d-3d51e6ab179e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zh4p8" Oct 08 22:26:10 crc kubenswrapper[4739]: I1008 22:26:10.608911 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zh4p8" Oct 08 22:26:11 crc kubenswrapper[4739]: I1008 22:26:11.102856 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9vxpb"] Oct 08 22:26:11 crc kubenswrapper[4739]: I1008 22:26:11.165701 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vxpb" event={"ID":"bd59484c-ad5e-4b95-9b6d-abbc7f224a17","Type":"ContainerStarted","Data":"e9b0b82a95a4600b3c76663bf19aed8c36d589adb65fdb30257c67e63e6672b8"} Oct 08 22:26:11 crc kubenswrapper[4739]: I1008 22:26:11.312084 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-zh4p8"] Oct 08 22:26:11 crc kubenswrapper[4739]: I1008 22:26:11.319611 4739 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 22:26:12 crc kubenswrapper[4739]: I1008 22:26:12.179019 4739 generic.go:334] "Generic (PLEG): container finished" podID="bd59484c-ad5e-4b95-9b6d-abbc7f224a17" containerID="6bf4ba03dafe163cc03bf52fe8ba843d926ae4e239e197270e9298a0271a9d69" exitCode=0 Oct 08 22:26:12 crc kubenswrapper[4739]: I1008 22:26:12.179118 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vxpb" 
event={"ID":"bd59484c-ad5e-4b95-9b6d-abbc7f224a17","Type":"ContainerDied","Data":"6bf4ba03dafe163cc03bf52fe8ba843d926ae4e239e197270e9298a0271a9d69"} Oct 08 22:26:12 crc kubenswrapper[4739]: I1008 22:26:12.185787 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zh4p8" event={"ID":"97d1ee4d-475f-4607-b01d-3d51e6ab179e","Type":"ContainerStarted","Data":"9d8b2e4001441f6207fa338265dbc622ff96162119ccf5bac59997b051b69f19"} Oct 08 22:26:14 crc kubenswrapper[4739]: I1008 22:26:14.210831 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zh4p8" event={"ID":"97d1ee4d-475f-4607-b01d-3d51e6ab179e","Type":"ContainerStarted","Data":"9bb46a38f923018ea92167a38196a92928da099c60caacd130777dde9bb11ac3"} Oct 08 22:26:14 crc kubenswrapper[4739]: I1008 22:26:14.234172 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zh4p8" podStartSLOduration=2.654174389 podStartE2EDuration="4.23413804s" podCreationTimestamp="2025-10-08 22:26:10 +0000 UTC" firstStartedPulling="2025-10-08 22:26:11.319334092 +0000 UTC m=+2271.144719842" lastFinishedPulling="2025-10-08 22:26:12.899297743 +0000 UTC m=+2272.724683493" observedRunningTime="2025-10-08 22:26:14.226336648 +0000 UTC m=+2274.051722398" watchObservedRunningTime="2025-10-08 22:26:14.23413804 +0000 UTC m=+2274.059523790" Oct 08 22:26:15 crc kubenswrapper[4739]: I1008 22:26:15.223831 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vxpb" event={"ID":"bd59484c-ad5e-4b95-9b6d-abbc7f224a17","Type":"ContainerStarted","Data":"fb24aa03ad760c547173dcebad3b0b8fd4392c6884427e55257c42c2048574d7"} Oct 08 22:26:17 crc kubenswrapper[4739]: I1008 22:26:17.247117 4739 generic.go:334] "Generic (PLEG): container finished" podID="bd59484c-ad5e-4b95-9b6d-abbc7f224a17" 
containerID="fb24aa03ad760c547173dcebad3b0b8fd4392c6884427e55257c42c2048574d7" exitCode=0 Oct 08 22:26:17 crc kubenswrapper[4739]: I1008 22:26:17.247180 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vxpb" event={"ID":"bd59484c-ad5e-4b95-9b6d-abbc7f224a17","Type":"ContainerDied","Data":"fb24aa03ad760c547173dcebad3b0b8fd4392c6884427e55257c42c2048574d7"} Oct 08 22:26:19 crc kubenswrapper[4739]: I1008 22:26:19.271632 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vxpb" event={"ID":"bd59484c-ad5e-4b95-9b6d-abbc7f224a17","Type":"ContainerStarted","Data":"b86a81eeb5712e5622092dd2a202ec714bf0dcf2e53ebe1518745a5846f7e184"} Oct 08 22:26:19 crc kubenswrapper[4739]: I1008 22:26:19.291864 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9vxpb" podStartSLOduration=2.939683774 podStartE2EDuration="9.291838953s" podCreationTimestamp="2025-10-08 22:26:10 +0000 UTC" firstStartedPulling="2025-10-08 22:26:12.180945135 +0000 UTC m=+2272.006330885" lastFinishedPulling="2025-10-08 22:26:18.533100304 +0000 UTC m=+2278.358486064" observedRunningTime="2025-10-08 22:26:19.291366882 +0000 UTC m=+2279.116752672" watchObservedRunningTime="2025-10-08 22:26:19.291838953 +0000 UTC m=+2279.117224723" Oct 08 22:26:20 crc kubenswrapper[4739]: I1008 22:26:20.545850 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9vxpb" Oct 08 22:26:20 crc kubenswrapper[4739]: I1008 22:26:20.546496 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9vxpb" Oct 08 22:26:21 crc kubenswrapper[4739]: I1008 22:26:21.612417 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-9vxpb" podUID="bd59484c-ad5e-4b95-9b6d-abbc7f224a17" containerName="registry-server" 
probeResult="failure" output=< Oct 08 22:26:21 crc kubenswrapper[4739]: timeout: failed to connect service ":50051" within 1s Oct 08 22:26:21 crc kubenswrapper[4739]: > Oct 08 22:26:30 crc kubenswrapper[4739]: I1008 22:26:30.611049 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9vxpb" Oct 08 22:26:30 crc kubenswrapper[4739]: I1008 22:26:30.680592 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9vxpb" Oct 08 22:26:30 crc kubenswrapper[4739]: I1008 22:26:30.856499 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9vxpb"] Oct 08 22:26:32 crc kubenswrapper[4739]: I1008 22:26:32.427525 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9vxpb" podUID="bd59484c-ad5e-4b95-9b6d-abbc7f224a17" containerName="registry-server" containerID="cri-o://b86a81eeb5712e5622092dd2a202ec714bf0dcf2e53ebe1518745a5846f7e184" gracePeriod=2 Oct 08 22:26:33 crc kubenswrapper[4739]: I1008 22:26:33.440656 4739 generic.go:334] "Generic (PLEG): container finished" podID="bd59484c-ad5e-4b95-9b6d-abbc7f224a17" containerID="b86a81eeb5712e5622092dd2a202ec714bf0dcf2e53ebe1518745a5846f7e184" exitCode=0 Oct 08 22:26:33 crc kubenswrapper[4739]: I1008 22:26:33.440746 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vxpb" event={"ID":"bd59484c-ad5e-4b95-9b6d-abbc7f224a17","Type":"ContainerDied","Data":"b86a81eeb5712e5622092dd2a202ec714bf0dcf2e53ebe1518745a5846f7e184"} Oct 08 22:26:33 crc kubenswrapper[4739]: I1008 22:26:33.545508 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9vxpb" Oct 08 22:26:33 crc kubenswrapper[4739]: I1008 22:26:33.683371 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd59484c-ad5e-4b95-9b6d-abbc7f224a17-catalog-content\") pod \"bd59484c-ad5e-4b95-9b6d-abbc7f224a17\" (UID: \"bd59484c-ad5e-4b95-9b6d-abbc7f224a17\") " Oct 08 22:26:33 crc kubenswrapper[4739]: I1008 22:26:33.683542 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd59484c-ad5e-4b95-9b6d-abbc7f224a17-utilities\") pod \"bd59484c-ad5e-4b95-9b6d-abbc7f224a17\" (UID: \"bd59484c-ad5e-4b95-9b6d-abbc7f224a17\") " Oct 08 22:26:33 crc kubenswrapper[4739]: I1008 22:26:33.683706 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ld8q\" (UniqueName: \"kubernetes.io/projected/bd59484c-ad5e-4b95-9b6d-abbc7f224a17-kube-api-access-7ld8q\") pod \"bd59484c-ad5e-4b95-9b6d-abbc7f224a17\" (UID: \"bd59484c-ad5e-4b95-9b6d-abbc7f224a17\") " Oct 08 22:26:33 crc kubenswrapper[4739]: I1008 22:26:33.684434 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd59484c-ad5e-4b95-9b6d-abbc7f224a17-utilities" (OuterVolumeSpecName: "utilities") pod "bd59484c-ad5e-4b95-9b6d-abbc7f224a17" (UID: "bd59484c-ad5e-4b95-9b6d-abbc7f224a17"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:26:33 crc kubenswrapper[4739]: I1008 22:26:33.689530 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd59484c-ad5e-4b95-9b6d-abbc7f224a17-kube-api-access-7ld8q" (OuterVolumeSpecName: "kube-api-access-7ld8q") pod "bd59484c-ad5e-4b95-9b6d-abbc7f224a17" (UID: "bd59484c-ad5e-4b95-9b6d-abbc7f224a17"). InnerVolumeSpecName "kube-api-access-7ld8q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:26:33 crc kubenswrapper[4739]: I1008 22:26:33.729966 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd59484c-ad5e-4b95-9b6d-abbc7f224a17-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd59484c-ad5e-4b95-9b6d-abbc7f224a17" (UID: "bd59484c-ad5e-4b95-9b6d-abbc7f224a17"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:26:33 crc kubenswrapper[4739]: I1008 22:26:33.786912 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd59484c-ad5e-4b95-9b6d-abbc7f224a17-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:26:33 crc kubenswrapper[4739]: I1008 22:26:33.786949 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd59484c-ad5e-4b95-9b6d-abbc7f224a17-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:26:33 crc kubenswrapper[4739]: I1008 22:26:33.786962 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ld8q\" (UniqueName: \"kubernetes.io/projected/bd59484c-ad5e-4b95-9b6d-abbc7f224a17-kube-api-access-7ld8q\") on node \"crc\" DevicePath \"\"" Oct 08 22:26:34 crc kubenswrapper[4739]: I1008 22:26:34.456569 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vxpb" event={"ID":"bd59484c-ad5e-4b95-9b6d-abbc7f224a17","Type":"ContainerDied","Data":"e9b0b82a95a4600b3c76663bf19aed8c36d589adb65fdb30257c67e63e6672b8"} Oct 08 22:26:34 crc kubenswrapper[4739]: I1008 22:26:34.457085 4739 scope.go:117] "RemoveContainer" containerID="b86a81eeb5712e5622092dd2a202ec714bf0dcf2e53ebe1518745a5846f7e184" Oct 08 22:26:34 crc kubenswrapper[4739]: I1008 22:26:34.456662 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9vxpb" Oct 08 22:26:34 crc kubenswrapper[4739]: I1008 22:26:34.512050 4739 scope.go:117] "RemoveContainer" containerID="fb24aa03ad760c547173dcebad3b0b8fd4392c6884427e55257c42c2048574d7" Oct 08 22:26:34 crc kubenswrapper[4739]: I1008 22:26:34.526302 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9vxpb"] Oct 08 22:26:34 crc kubenswrapper[4739]: I1008 22:26:34.538573 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9vxpb"] Oct 08 22:26:34 crc kubenswrapper[4739]: I1008 22:26:34.539960 4739 scope.go:117] "RemoveContainer" containerID="6bf4ba03dafe163cc03bf52fe8ba843d926ae4e239e197270e9298a0271a9d69" Oct 08 22:26:35 crc kubenswrapper[4739]: I1008 22:26:35.840453 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd59484c-ad5e-4b95-9b6d-abbc7f224a17" path="/var/lib/kubelet/pods/bd59484c-ad5e-4b95-9b6d-abbc7f224a17/volumes" Oct 08 22:27:21 crc kubenswrapper[4739]: I1008 22:27:21.766020 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:27:21 crc kubenswrapper[4739]: I1008 22:27:21.766680 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:27:33 crc kubenswrapper[4739]: I1008 22:27:33.178566 4739 generic.go:334] "Generic (PLEG): container finished" podID="97d1ee4d-475f-4607-b01d-3d51e6ab179e" 
containerID="9bb46a38f923018ea92167a38196a92928da099c60caacd130777dde9bb11ac3" exitCode=0 Oct 08 22:27:33 crc kubenswrapper[4739]: I1008 22:27:33.178664 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zh4p8" event={"ID":"97d1ee4d-475f-4607-b01d-3d51e6ab179e","Type":"ContainerDied","Data":"9bb46a38f923018ea92167a38196a92928da099c60caacd130777dde9bb11ac3"} Oct 08 22:27:34 crc kubenswrapper[4739]: I1008 22:27:34.630494 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zh4p8" Oct 08 22:27:34 crc kubenswrapper[4739]: I1008 22:27:34.771296 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/97d1ee4d-475f-4607-b01d-3d51e6ab179e-inventory\") pod \"97d1ee4d-475f-4607-b01d-3d51e6ab179e\" (UID: \"97d1ee4d-475f-4607-b01d-3d51e6ab179e\") " Oct 08 22:27:34 crc kubenswrapper[4739]: I1008 22:27:34.771381 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/97d1ee4d-475f-4607-b01d-3d51e6ab179e-ovncontroller-config-0\") pod \"97d1ee4d-475f-4607-b01d-3d51e6ab179e\" (UID: \"97d1ee4d-475f-4607-b01d-3d51e6ab179e\") " Oct 08 22:27:34 crc kubenswrapper[4739]: I1008 22:27:34.771537 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgl8g\" (UniqueName: \"kubernetes.io/projected/97d1ee4d-475f-4607-b01d-3d51e6ab179e-kube-api-access-cgl8g\") pod \"97d1ee4d-475f-4607-b01d-3d51e6ab179e\" (UID: \"97d1ee4d-475f-4607-b01d-3d51e6ab179e\") " Oct 08 22:27:34 crc kubenswrapper[4739]: I1008 22:27:34.771672 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97d1ee4d-475f-4607-b01d-3d51e6ab179e-ovn-combined-ca-bundle\") pod 
\"97d1ee4d-475f-4607-b01d-3d51e6ab179e\" (UID: \"97d1ee4d-475f-4607-b01d-3d51e6ab179e\") " Oct 08 22:27:34 crc kubenswrapper[4739]: I1008 22:27:34.771790 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/97d1ee4d-475f-4607-b01d-3d51e6ab179e-ssh-key\") pod \"97d1ee4d-475f-4607-b01d-3d51e6ab179e\" (UID: \"97d1ee4d-475f-4607-b01d-3d51e6ab179e\") " Oct 08 22:27:34 crc kubenswrapper[4739]: I1008 22:27:34.777903 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97d1ee4d-475f-4607-b01d-3d51e6ab179e-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "97d1ee4d-475f-4607-b01d-3d51e6ab179e" (UID: "97d1ee4d-475f-4607-b01d-3d51e6ab179e"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:27:34 crc kubenswrapper[4739]: I1008 22:27:34.781231 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97d1ee4d-475f-4607-b01d-3d51e6ab179e-kube-api-access-cgl8g" (OuterVolumeSpecName: "kube-api-access-cgl8g") pod "97d1ee4d-475f-4607-b01d-3d51e6ab179e" (UID: "97d1ee4d-475f-4607-b01d-3d51e6ab179e"). InnerVolumeSpecName "kube-api-access-cgl8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:27:34 crc kubenswrapper[4739]: I1008 22:27:34.805290 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97d1ee4d-475f-4607-b01d-3d51e6ab179e-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "97d1ee4d-475f-4607-b01d-3d51e6ab179e" (UID: "97d1ee4d-475f-4607-b01d-3d51e6ab179e"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:27:34 crc kubenswrapper[4739]: I1008 22:27:34.806993 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97d1ee4d-475f-4607-b01d-3d51e6ab179e-inventory" (OuterVolumeSpecName: "inventory") pod "97d1ee4d-475f-4607-b01d-3d51e6ab179e" (UID: "97d1ee4d-475f-4607-b01d-3d51e6ab179e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:27:34 crc kubenswrapper[4739]: I1008 22:27:34.809516 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97d1ee4d-475f-4607-b01d-3d51e6ab179e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "97d1ee4d-475f-4607-b01d-3d51e6ab179e" (UID: "97d1ee4d-475f-4607-b01d-3d51e6ab179e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:27:34 crc kubenswrapper[4739]: I1008 22:27:34.874130 4739 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/97d1ee4d-475f-4607-b01d-3d51e6ab179e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 22:27:34 crc kubenswrapper[4739]: I1008 22:27:34.874305 4739 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/97d1ee4d-475f-4607-b01d-3d51e6ab179e-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 22:27:34 crc kubenswrapper[4739]: I1008 22:27:34.874365 4739 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/97d1ee4d-475f-4607-b01d-3d51e6ab179e-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 08 22:27:34 crc kubenswrapper[4739]: I1008 22:27:34.874430 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgl8g\" (UniqueName: \"kubernetes.io/projected/97d1ee4d-475f-4607-b01d-3d51e6ab179e-kube-api-access-cgl8g\") on node \"crc\" DevicePath \"\"" Oct 08 22:27:34 crc 
kubenswrapper[4739]: I1008 22:27:34.874487 4739 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97d1ee4d-475f-4607-b01d-3d51e6ab179e-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.200084 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zh4p8" event={"ID":"97d1ee4d-475f-4607-b01d-3d51e6ab179e","Type":"ContainerDied","Data":"9d8b2e4001441f6207fa338265dbc622ff96162119ccf5bac59997b051b69f19"} Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.200329 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d8b2e4001441f6207fa338265dbc622ff96162119ccf5bac59997b051b69f19" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.200188 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zh4p8" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.302403 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f"] Oct 08 22:27:35 crc kubenswrapper[4739]: E1008 22:27:35.302815 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd59484c-ad5e-4b95-9b6d-abbc7f224a17" containerName="extract-utilities" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.302836 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd59484c-ad5e-4b95-9b6d-abbc7f224a17" containerName="extract-utilities" Oct 08 22:27:35 crc kubenswrapper[4739]: E1008 22:27:35.302852 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd59484c-ad5e-4b95-9b6d-abbc7f224a17" containerName="extract-content" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.302859 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd59484c-ad5e-4b95-9b6d-abbc7f224a17" 
containerName="extract-content" Oct 08 22:27:35 crc kubenswrapper[4739]: E1008 22:27:35.302876 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd59484c-ad5e-4b95-9b6d-abbc7f224a17" containerName="registry-server" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.302885 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd59484c-ad5e-4b95-9b6d-abbc7f224a17" containerName="registry-server" Oct 08 22:27:35 crc kubenswrapper[4739]: E1008 22:27:35.302899 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97d1ee4d-475f-4607-b01d-3d51e6ab179e" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.302906 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="97d1ee4d-475f-4607-b01d-3d51e6ab179e" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.303184 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="97d1ee4d-475f-4607-b01d-3d51e6ab179e" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.303203 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd59484c-ad5e-4b95-9b6d-abbc7f224a17" containerName="registry-server" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.304043 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.309000 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.309065 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.309214 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.309354 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.309417 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.309389 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl5kv" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.316515 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f"] Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.384176 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40321558-aaa1-4ba3-8417-69c969745cfa-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f\" (UID: \"40321558-aaa1-4ba3-8417-69c969745cfa\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.384518 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/40321558-aaa1-4ba3-8417-69c969745cfa-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f\" (UID: \"40321558-aaa1-4ba3-8417-69c969745cfa\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.384660 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w829l\" (UniqueName: \"kubernetes.io/projected/40321558-aaa1-4ba3-8417-69c969745cfa-kube-api-access-w829l\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f\" (UID: \"40321558-aaa1-4ba3-8417-69c969745cfa\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.384808 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40321558-aaa1-4ba3-8417-69c969745cfa-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f\" (UID: \"40321558-aaa1-4ba3-8417-69c969745cfa\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.384941 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40321558-aaa1-4ba3-8417-69c969745cfa-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f\" (UID: \"40321558-aaa1-4ba3-8417-69c969745cfa\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.385218 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/40321558-aaa1-4ba3-8417-69c969745cfa-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f\" (UID: \"40321558-aaa1-4ba3-8417-69c969745cfa\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.486723 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/40321558-aaa1-4ba3-8417-69c969745cfa-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f\" (UID: \"40321558-aaa1-4ba3-8417-69c969745cfa\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.486829 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40321558-aaa1-4ba3-8417-69c969745cfa-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f\" (UID: \"40321558-aaa1-4ba3-8417-69c969745cfa\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.486874 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/40321558-aaa1-4ba3-8417-69c969745cfa-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f\" (UID: \"40321558-aaa1-4ba3-8417-69c969745cfa\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.486896 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w829l\" (UniqueName: 
\"kubernetes.io/projected/40321558-aaa1-4ba3-8417-69c969745cfa-kube-api-access-w829l\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f\" (UID: \"40321558-aaa1-4ba3-8417-69c969745cfa\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.486940 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40321558-aaa1-4ba3-8417-69c969745cfa-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f\" (UID: \"40321558-aaa1-4ba3-8417-69c969745cfa\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.486962 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40321558-aaa1-4ba3-8417-69c969745cfa-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f\" (UID: \"40321558-aaa1-4ba3-8417-69c969745cfa\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.492523 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/40321558-aaa1-4ba3-8417-69c969745cfa-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f\" (UID: \"40321558-aaa1-4ba3-8417-69c969745cfa\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.493477 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/40321558-aaa1-4ba3-8417-69c969745cfa-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f\" (UID: 
\"40321558-aaa1-4ba3-8417-69c969745cfa\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.495495 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40321558-aaa1-4ba3-8417-69c969745cfa-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f\" (UID: \"40321558-aaa1-4ba3-8417-69c969745cfa\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.496303 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40321558-aaa1-4ba3-8417-69c969745cfa-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f\" (UID: \"40321558-aaa1-4ba3-8417-69c969745cfa\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.504025 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40321558-aaa1-4ba3-8417-69c969745cfa-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f\" (UID: \"40321558-aaa1-4ba3-8417-69c969745cfa\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.507955 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w829l\" (UniqueName: \"kubernetes.io/projected/40321558-aaa1-4ba3-8417-69c969745cfa-kube-api-access-w829l\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f\" (UID: \"40321558-aaa1-4ba3-8417-69c969745cfa\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f" Oct 08 22:27:35 crc kubenswrapper[4739]: I1008 22:27:35.666961 4739 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f" Oct 08 22:27:36 crc kubenswrapper[4739]: I1008 22:27:36.217034 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f"] Oct 08 22:27:37 crc kubenswrapper[4739]: I1008 22:27:37.224742 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f" event={"ID":"40321558-aaa1-4ba3-8417-69c969745cfa","Type":"ContainerStarted","Data":"078f1b9f4e3c4ba8fc9fde03d1a26d1b00c2471d32b1430238dd06e05d80d0d6"} Oct 08 22:27:38 crc kubenswrapper[4739]: I1008 22:27:38.238542 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f" event={"ID":"40321558-aaa1-4ba3-8417-69c969745cfa","Type":"ContainerStarted","Data":"823eddfba2820e0f3786e265ec573270fab5d3159b6e2a058a1cb3d938690a67"} Oct 08 22:27:38 crc kubenswrapper[4739]: I1008 22:27:38.260795 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f" podStartSLOduration=2.365931849 podStartE2EDuration="3.260775658s" podCreationTimestamp="2025-10-08 22:27:35 +0000 UTC" firstStartedPulling="2025-10-08 22:27:36.222575022 +0000 UTC m=+2356.047960792" lastFinishedPulling="2025-10-08 22:27:37.117418811 +0000 UTC m=+2356.942804601" observedRunningTime="2025-10-08 22:27:38.254920705 +0000 UTC m=+2358.080306455" watchObservedRunningTime="2025-10-08 22:27:38.260775658 +0000 UTC m=+2358.086161408" Oct 08 22:27:51 crc kubenswrapper[4739]: I1008 22:27:51.766680 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Oct 08 22:27:51 crc kubenswrapper[4739]: I1008 22:27:51.767542 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:28:21 crc kubenswrapper[4739]: I1008 22:28:21.766184 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:28:21 crc kubenswrapper[4739]: I1008 22:28:21.767170 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:28:21 crc kubenswrapper[4739]: I1008 22:28:21.767236 4739 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" Oct 08 22:28:21 crc kubenswrapper[4739]: I1008 22:28:21.768061 4739 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f5309d0370559d2fbd4ae1669f92b84d80aaef9eaebaff357d1aa14a7ff31a5d"} pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 22:28:21 crc kubenswrapper[4739]: I1008 22:28:21.768161 4739 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" containerID="cri-o://f5309d0370559d2fbd4ae1669f92b84d80aaef9eaebaff357d1aa14a7ff31a5d" gracePeriod=600 Oct 08 22:28:21 crc kubenswrapper[4739]: E1008 22:28:21.905380 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:28:22 crc kubenswrapper[4739]: I1008 22:28:22.687490 4739 generic.go:334] "Generic (PLEG): container finished" podID="9707b708-016c-4e06-86db-0332e2ca37db" containerID="f5309d0370559d2fbd4ae1669f92b84d80aaef9eaebaff357d1aa14a7ff31a5d" exitCode=0 Oct 08 22:28:22 crc kubenswrapper[4739]: I1008 22:28:22.687547 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerDied","Data":"f5309d0370559d2fbd4ae1669f92b84d80aaef9eaebaff357d1aa14a7ff31a5d"} Oct 08 22:28:22 crc kubenswrapper[4739]: I1008 22:28:22.687636 4739 scope.go:117] "RemoveContainer" containerID="c69d7e306979356b55250a2871dae7b00a44dffff4dc7c74269008f52fd183a9" Oct 08 22:28:22 crc kubenswrapper[4739]: I1008 22:28:22.688296 4739 scope.go:117] "RemoveContainer" containerID="f5309d0370559d2fbd4ae1669f92b84d80aaef9eaebaff357d1aa14a7ff31a5d" Oct 08 22:28:22 crc kubenswrapper[4739]: E1008 22:28:22.688551 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:28:25 crc kubenswrapper[4739]: I1008 22:28:25.736350 4739 generic.go:334] "Generic (PLEG): container finished" podID="40321558-aaa1-4ba3-8417-69c969745cfa" containerID="823eddfba2820e0f3786e265ec573270fab5d3159b6e2a058a1cb3d938690a67" exitCode=0 Oct 08 22:28:25 crc kubenswrapper[4739]: I1008 22:28:25.736440 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f" event={"ID":"40321558-aaa1-4ba3-8417-69c969745cfa","Type":"ContainerDied","Data":"823eddfba2820e0f3786e265ec573270fab5d3159b6e2a058a1cb3d938690a67"} Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.190483 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f" Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.264373 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/40321558-aaa1-4ba3-8417-69c969745cfa-neutron-ovn-metadata-agent-neutron-config-0\") pod \"40321558-aaa1-4ba3-8417-69c969745cfa\" (UID: \"40321558-aaa1-4ba3-8417-69c969745cfa\") " Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.264495 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w829l\" (UniqueName: \"kubernetes.io/projected/40321558-aaa1-4ba3-8417-69c969745cfa-kube-api-access-w829l\") pod \"40321558-aaa1-4ba3-8417-69c969745cfa\" (UID: \"40321558-aaa1-4ba3-8417-69c969745cfa\") " Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.264527 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40321558-aaa1-4ba3-8417-69c969745cfa-neutron-metadata-combined-ca-bundle\") pod \"40321558-aaa1-4ba3-8417-69c969745cfa\" (UID: \"40321558-aaa1-4ba3-8417-69c969745cfa\") " Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.264555 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/40321558-aaa1-4ba3-8417-69c969745cfa-nova-metadata-neutron-config-0\") pod \"40321558-aaa1-4ba3-8417-69c969745cfa\" (UID: \"40321558-aaa1-4ba3-8417-69c969745cfa\") " Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.264644 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40321558-aaa1-4ba3-8417-69c969745cfa-ssh-key\") pod \"40321558-aaa1-4ba3-8417-69c969745cfa\" (UID: \"40321558-aaa1-4ba3-8417-69c969745cfa\") " Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.264723 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40321558-aaa1-4ba3-8417-69c969745cfa-inventory\") pod \"40321558-aaa1-4ba3-8417-69c969745cfa\" (UID: \"40321558-aaa1-4ba3-8417-69c969745cfa\") " Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.271416 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40321558-aaa1-4ba3-8417-69c969745cfa-kube-api-access-w829l" (OuterVolumeSpecName: "kube-api-access-w829l") pod "40321558-aaa1-4ba3-8417-69c969745cfa" (UID: "40321558-aaa1-4ba3-8417-69c969745cfa"). InnerVolumeSpecName "kube-api-access-w829l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.272281 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40321558-aaa1-4ba3-8417-69c969745cfa-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "40321558-aaa1-4ba3-8417-69c969745cfa" (UID: "40321558-aaa1-4ba3-8417-69c969745cfa"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.293274 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40321558-aaa1-4ba3-8417-69c969745cfa-inventory" (OuterVolumeSpecName: "inventory") pod "40321558-aaa1-4ba3-8417-69c969745cfa" (UID: "40321558-aaa1-4ba3-8417-69c969745cfa"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.297683 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40321558-aaa1-4ba3-8417-69c969745cfa-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "40321558-aaa1-4ba3-8417-69c969745cfa" (UID: "40321558-aaa1-4ba3-8417-69c969745cfa"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.301871 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40321558-aaa1-4ba3-8417-69c969745cfa-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "40321558-aaa1-4ba3-8417-69c969745cfa" (UID: "40321558-aaa1-4ba3-8417-69c969745cfa"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.305295 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40321558-aaa1-4ba3-8417-69c969745cfa-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "40321558-aaa1-4ba3-8417-69c969745cfa" (UID: "40321558-aaa1-4ba3-8417-69c969745cfa"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.369046 4739 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40321558-aaa1-4ba3-8417-69c969745cfa-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.369125 4739 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/40321558-aaa1-4ba3-8417-69c969745cfa-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.369184 4739 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40321558-aaa1-4ba3-8417-69c969745cfa-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.369208 4739 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40321558-aaa1-4ba3-8417-69c969745cfa-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.369232 4739 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/40321558-aaa1-4ba3-8417-69c969745cfa-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" 
DevicePath \"\"" Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.369262 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w829l\" (UniqueName: \"kubernetes.io/projected/40321558-aaa1-4ba3-8417-69c969745cfa-kube-api-access-w829l\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.761958 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f" event={"ID":"40321558-aaa1-4ba3-8417-69c969745cfa","Type":"ContainerDied","Data":"078f1b9f4e3c4ba8fc9fde03d1a26d1b00c2471d32b1430238dd06e05d80d0d6"} Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.762012 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="078f1b9f4e3c4ba8fc9fde03d1a26d1b00c2471d32b1430238dd06e05d80d0d6" Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.762094 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f" Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.870017 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wthng"] Oct 08 22:28:27 crc kubenswrapper[4739]: E1008 22:28:27.871331 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40321558-aaa1-4ba3-8417-69c969745cfa" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.871360 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="40321558-aaa1-4ba3-8417-69c969745cfa" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.871730 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="40321558-aaa1-4ba3-8417-69c969745cfa" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 08 22:28:27 crc 
kubenswrapper[4739]: I1008 22:28:27.873369 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wthng" Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.879833 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.880293 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl5kv" Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.880353 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.880398 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.882085 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wthng"] Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.885648 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.983039 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/076ace7f-41ce-4825-9d2f-e49471648888-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wthng\" (UID: \"076ace7f-41ce-4825-9d2f-e49471648888\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wthng" Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.983118 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/076ace7f-41ce-4825-9d2f-e49471648888-inventory\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-wthng\" (UID: \"076ace7f-41ce-4825-9d2f-e49471648888\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wthng" Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.983214 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/076ace7f-41ce-4825-9d2f-e49471648888-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wthng\" (UID: \"076ace7f-41ce-4825-9d2f-e49471648888\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wthng" Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.983244 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076ace7f-41ce-4825-9d2f-e49471648888-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wthng\" (UID: \"076ace7f-41ce-4825-9d2f-e49471648888\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wthng" Oct 08 22:28:27 crc kubenswrapper[4739]: I1008 22:28:27.983334 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npr8s\" (UniqueName: \"kubernetes.io/projected/076ace7f-41ce-4825-9d2f-e49471648888-kube-api-access-npr8s\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wthng\" (UID: \"076ace7f-41ce-4825-9d2f-e49471648888\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wthng" Oct 08 22:28:28 crc kubenswrapper[4739]: I1008 22:28:28.085170 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npr8s\" (UniqueName: \"kubernetes.io/projected/076ace7f-41ce-4825-9d2f-e49471648888-kube-api-access-npr8s\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wthng\" (UID: \"076ace7f-41ce-4825-9d2f-e49471648888\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wthng" Oct 08 22:28:28 crc kubenswrapper[4739]: I1008 22:28:28.085265 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/076ace7f-41ce-4825-9d2f-e49471648888-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wthng\" (UID: \"076ace7f-41ce-4825-9d2f-e49471648888\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wthng" Oct 08 22:28:28 crc kubenswrapper[4739]: I1008 22:28:28.085305 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/076ace7f-41ce-4825-9d2f-e49471648888-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wthng\" (UID: \"076ace7f-41ce-4825-9d2f-e49471648888\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wthng" Oct 08 22:28:28 crc kubenswrapper[4739]: I1008 22:28:28.085331 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/076ace7f-41ce-4825-9d2f-e49471648888-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wthng\" (UID: \"076ace7f-41ce-4825-9d2f-e49471648888\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wthng" Oct 08 22:28:28 crc kubenswrapper[4739]: I1008 22:28:28.085350 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076ace7f-41ce-4825-9d2f-e49471648888-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wthng\" (UID: \"076ace7f-41ce-4825-9d2f-e49471648888\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wthng" Oct 08 22:28:28 crc kubenswrapper[4739]: I1008 22:28:28.090982 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/076ace7f-41ce-4825-9d2f-e49471648888-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wthng\" (UID: \"076ace7f-41ce-4825-9d2f-e49471648888\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wthng" Oct 08 22:28:28 crc kubenswrapper[4739]: I1008 22:28:28.091936 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/076ace7f-41ce-4825-9d2f-e49471648888-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wthng\" (UID: \"076ace7f-41ce-4825-9d2f-e49471648888\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wthng" Oct 08 22:28:28 crc kubenswrapper[4739]: I1008 22:28:28.092580 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/076ace7f-41ce-4825-9d2f-e49471648888-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wthng\" (UID: \"076ace7f-41ce-4825-9d2f-e49471648888\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wthng" Oct 08 22:28:28 crc kubenswrapper[4739]: I1008 22:28:28.093371 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076ace7f-41ce-4825-9d2f-e49471648888-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wthng\" (UID: \"076ace7f-41ce-4825-9d2f-e49471648888\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wthng" Oct 08 22:28:28 crc kubenswrapper[4739]: I1008 22:28:28.106386 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npr8s\" (UniqueName: \"kubernetes.io/projected/076ace7f-41ce-4825-9d2f-e49471648888-kube-api-access-npr8s\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wthng\" (UID: \"076ace7f-41ce-4825-9d2f-e49471648888\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wthng" Oct 08 22:28:28 crc kubenswrapper[4739]: 
I1008 22:28:28.198830 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wthng" Oct 08 22:28:28 crc kubenswrapper[4739]: I1008 22:28:28.537687 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wthng"] Oct 08 22:28:28 crc kubenswrapper[4739]: I1008 22:28:28.771385 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wthng" event={"ID":"076ace7f-41ce-4825-9d2f-e49471648888","Type":"ContainerStarted","Data":"2ff4f77331bd0d5a9236176be2fdd0d4e5460b0318069f0c256db54b2a1f4fe9"} Oct 08 22:28:29 crc kubenswrapper[4739]: I1008 22:28:29.790969 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wthng" event={"ID":"076ace7f-41ce-4825-9d2f-e49471648888","Type":"ContainerStarted","Data":"278ce52eb4f2228e51f9eb8c9935abe0437eb0668b2e5ded50281e1e892ce700"} Oct 08 22:28:29 crc kubenswrapper[4739]: I1008 22:28:29.813281 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wthng" podStartSLOduration=2.345399349 podStartE2EDuration="2.81326113s" podCreationTimestamp="2025-10-08 22:28:27 +0000 UTC" firstStartedPulling="2025-10-08 22:28:28.546115735 +0000 UTC m=+2408.371501485" lastFinishedPulling="2025-10-08 22:28:29.013977516 +0000 UTC m=+2408.839363266" observedRunningTime="2025-10-08 22:28:29.805414438 +0000 UTC m=+2409.630800208" watchObservedRunningTime="2025-10-08 22:28:29.81326113 +0000 UTC m=+2409.638646880" Oct 08 22:28:34 crc kubenswrapper[4739]: I1008 22:28:34.822026 4739 scope.go:117] "RemoveContainer" containerID="f5309d0370559d2fbd4ae1669f92b84d80aaef9eaebaff357d1aa14a7ff31a5d" Oct 08 22:28:34 crc kubenswrapper[4739]: E1008 22:28:34.823140 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:28:45 crc kubenswrapper[4739]: I1008 22:28:45.822037 4739 scope.go:117] "RemoveContainer" containerID="f5309d0370559d2fbd4ae1669f92b84d80aaef9eaebaff357d1aa14a7ff31a5d" Oct 08 22:28:45 crc kubenswrapper[4739]: E1008 22:28:45.823193 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:29:00 crc kubenswrapper[4739]: I1008 22:29:00.821491 4739 scope.go:117] "RemoveContainer" containerID="f5309d0370559d2fbd4ae1669f92b84d80aaef9eaebaff357d1aa14a7ff31a5d" Oct 08 22:29:00 crc kubenswrapper[4739]: E1008 22:29:00.822106 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:29:11 crc kubenswrapper[4739]: I1008 22:29:11.828487 4739 scope.go:117] "RemoveContainer" containerID="f5309d0370559d2fbd4ae1669f92b84d80aaef9eaebaff357d1aa14a7ff31a5d" Oct 08 22:29:11 crc kubenswrapper[4739]: E1008 22:29:11.829343 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:29:23 crc kubenswrapper[4739]: I1008 22:29:23.822131 4739 scope.go:117] "RemoveContainer" containerID="f5309d0370559d2fbd4ae1669f92b84d80aaef9eaebaff357d1aa14a7ff31a5d" Oct 08 22:29:23 crc kubenswrapper[4739]: E1008 22:29:23.823280 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:29:38 crc kubenswrapper[4739]: I1008 22:29:38.823035 4739 scope.go:117] "RemoveContainer" containerID="f5309d0370559d2fbd4ae1669f92b84d80aaef9eaebaff357d1aa14a7ff31a5d" Oct 08 22:29:38 crc kubenswrapper[4739]: E1008 22:29:38.824390 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:29:51 crc kubenswrapper[4739]: I1008 22:29:51.835364 4739 scope.go:117] "RemoveContainer" containerID="f5309d0370559d2fbd4ae1669f92b84d80aaef9eaebaff357d1aa14a7ff31a5d" Oct 08 22:29:51 crc kubenswrapper[4739]: E1008 22:29:51.836834 4739 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:30:00 crc kubenswrapper[4739]: I1008 22:30:00.192780 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332710-f7mpd"] Oct 08 22:30:00 crc kubenswrapper[4739]: I1008 22:30:00.195394 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-f7mpd" Oct 08 22:30:00 crc kubenswrapper[4739]: I1008 22:30:00.202179 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 08 22:30:00 crc kubenswrapper[4739]: I1008 22:30:00.202438 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 08 22:30:00 crc kubenswrapper[4739]: I1008 22:30:00.226707 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332710-f7mpd"] Oct 08 22:30:00 crc kubenswrapper[4739]: I1008 22:30:00.250814 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7b757ad-a730-483d-96aa-3e8b35a4a5b0-secret-volume\") pod \"collect-profiles-29332710-f7mpd\" (UID: \"c7b757ad-a730-483d-96aa-3e8b35a4a5b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-f7mpd" Oct 08 22:30:00 crc kubenswrapper[4739]: I1008 22:30:00.251001 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xj6dn\" (UniqueName: \"kubernetes.io/projected/c7b757ad-a730-483d-96aa-3e8b35a4a5b0-kube-api-access-xj6dn\") pod \"collect-profiles-29332710-f7mpd\" (UID: \"c7b757ad-a730-483d-96aa-3e8b35a4a5b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-f7mpd" Oct 08 22:30:00 crc kubenswrapper[4739]: I1008 22:30:00.251103 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7b757ad-a730-483d-96aa-3e8b35a4a5b0-config-volume\") pod \"collect-profiles-29332710-f7mpd\" (UID: \"c7b757ad-a730-483d-96aa-3e8b35a4a5b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-f7mpd" Oct 08 22:30:00 crc kubenswrapper[4739]: I1008 22:30:00.352869 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj6dn\" (UniqueName: \"kubernetes.io/projected/c7b757ad-a730-483d-96aa-3e8b35a4a5b0-kube-api-access-xj6dn\") pod \"collect-profiles-29332710-f7mpd\" (UID: \"c7b757ad-a730-483d-96aa-3e8b35a4a5b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-f7mpd" Oct 08 22:30:00 crc kubenswrapper[4739]: I1008 22:30:00.352954 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7b757ad-a730-483d-96aa-3e8b35a4a5b0-config-volume\") pod \"collect-profiles-29332710-f7mpd\" (UID: \"c7b757ad-a730-483d-96aa-3e8b35a4a5b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-f7mpd" Oct 08 22:30:00 crc kubenswrapper[4739]: I1008 22:30:00.353116 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7b757ad-a730-483d-96aa-3e8b35a4a5b0-secret-volume\") pod \"collect-profiles-29332710-f7mpd\" (UID: \"c7b757ad-a730-483d-96aa-3e8b35a4a5b0\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-f7mpd" Oct 08 22:30:00 crc kubenswrapper[4739]: I1008 22:30:00.355058 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7b757ad-a730-483d-96aa-3e8b35a4a5b0-config-volume\") pod \"collect-profiles-29332710-f7mpd\" (UID: \"c7b757ad-a730-483d-96aa-3e8b35a4a5b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-f7mpd" Oct 08 22:30:00 crc kubenswrapper[4739]: I1008 22:30:00.361570 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7b757ad-a730-483d-96aa-3e8b35a4a5b0-secret-volume\") pod \"collect-profiles-29332710-f7mpd\" (UID: \"c7b757ad-a730-483d-96aa-3e8b35a4a5b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-f7mpd" Oct 08 22:30:00 crc kubenswrapper[4739]: I1008 22:30:00.374051 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj6dn\" (UniqueName: \"kubernetes.io/projected/c7b757ad-a730-483d-96aa-3e8b35a4a5b0-kube-api-access-xj6dn\") pod \"collect-profiles-29332710-f7mpd\" (UID: \"c7b757ad-a730-483d-96aa-3e8b35a4a5b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-f7mpd" Oct 08 22:30:00 crc kubenswrapper[4739]: I1008 22:30:00.529481 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-f7mpd" Oct 08 22:30:01 crc kubenswrapper[4739]: I1008 22:30:01.045035 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332710-f7mpd"] Oct 08 22:30:01 crc kubenswrapper[4739]: I1008 22:30:01.811374 4739 generic.go:334] "Generic (PLEG): container finished" podID="c7b757ad-a730-483d-96aa-3e8b35a4a5b0" containerID="8bd57a31ea83e1bbbbddc4b9e6b12f5141047761d4d6749a97db0692715d0b14" exitCode=0 Oct 08 22:30:01 crc kubenswrapper[4739]: I1008 22:30:01.811581 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-f7mpd" event={"ID":"c7b757ad-a730-483d-96aa-3e8b35a4a5b0","Type":"ContainerDied","Data":"8bd57a31ea83e1bbbbddc4b9e6b12f5141047761d4d6749a97db0692715d0b14"} Oct 08 22:30:01 crc kubenswrapper[4739]: I1008 22:30:01.811822 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-f7mpd" event={"ID":"c7b757ad-a730-483d-96aa-3e8b35a4a5b0","Type":"ContainerStarted","Data":"f3afbe7229eac3c84dc56b1113b90a37e25982e9c528d5c9a26e6459f2bb4bb4"} Oct 08 22:30:03 crc kubenswrapper[4739]: I1008 22:30:03.149124 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-f7mpd" Oct 08 22:30:03 crc kubenswrapper[4739]: I1008 22:30:03.207038 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj6dn\" (UniqueName: \"kubernetes.io/projected/c7b757ad-a730-483d-96aa-3e8b35a4a5b0-kube-api-access-xj6dn\") pod \"c7b757ad-a730-483d-96aa-3e8b35a4a5b0\" (UID: \"c7b757ad-a730-483d-96aa-3e8b35a4a5b0\") " Oct 08 22:30:03 crc kubenswrapper[4739]: I1008 22:30:03.207280 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7b757ad-a730-483d-96aa-3e8b35a4a5b0-config-volume\") pod \"c7b757ad-a730-483d-96aa-3e8b35a4a5b0\" (UID: \"c7b757ad-a730-483d-96aa-3e8b35a4a5b0\") " Oct 08 22:30:03 crc kubenswrapper[4739]: I1008 22:30:03.207333 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7b757ad-a730-483d-96aa-3e8b35a4a5b0-secret-volume\") pod \"c7b757ad-a730-483d-96aa-3e8b35a4a5b0\" (UID: \"c7b757ad-a730-483d-96aa-3e8b35a4a5b0\") " Oct 08 22:30:03 crc kubenswrapper[4739]: I1008 22:30:03.220921 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7b757ad-a730-483d-96aa-3e8b35a4a5b0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c7b757ad-a730-483d-96aa-3e8b35a4a5b0" (UID: "c7b757ad-a730-483d-96aa-3e8b35a4a5b0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:30:03 crc kubenswrapper[4739]: I1008 22:30:03.227473 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7b757ad-a730-483d-96aa-3e8b35a4a5b0-kube-api-access-xj6dn" (OuterVolumeSpecName: "kube-api-access-xj6dn") pod "c7b757ad-a730-483d-96aa-3e8b35a4a5b0" (UID: "c7b757ad-a730-483d-96aa-3e8b35a4a5b0"). 
InnerVolumeSpecName "kube-api-access-xj6dn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:30:03 crc kubenswrapper[4739]: I1008 22:30:03.228138 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7b757ad-a730-483d-96aa-3e8b35a4a5b0-config-volume" (OuterVolumeSpecName: "config-volume") pod "c7b757ad-a730-483d-96aa-3e8b35a4a5b0" (UID: "c7b757ad-a730-483d-96aa-3e8b35a4a5b0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:30:03 crc kubenswrapper[4739]: I1008 22:30:03.309797 4739 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7b757ad-a730-483d-96aa-3e8b35a4a5b0-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 22:30:03 crc kubenswrapper[4739]: I1008 22:30:03.309847 4739 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7b757ad-a730-483d-96aa-3e8b35a4a5b0-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 22:30:03 crc kubenswrapper[4739]: I1008 22:30:03.309862 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj6dn\" (UniqueName: \"kubernetes.io/projected/c7b757ad-a730-483d-96aa-3e8b35a4a5b0-kube-api-access-xj6dn\") on node \"crc\" DevicePath \"\"" Oct 08 22:30:03 crc kubenswrapper[4739]: I1008 22:30:03.822874 4739 scope.go:117] "RemoveContainer" containerID="f5309d0370559d2fbd4ae1669f92b84d80aaef9eaebaff357d1aa14a7ff31a5d" Oct 08 22:30:03 crc kubenswrapper[4739]: E1008 22:30:03.823080 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" 
podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:30:03 crc kubenswrapper[4739]: I1008 22:30:03.832307 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-f7mpd" Oct 08 22:30:03 crc kubenswrapper[4739]: I1008 22:30:03.834751 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-f7mpd" event={"ID":"c7b757ad-a730-483d-96aa-3e8b35a4a5b0","Type":"ContainerDied","Data":"f3afbe7229eac3c84dc56b1113b90a37e25982e9c528d5c9a26e6459f2bb4bb4"} Oct 08 22:30:03 crc kubenswrapper[4739]: I1008 22:30:03.834787 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3afbe7229eac3c84dc56b1113b90a37e25982e9c528d5c9a26e6459f2bb4bb4" Oct 08 22:30:04 crc kubenswrapper[4739]: I1008 22:30:04.221592 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332665-fv8l4"] Oct 08 22:30:04 crc kubenswrapper[4739]: I1008 22:30:04.229350 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332665-fv8l4"] Oct 08 22:30:05 crc kubenswrapper[4739]: I1008 22:30:05.839279 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbfbbfca-3095-4a7b-869e-70b1a86046c4" path="/var/lib/kubelet/pods/dbfbbfca-3095-4a7b-869e-70b1a86046c4/volumes" Oct 08 22:30:15 crc kubenswrapper[4739]: I1008 22:30:15.822861 4739 scope.go:117] "RemoveContainer" containerID="f5309d0370559d2fbd4ae1669f92b84d80aaef9eaebaff357d1aa14a7ff31a5d" Oct 08 22:30:15 crc kubenswrapper[4739]: E1008 22:30:15.824426 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:30:26 crc kubenswrapper[4739]: I1008 22:30:26.823276 4739 scope.go:117] "RemoveContainer" containerID="f5309d0370559d2fbd4ae1669f92b84d80aaef9eaebaff357d1aa14a7ff31a5d" Oct 08 22:30:26 crc kubenswrapper[4739]: E1008 22:30:26.824186 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:30:32 crc kubenswrapper[4739]: I1008 22:30:32.789579 4739 scope.go:117] "RemoveContainer" containerID="6f67e28d93ba7dcaa2e33ca721836c5d3d40506f1f2686e98b7538e9908c1374" Oct 08 22:30:37 crc kubenswrapper[4739]: I1008 22:30:37.822534 4739 scope.go:117] "RemoveContainer" containerID="f5309d0370559d2fbd4ae1669f92b84d80aaef9eaebaff357d1aa14a7ff31a5d" Oct 08 22:30:37 crc kubenswrapper[4739]: E1008 22:30:37.823584 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:30:50 crc kubenswrapper[4739]: I1008 22:30:50.822929 4739 scope.go:117] "RemoveContainer" containerID="f5309d0370559d2fbd4ae1669f92b84d80aaef9eaebaff357d1aa14a7ff31a5d" Oct 08 22:30:50 crc kubenswrapper[4739]: E1008 22:30:50.824057 4739 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:31:01 crc kubenswrapper[4739]: I1008 22:31:01.828566 4739 scope.go:117] "RemoveContainer" containerID="f5309d0370559d2fbd4ae1669f92b84d80aaef9eaebaff357d1aa14a7ff31a5d" Oct 08 22:31:01 crc kubenswrapper[4739]: E1008 22:31:01.829351 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:31:13 crc kubenswrapper[4739]: I1008 22:31:13.823329 4739 scope.go:117] "RemoveContainer" containerID="f5309d0370559d2fbd4ae1669f92b84d80aaef9eaebaff357d1aa14a7ff31a5d" Oct 08 22:31:13 crc kubenswrapper[4739]: E1008 22:31:13.825294 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:31:25 crc kubenswrapper[4739]: I1008 22:31:25.821991 4739 scope.go:117] "RemoveContainer" containerID="f5309d0370559d2fbd4ae1669f92b84d80aaef9eaebaff357d1aa14a7ff31a5d" Oct 08 22:31:25 crc kubenswrapper[4739]: E1008 
22:31:25.822711 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:31:37 crc kubenswrapper[4739]: I1008 22:31:37.822051 4739 scope.go:117] "RemoveContainer" containerID="f5309d0370559d2fbd4ae1669f92b84d80aaef9eaebaff357d1aa14a7ff31a5d" Oct 08 22:31:37 crc kubenswrapper[4739]: E1008 22:31:37.823425 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:31:49 crc kubenswrapper[4739]: I1008 22:31:49.821929 4739 scope.go:117] "RemoveContainer" containerID="f5309d0370559d2fbd4ae1669f92b84d80aaef9eaebaff357d1aa14a7ff31a5d" Oct 08 22:31:49 crc kubenswrapper[4739]: E1008 22:31:49.822859 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:32:03 crc kubenswrapper[4739]: I1008 22:32:03.823048 4739 scope.go:117] "RemoveContainer" containerID="f5309d0370559d2fbd4ae1669f92b84d80aaef9eaebaff357d1aa14a7ff31a5d" Oct 08 22:32:03 crc 
kubenswrapper[4739]: E1008 22:32:03.824542 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:32:05 crc kubenswrapper[4739]: I1008 22:32:05.147851 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2qd2w"] Oct 08 22:32:05 crc kubenswrapper[4739]: E1008 22:32:05.148382 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7b757ad-a730-483d-96aa-3e8b35a4a5b0" containerName="collect-profiles" Oct 08 22:32:05 crc kubenswrapper[4739]: I1008 22:32:05.148398 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7b757ad-a730-483d-96aa-3e8b35a4a5b0" containerName="collect-profiles" Oct 08 22:32:05 crc kubenswrapper[4739]: I1008 22:32:05.148723 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7b757ad-a730-483d-96aa-3e8b35a4a5b0" containerName="collect-profiles" Oct 08 22:32:05 crc kubenswrapper[4739]: I1008 22:32:05.150833 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2qd2w" Oct 08 22:32:05 crc kubenswrapper[4739]: I1008 22:32:05.157972 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2qd2w"] Oct 08 22:32:05 crc kubenswrapper[4739]: I1008 22:32:05.294355 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcf9428e-7974-49ea-98c2-ce71f3976eab-catalog-content\") pod \"redhat-operators-2qd2w\" (UID: \"bcf9428e-7974-49ea-98c2-ce71f3976eab\") " pod="openshift-marketplace/redhat-operators-2qd2w" Oct 08 22:32:05 crc kubenswrapper[4739]: I1008 22:32:05.295395 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm85g\" (UniqueName: \"kubernetes.io/projected/bcf9428e-7974-49ea-98c2-ce71f3976eab-kube-api-access-cm85g\") pod \"redhat-operators-2qd2w\" (UID: \"bcf9428e-7974-49ea-98c2-ce71f3976eab\") " pod="openshift-marketplace/redhat-operators-2qd2w" Oct 08 22:32:05 crc kubenswrapper[4739]: I1008 22:32:05.295497 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcf9428e-7974-49ea-98c2-ce71f3976eab-utilities\") pod \"redhat-operators-2qd2w\" (UID: \"bcf9428e-7974-49ea-98c2-ce71f3976eab\") " pod="openshift-marketplace/redhat-operators-2qd2w" Oct 08 22:32:05 crc kubenswrapper[4739]: I1008 22:32:05.397401 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm85g\" (UniqueName: \"kubernetes.io/projected/bcf9428e-7974-49ea-98c2-ce71f3976eab-kube-api-access-cm85g\") pod \"redhat-operators-2qd2w\" (UID: \"bcf9428e-7974-49ea-98c2-ce71f3976eab\") " pod="openshift-marketplace/redhat-operators-2qd2w" Oct 08 22:32:05 crc kubenswrapper[4739]: I1008 22:32:05.397444 4739 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcf9428e-7974-49ea-98c2-ce71f3976eab-utilities\") pod \"redhat-operators-2qd2w\" (UID: \"bcf9428e-7974-49ea-98c2-ce71f3976eab\") " pod="openshift-marketplace/redhat-operators-2qd2w" Oct 08 22:32:05 crc kubenswrapper[4739]: I1008 22:32:05.397544 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcf9428e-7974-49ea-98c2-ce71f3976eab-catalog-content\") pod \"redhat-operators-2qd2w\" (UID: \"bcf9428e-7974-49ea-98c2-ce71f3976eab\") " pod="openshift-marketplace/redhat-operators-2qd2w" Oct 08 22:32:05 crc kubenswrapper[4739]: I1008 22:32:05.397939 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcf9428e-7974-49ea-98c2-ce71f3976eab-catalog-content\") pod \"redhat-operators-2qd2w\" (UID: \"bcf9428e-7974-49ea-98c2-ce71f3976eab\") " pod="openshift-marketplace/redhat-operators-2qd2w" Oct 08 22:32:05 crc kubenswrapper[4739]: I1008 22:32:05.398041 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcf9428e-7974-49ea-98c2-ce71f3976eab-utilities\") pod \"redhat-operators-2qd2w\" (UID: \"bcf9428e-7974-49ea-98c2-ce71f3976eab\") " pod="openshift-marketplace/redhat-operators-2qd2w" Oct 08 22:32:05 crc kubenswrapper[4739]: I1008 22:32:05.419114 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm85g\" (UniqueName: \"kubernetes.io/projected/bcf9428e-7974-49ea-98c2-ce71f3976eab-kube-api-access-cm85g\") pod \"redhat-operators-2qd2w\" (UID: \"bcf9428e-7974-49ea-98c2-ce71f3976eab\") " pod="openshift-marketplace/redhat-operators-2qd2w" Oct 08 22:32:05 crc kubenswrapper[4739]: I1008 22:32:05.518180 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2qd2w" Oct 08 22:32:05 crc kubenswrapper[4739]: I1008 22:32:05.982756 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2qd2w"] Oct 08 22:32:06 crc kubenswrapper[4739]: I1008 22:32:06.216572 4739 generic.go:334] "Generic (PLEG): container finished" podID="bcf9428e-7974-49ea-98c2-ce71f3976eab" containerID="5b4d07979f991c37e808c8b6d1041a935d495927b0a42a7f4cb7ed0ea6c8ef99" exitCode=0 Oct 08 22:32:06 crc kubenswrapper[4739]: I1008 22:32:06.216614 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qd2w" event={"ID":"bcf9428e-7974-49ea-98c2-ce71f3976eab","Type":"ContainerDied","Data":"5b4d07979f991c37e808c8b6d1041a935d495927b0a42a7f4cb7ed0ea6c8ef99"} Oct 08 22:32:06 crc kubenswrapper[4739]: I1008 22:32:06.216641 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qd2w" event={"ID":"bcf9428e-7974-49ea-98c2-ce71f3976eab","Type":"ContainerStarted","Data":"2da3b61eaca02875ec7f7e7db6e4ef079f9fefb893f2082a8b23d81769099bd2"} Oct 08 22:32:06 crc kubenswrapper[4739]: I1008 22:32:06.218799 4739 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 22:32:07 crc kubenswrapper[4739]: I1008 22:32:07.228129 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qd2w" event={"ID":"bcf9428e-7974-49ea-98c2-ce71f3976eab","Type":"ContainerStarted","Data":"3c7d152b1057dee61181d1afff3590cdf78a83d02d6998726d5a1317c58f1bdc"} Oct 08 22:32:08 crc kubenswrapper[4739]: I1008 22:32:08.238068 4739 generic.go:334] "Generic (PLEG): container finished" podID="bcf9428e-7974-49ea-98c2-ce71f3976eab" containerID="3c7d152b1057dee61181d1afff3590cdf78a83d02d6998726d5a1317c58f1bdc" exitCode=0 Oct 08 22:32:08 crc kubenswrapper[4739]: I1008 22:32:08.238137 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-2qd2w" event={"ID":"bcf9428e-7974-49ea-98c2-ce71f3976eab","Type":"ContainerDied","Data":"3c7d152b1057dee61181d1afff3590cdf78a83d02d6998726d5a1317c58f1bdc"} Oct 08 22:32:09 crc kubenswrapper[4739]: I1008 22:32:09.248363 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qd2w" event={"ID":"bcf9428e-7974-49ea-98c2-ce71f3976eab","Type":"ContainerStarted","Data":"fc20fabe9f232c31ac2ac823b931c5bcbb168dd4f17ff1bcef053309d1601fd5"} Oct 08 22:32:09 crc kubenswrapper[4739]: I1008 22:32:09.280375 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2qd2w" podStartSLOduration=1.7981666710000002 podStartE2EDuration="4.280346152s" podCreationTimestamp="2025-10-08 22:32:05 +0000 UTC" firstStartedPulling="2025-10-08 22:32:06.218592638 +0000 UTC m=+2626.043978388" lastFinishedPulling="2025-10-08 22:32:08.700772119 +0000 UTC m=+2628.526157869" observedRunningTime="2025-10-08 22:32:09.269102746 +0000 UTC m=+2629.094488516" watchObservedRunningTime="2025-10-08 22:32:09.280346152 +0000 UTC m=+2629.105731902" Oct 08 22:32:15 crc kubenswrapper[4739]: I1008 22:32:15.519309 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2qd2w" Oct 08 22:32:15 crc kubenswrapper[4739]: I1008 22:32:15.519923 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2qd2w" Oct 08 22:32:15 crc kubenswrapper[4739]: I1008 22:32:15.570447 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2qd2w" Oct 08 22:32:16 crc kubenswrapper[4739]: I1008 22:32:16.360120 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2qd2w" Oct 08 22:32:16 crc kubenswrapper[4739]: I1008 22:32:16.412948 4739 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2qd2w"] Oct 08 22:32:17 crc kubenswrapper[4739]: I1008 22:32:17.822677 4739 scope.go:117] "RemoveContainer" containerID="f5309d0370559d2fbd4ae1669f92b84d80aaef9eaebaff357d1aa14a7ff31a5d" Oct 08 22:32:17 crc kubenswrapper[4739]: E1008 22:32:17.823587 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:32:18 crc kubenswrapper[4739]: I1008 22:32:18.332751 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2qd2w" podUID="bcf9428e-7974-49ea-98c2-ce71f3976eab" containerName="registry-server" containerID="cri-o://fc20fabe9f232c31ac2ac823b931c5bcbb168dd4f17ff1bcef053309d1601fd5" gracePeriod=2 Oct 08 22:32:18 crc kubenswrapper[4739]: I1008 22:32:18.868422 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2qd2w" Oct 08 22:32:18 crc kubenswrapper[4739]: I1008 22:32:18.986813 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm85g\" (UniqueName: \"kubernetes.io/projected/bcf9428e-7974-49ea-98c2-ce71f3976eab-kube-api-access-cm85g\") pod \"bcf9428e-7974-49ea-98c2-ce71f3976eab\" (UID: \"bcf9428e-7974-49ea-98c2-ce71f3976eab\") " Oct 08 22:32:18 crc kubenswrapper[4739]: I1008 22:32:18.987017 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcf9428e-7974-49ea-98c2-ce71f3976eab-utilities\") pod \"bcf9428e-7974-49ea-98c2-ce71f3976eab\" (UID: \"bcf9428e-7974-49ea-98c2-ce71f3976eab\") " Oct 08 22:32:18 crc kubenswrapper[4739]: I1008 22:32:18.987118 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcf9428e-7974-49ea-98c2-ce71f3976eab-catalog-content\") pod \"bcf9428e-7974-49ea-98c2-ce71f3976eab\" (UID: \"bcf9428e-7974-49ea-98c2-ce71f3976eab\") " Oct 08 22:32:18 crc kubenswrapper[4739]: I1008 22:32:18.988368 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcf9428e-7974-49ea-98c2-ce71f3976eab-utilities" (OuterVolumeSpecName: "utilities") pod "bcf9428e-7974-49ea-98c2-ce71f3976eab" (UID: "bcf9428e-7974-49ea-98c2-ce71f3976eab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:32:18 crc kubenswrapper[4739]: I1008 22:32:18.996061 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcf9428e-7974-49ea-98c2-ce71f3976eab-kube-api-access-cm85g" (OuterVolumeSpecName: "kube-api-access-cm85g") pod "bcf9428e-7974-49ea-98c2-ce71f3976eab" (UID: "bcf9428e-7974-49ea-98c2-ce71f3976eab"). InnerVolumeSpecName "kube-api-access-cm85g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:32:19 crc kubenswrapper[4739]: I1008 22:32:19.089003 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcf9428e-7974-49ea-98c2-ce71f3976eab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bcf9428e-7974-49ea-98c2-ce71f3976eab" (UID: "bcf9428e-7974-49ea-98c2-ce71f3976eab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:32:19 crc kubenswrapper[4739]: I1008 22:32:19.090501 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcf9428e-7974-49ea-98c2-ce71f3976eab-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:32:19 crc kubenswrapper[4739]: I1008 22:32:19.090566 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm85g\" (UniqueName: \"kubernetes.io/projected/bcf9428e-7974-49ea-98c2-ce71f3976eab-kube-api-access-cm85g\") on node \"crc\" DevicePath \"\"" Oct 08 22:32:19 crc kubenswrapper[4739]: I1008 22:32:19.090591 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcf9428e-7974-49ea-98c2-ce71f3976eab-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:32:19 crc kubenswrapper[4739]: I1008 22:32:19.344620 4739 generic.go:334] "Generic (PLEG): container finished" podID="bcf9428e-7974-49ea-98c2-ce71f3976eab" containerID="fc20fabe9f232c31ac2ac823b931c5bcbb168dd4f17ff1bcef053309d1601fd5" exitCode=0 Oct 08 22:32:19 crc kubenswrapper[4739]: I1008 22:32:19.344706 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2qd2w" Oct 08 22:32:19 crc kubenswrapper[4739]: I1008 22:32:19.344746 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qd2w" event={"ID":"bcf9428e-7974-49ea-98c2-ce71f3976eab","Type":"ContainerDied","Data":"fc20fabe9f232c31ac2ac823b931c5bcbb168dd4f17ff1bcef053309d1601fd5"} Oct 08 22:32:19 crc kubenswrapper[4739]: I1008 22:32:19.344793 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qd2w" event={"ID":"bcf9428e-7974-49ea-98c2-ce71f3976eab","Type":"ContainerDied","Data":"2da3b61eaca02875ec7f7e7db6e4ef079f9fefb893f2082a8b23d81769099bd2"} Oct 08 22:32:19 crc kubenswrapper[4739]: I1008 22:32:19.344844 4739 scope.go:117] "RemoveContainer" containerID="fc20fabe9f232c31ac2ac823b931c5bcbb168dd4f17ff1bcef053309d1601fd5" Oct 08 22:32:19 crc kubenswrapper[4739]: I1008 22:32:19.388856 4739 scope.go:117] "RemoveContainer" containerID="3c7d152b1057dee61181d1afff3590cdf78a83d02d6998726d5a1317c58f1bdc" Oct 08 22:32:19 crc kubenswrapper[4739]: I1008 22:32:19.399223 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2qd2w"] Oct 08 22:32:19 crc kubenswrapper[4739]: I1008 22:32:19.410268 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2qd2w"] Oct 08 22:32:19 crc kubenswrapper[4739]: I1008 22:32:19.425345 4739 scope.go:117] "RemoveContainer" containerID="5b4d07979f991c37e808c8b6d1041a935d495927b0a42a7f4cb7ed0ea6c8ef99" Oct 08 22:32:19 crc kubenswrapper[4739]: I1008 22:32:19.476910 4739 scope.go:117] "RemoveContainer" containerID="fc20fabe9f232c31ac2ac823b931c5bcbb168dd4f17ff1bcef053309d1601fd5" Oct 08 22:32:19 crc kubenswrapper[4739]: E1008 22:32:19.477692 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"fc20fabe9f232c31ac2ac823b931c5bcbb168dd4f17ff1bcef053309d1601fd5\": container with ID starting with fc20fabe9f232c31ac2ac823b931c5bcbb168dd4f17ff1bcef053309d1601fd5 not found: ID does not exist" containerID="fc20fabe9f232c31ac2ac823b931c5bcbb168dd4f17ff1bcef053309d1601fd5" Oct 08 22:32:19 crc kubenswrapper[4739]: I1008 22:32:19.477773 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc20fabe9f232c31ac2ac823b931c5bcbb168dd4f17ff1bcef053309d1601fd5"} err="failed to get container status \"fc20fabe9f232c31ac2ac823b931c5bcbb168dd4f17ff1bcef053309d1601fd5\": rpc error: code = NotFound desc = could not find container \"fc20fabe9f232c31ac2ac823b931c5bcbb168dd4f17ff1bcef053309d1601fd5\": container with ID starting with fc20fabe9f232c31ac2ac823b931c5bcbb168dd4f17ff1bcef053309d1601fd5 not found: ID does not exist" Oct 08 22:32:19 crc kubenswrapper[4739]: I1008 22:32:19.477819 4739 scope.go:117] "RemoveContainer" containerID="3c7d152b1057dee61181d1afff3590cdf78a83d02d6998726d5a1317c58f1bdc" Oct 08 22:32:19 crc kubenswrapper[4739]: E1008 22:32:19.478532 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c7d152b1057dee61181d1afff3590cdf78a83d02d6998726d5a1317c58f1bdc\": container with ID starting with 3c7d152b1057dee61181d1afff3590cdf78a83d02d6998726d5a1317c58f1bdc not found: ID does not exist" containerID="3c7d152b1057dee61181d1afff3590cdf78a83d02d6998726d5a1317c58f1bdc" Oct 08 22:32:19 crc kubenswrapper[4739]: I1008 22:32:19.478607 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c7d152b1057dee61181d1afff3590cdf78a83d02d6998726d5a1317c58f1bdc"} err="failed to get container status \"3c7d152b1057dee61181d1afff3590cdf78a83d02d6998726d5a1317c58f1bdc\": rpc error: code = NotFound desc = could not find container \"3c7d152b1057dee61181d1afff3590cdf78a83d02d6998726d5a1317c58f1bdc\": container with ID 
starting with 3c7d152b1057dee61181d1afff3590cdf78a83d02d6998726d5a1317c58f1bdc not found: ID does not exist" Oct 08 22:32:19 crc kubenswrapper[4739]: I1008 22:32:19.478646 4739 scope.go:117] "RemoveContainer" containerID="5b4d07979f991c37e808c8b6d1041a935d495927b0a42a7f4cb7ed0ea6c8ef99" Oct 08 22:32:19 crc kubenswrapper[4739]: E1008 22:32:19.478988 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b4d07979f991c37e808c8b6d1041a935d495927b0a42a7f4cb7ed0ea6c8ef99\": container with ID starting with 5b4d07979f991c37e808c8b6d1041a935d495927b0a42a7f4cb7ed0ea6c8ef99 not found: ID does not exist" containerID="5b4d07979f991c37e808c8b6d1041a935d495927b0a42a7f4cb7ed0ea6c8ef99" Oct 08 22:32:19 crc kubenswrapper[4739]: I1008 22:32:19.479023 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b4d07979f991c37e808c8b6d1041a935d495927b0a42a7f4cb7ed0ea6c8ef99"} err="failed to get container status \"5b4d07979f991c37e808c8b6d1041a935d495927b0a42a7f4cb7ed0ea6c8ef99\": rpc error: code = NotFound desc = could not find container \"5b4d07979f991c37e808c8b6d1041a935d495927b0a42a7f4cb7ed0ea6c8ef99\": container with ID starting with 5b4d07979f991c37e808c8b6d1041a935d495927b0a42a7f4cb7ed0ea6c8ef99 not found: ID does not exist" Oct 08 22:32:19 crc kubenswrapper[4739]: I1008 22:32:19.840710 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcf9428e-7974-49ea-98c2-ce71f3976eab" path="/var/lib/kubelet/pods/bcf9428e-7974-49ea-98c2-ce71f3976eab/volumes" Oct 08 22:32:32 crc kubenswrapper[4739]: I1008 22:32:32.822490 4739 scope.go:117] "RemoveContainer" containerID="f5309d0370559d2fbd4ae1669f92b84d80aaef9eaebaff357d1aa14a7ff31a5d" Oct 08 22:32:32 crc kubenswrapper[4739]: E1008 22:32:32.824545 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:32:47 crc kubenswrapper[4739]: I1008 22:32:47.822628 4739 scope.go:117] "RemoveContainer" containerID="f5309d0370559d2fbd4ae1669f92b84d80aaef9eaebaff357d1aa14a7ff31a5d" Oct 08 22:32:47 crc kubenswrapper[4739]: E1008 22:32:47.823493 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:32:53 crc kubenswrapper[4739]: I1008 22:32:53.718888 4739 generic.go:334] "Generic (PLEG): container finished" podID="076ace7f-41ce-4825-9d2f-e49471648888" containerID="278ce52eb4f2228e51f9eb8c9935abe0437eb0668b2e5ded50281e1e892ce700" exitCode=0 Oct 08 22:32:53 crc kubenswrapper[4739]: I1008 22:32:53.718970 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wthng" event={"ID":"076ace7f-41ce-4825-9d2f-e49471648888","Type":"ContainerDied","Data":"278ce52eb4f2228e51f9eb8c9935abe0437eb0668b2e5ded50281e1e892ce700"} Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.249945 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wthng" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.365272 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/076ace7f-41ce-4825-9d2f-e49471648888-ssh-key\") pod \"076ace7f-41ce-4825-9d2f-e49471648888\" (UID: \"076ace7f-41ce-4825-9d2f-e49471648888\") " Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.365352 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npr8s\" (UniqueName: \"kubernetes.io/projected/076ace7f-41ce-4825-9d2f-e49471648888-kube-api-access-npr8s\") pod \"076ace7f-41ce-4825-9d2f-e49471648888\" (UID: \"076ace7f-41ce-4825-9d2f-e49471648888\") " Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.365551 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/076ace7f-41ce-4825-9d2f-e49471648888-libvirt-secret-0\") pod \"076ace7f-41ce-4825-9d2f-e49471648888\" (UID: \"076ace7f-41ce-4825-9d2f-e49471648888\") " Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.365634 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076ace7f-41ce-4825-9d2f-e49471648888-libvirt-combined-ca-bundle\") pod \"076ace7f-41ce-4825-9d2f-e49471648888\" (UID: \"076ace7f-41ce-4825-9d2f-e49471648888\") " Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.365703 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/076ace7f-41ce-4825-9d2f-e49471648888-inventory\") pod \"076ace7f-41ce-4825-9d2f-e49471648888\" (UID: \"076ace7f-41ce-4825-9d2f-e49471648888\") " Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.371613 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/076ace7f-41ce-4825-9d2f-e49471648888-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "076ace7f-41ce-4825-9d2f-e49471648888" (UID: "076ace7f-41ce-4825-9d2f-e49471648888"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.372194 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/076ace7f-41ce-4825-9d2f-e49471648888-kube-api-access-npr8s" (OuterVolumeSpecName: "kube-api-access-npr8s") pod "076ace7f-41ce-4825-9d2f-e49471648888" (UID: "076ace7f-41ce-4825-9d2f-e49471648888"). InnerVolumeSpecName "kube-api-access-npr8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.393494 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/076ace7f-41ce-4825-9d2f-e49471648888-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "076ace7f-41ce-4825-9d2f-e49471648888" (UID: "076ace7f-41ce-4825-9d2f-e49471648888"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.398342 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/076ace7f-41ce-4825-9d2f-e49471648888-inventory" (OuterVolumeSpecName: "inventory") pod "076ace7f-41ce-4825-9d2f-e49471648888" (UID: "076ace7f-41ce-4825-9d2f-e49471648888"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.399112 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/076ace7f-41ce-4825-9d2f-e49471648888-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "076ace7f-41ce-4825-9d2f-e49471648888" (UID: "076ace7f-41ce-4825-9d2f-e49471648888"). 
InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.468355 4739 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/076ace7f-41ce-4825-9d2f-e49471648888-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.468680 4739 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076ace7f-41ce-4825-9d2f-e49471648888-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.468698 4739 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/076ace7f-41ce-4825-9d2f-e49471648888-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.468709 4739 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/076ace7f-41ce-4825-9d2f-e49471648888-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.468720 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npr8s\" (UniqueName: \"kubernetes.io/projected/076ace7f-41ce-4825-9d2f-e49471648888-kube-api-access-npr8s\") on node \"crc\" DevicePath \"\"" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.744677 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wthng" event={"ID":"076ace7f-41ce-4825-9d2f-e49471648888","Type":"ContainerDied","Data":"2ff4f77331bd0d5a9236176be2fdd0d4e5460b0318069f0c256db54b2a1f4fe9"} Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.744721 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ff4f77331bd0d5a9236176be2fdd0d4e5460b0318069f0c256db54b2a1f4fe9" Oct 08 
22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.744731 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wthng" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.836928 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg"] Oct 08 22:32:55 crc kubenswrapper[4739]: E1008 22:32:55.837253 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcf9428e-7974-49ea-98c2-ce71f3976eab" containerName="extract-utilities" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.837268 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcf9428e-7974-49ea-98c2-ce71f3976eab" containerName="extract-utilities" Oct 08 22:32:55 crc kubenswrapper[4739]: E1008 22:32:55.837301 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcf9428e-7974-49ea-98c2-ce71f3976eab" containerName="extract-content" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.837307 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcf9428e-7974-49ea-98c2-ce71f3976eab" containerName="extract-content" Oct 08 22:32:55 crc kubenswrapper[4739]: E1008 22:32:55.837319 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcf9428e-7974-49ea-98c2-ce71f3976eab" containerName="registry-server" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.837326 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcf9428e-7974-49ea-98c2-ce71f3976eab" containerName="registry-server" Oct 08 22:32:55 crc kubenswrapper[4739]: E1008 22:32:55.837343 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="076ace7f-41ce-4825-9d2f-e49471648888" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.837350 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="076ace7f-41ce-4825-9d2f-e49471648888" 
containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.837525 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcf9428e-7974-49ea-98c2-ce71f3976eab" containerName="registry-server" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.837553 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="076ace7f-41ce-4825-9d2f-e49471648888" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.839044 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg"] Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.839310 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.844013 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.844171 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.844739 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.844859 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl5kv" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.844989 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.845212 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.846429 4739 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.983369 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-s2cpg\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.983435 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-s2cpg\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.983553 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78n6g\" (UniqueName: \"kubernetes.io/projected/02cc9be0-080a-4ef8-a438-18607a5c7da4-kube-api-access-78n6g\") pod \"nova-edpm-deployment-openstack-edpm-ipam-s2cpg\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.983624 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-s2cpg\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.983708 4739 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-s2cpg\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.983760 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-s2cpg\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.983932 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-s2cpg\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.984006 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-s2cpg\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" Oct 08 22:32:55 crc kubenswrapper[4739]: I1008 22:32:55.984040 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-s2cpg\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" Oct 08 22:32:56 crc kubenswrapper[4739]: I1008 22:32:56.086177 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-s2cpg\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" Oct 08 22:32:56 crc kubenswrapper[4739]: I1008 22:32:56.086229 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-s2cpg\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" Oct 08 22:32:56 crc kubenswrapper[4739]: I1008 22:32:56.086250 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-s2cpg\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" Oct 08 22:32:56 crc kubenswrapper[4739]: I1008 22:32:56.086320 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-s2cpg\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" Oct 08 22:32:56 crc kubenswrapper[4739]: I1008 22:32:56.086348 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-s2cpg\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" Oct 08 22:32:56 crc kubenswrapper[4739]: I1008 22:32:56.086378 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78n6g\" (UniqueName: \"kubernetes.io/projected/02cc9be0-080a-4ef8-a438-18607a5c7da4-kube-api-access-78n6g\") pod \"nova-edpm-deployment-openstack-edpm-ipam-s2cpg\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" Oct 08 22:32:56 crc kubenswrapper[4739]: I1008 22:32:56.086424 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-s2cpg\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" Oct 08 22:32:56 crc kubenswrapper[4739]: I1008 22:32:56.086460 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-s2cpg\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" Oct 08 22:32:56 crc kubenswrapper[4739]: I1008 22:32:56.086499 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-s2cpg\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" Oct 08 22:32:56 crc kubenswrapper[4739]: I1008 22:32:56.087324 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-s2cpg\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" Oct 08 22:32:56 crc kubenswrapper[4739]: I1008 22:32:56.091583 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-s2cpg\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" Oct 08 22:32:56 crc kubenswrapper[4739]: I1008 22:32:56.091647 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-s2cpg\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" Oct 08 22:32:56 crc kubenswrapper[4739]: I1008 22:32:56.091707 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-s2cpg\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" Oct 
08 22:32:56 crc kubenswrapper[4739]: I1008 22:32:56.092180 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-s2cpg\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" Oct 08 22:32:56 crc kubenswrapper[4739]: I1008 22:32:56.092186 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-s2cpg\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" Oct 08 22:32:56 crc kubenswrapper[4739]: I1008 22:32:56.100590 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-s2cpg\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" Oct 08 22:32:56 crc kubenswrapper[4739]: I1008 22:32:56.102661 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-s2cpg\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" Oct 08 22:32:56 crc kubenswrapper[4739]: I1008 22:32:56.103499 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78n6g\" (UniqueName: \"kubernetes.io/projected/02cc9be0-080a-4ef8-a438-18607a5c7da4-kube-api-access-78n6g\") pod \"nova-edpm-deployment-openstack-edpm-ipam-s2cpg\" (UID: 
\"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" Oct 08 22:32:56 crc kubenswrapper[4739]: I1008 22:32:56.176182 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" Oct 08 22:32:56 crc kubenswrapper[4739]: I1008 22:32:56.683269 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg"] Oct 08 22:32:56 crc kubenswrapper[4739]: I1008 22:32:56.755925 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" event={"ID":"02cc9be0-080a-4ef8-a438-18607a5c7da4","Type":"ContainerStarted","Data":"0bbe6c4288c4294ced634cb1c8fae7acac366c5281435e634380d6feafff5f88"} Oct 08 22:32:57 crc kubenswrapper[4739]: I1008 22:32:57.767914 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" event={"ID":"02cc9be0-080a-4ef8-a438-18607a5c7da4","Type":"ContainerStarted","Data":"02736ab9c87f2348cafc664ebe866f8bcefa3fc3cab9c19e5b1d8b7af8ff765b"} Oct 08 22:33:00 crc kubenswrapper[4739]: I1008 22:33:00.821799 4739 scope.go:117] "RemoveContainer" containerID="f5309d0370559d2fbd4ae1669f92b84d80aaef9eaebaff357d1aa14a7ff31a5d" Oct 08 22:33:00 crc kubenswrapper[4739]: E1008 22:33:00.822658 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:33:12 crc kubenswrapper[4739]: I1008 22:33:12.822666 4739 scope.go:117] "RemoveContainer" 
containerID="f5309d0370559d2fbd4ae1669f92b84d80aaef9eaebaff357d1aa14a7ff31a5d" Oct 08 22:33:12 crc kubenswrapper[4739]: E1008 22:33:12.823573 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:33:24 crc kubenswrapper[4739]: I1008 22:33:24.821757 4739 scope.go:117] "RemoveContainer" containerID="f5309d0370559d2fbd4ae1669f92b84d80aaef9eaebaff357d1aa14a7ff31a5d" Oct 08 22:33:26 crc kubenswrapper[4739]: I1008 22:33:26.067410 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerStarted","Data":"b8557085ba70a8abc73279fad5e64c6b452e36ad1a22da0dfbf016f1eef65e90"} Oct 08 22:33:26 crc kubenswrapper[4739]: I1008 22:33:26.084868 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" podStartSLOduration=30.498392797 podStartE2EDuration="31.084846448s" podCreationTimestamp="2025-10-08 22:32:55 +0000 UTC" firstStartedPulling="2025-10-08 22:32:56.688754668 +0000 UTC m=+2676.514140418" lastFinishedPulling="2025-10-08 22:32:57.275208309 +0000 UTC m=+2677.100594069" observedRunningTime="2025-10-08 22:32:57.798597543 +0000 UTC m=+2677.623983293" watchObservedRunningTime="2025-10-08 22:33:26.084846448 +0000 UTC m=+2705.910232198" Oct 08 22:33:49 crc kubenswrapper[4739]: I1008 22:33:49.115752 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xdrxq"] Oct 08 22:33:49 crc kubenswrapper[4739]: I1008 22:33:49.120013 4739 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xdrxq" Oct 08 22:33:49 crc kubenswrapper[4739]: I1008 22:33:49.137183 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xdrxq"] Oct 08 22:33:49 crc kubenswrapper[4739]: I1008 22:33:49.276951 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd-utilities\") pod \"certified-operators-xdrxq\" (UID: \"70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd\") " pod="openshift-marketplace/certified-operators-xdrxq" Oct 08 22:33:49 crc kubenswrapper[4739]: I1008 22:33:49.277563 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd-catalog-content\") pod \"certified-operators-xdrxq\" (UID: \"70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd\") " pod="openshift-marketplace/certified-operators-xdrxq" Oct 08 22:33:49 crc kubenswrapper[4739]: I1008 22:33:49.277801 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm5p5\" (UniqueName: \"kubernetes.io/projected/70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd-kube-api-access-dm5p5\") pod \"certified-operators-xdrxq\" (UID: \"70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd\") " pod="openshift-marketplace/certified-operators-xdrxq" Oct 08 22:33:49 crc kubenswrapper[4739]: I1008 22:33:49.380560 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd-catalog-content\") pod \"certified-operators-xdrxq\" (UID: \"70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd\") " pod="openshift-marketplace/certified-operators-xdrxq" Oct 08 22:33:49 crc kubenswrapper[4739]: I1008 22:33:49.381024 4739 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm5p5\" (UniqueName: \"kubernetes.io/projected/70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd-kube-api-access-dm5p5\") pod \"certified-operators-xdrxq\" (UID: \"70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd\") " pod="openshift-marketplace/certified-operators-xdrxq" Oct 08 22:33:49 crc kubenswrapper[4739]: I1008 22:33:49.381306 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd-catalog-content\") pod \"certified-operators-xdrxq\" (UID: \"70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd\") " pod="openshift-marketplace/certified-operators-xdrxq" Oct 08 22:33:49 crc kubenswrapper[4739]: I1008 22:33:49.381440 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd-utilities\") pod \"certified-operators-xdrxq\" (UID: \"70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd\") " pod="openshift-marketplace/certified-operators-xdrxq" Oct 08 22:33:49 crc kubenswrapper[4739]: I1008 22:33:49.381823 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd-utilities\") pod \"certified-operators-xdrxq\" (UID: \"70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd\") " pod="openshift-marketplace/certified-operators-xdrxq" Oct 08 22:33:49 crc kubenswrapper[4739]: I1008 22:33:49.403938 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm5p5\" (UniqueName: \"kubernetes.io/projected/70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd-kube-api-access-dm5p5\") pod \"certified-operators-xdrxq\" (UID: \"70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd\") " pod="openshift-marketplace/certified-operators-xdrxq" Oct 08 22:33:49 crc kubenswrapper[4739]: I1008 22:33:49.501844 4739 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/certified-operators-xdrxq" Oct 08 22:33:50 crc kubenswrapper[4739]: I1008 22:33:50.020573 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xdrxq"] Oct 08 22:33:50 crc kubenswrapper[4739]: I1008 22:33:50.316920 4739 generic.go:334] "Generic (PLEG): container finished" podID="70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd" containerID="a5f45cc1ecd4316212b555383d1e27239456617a70321e7cc042b6c34dce6d5a" exitCode=0 Oct 08 22:33:50 crc kubenswrapper[4739]: I1008 22:33:50.316987 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdrxq" event={"ID":"70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd","Type":"ContainerDied","Data":"a5f45cc1ecd4316212b555383d1e27239456617a70321e7cc042b6c34dce6d5a"} Oct 08 22:33:50 crc kubenswrapper[4739]: I1008 22:33:50.317025 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdrxq" event={"ID":"70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd","Type":"ContainerStarted","Data":"7a9e3b14787726d76c06027c56acda7349d84d0dc91b914d8309e7a6a08b0719"} Oct 08 22:33:52 crc kubenswrapper[4739]: I1008 22:33:52.343563 4739 generic.go:334] "Generic (PLEG): container finished" podID="70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd" containerID="63a0476c7d88447a92314b52215fdd25104ceb323c7d4dac4afc199b63981b9c" exitCode=0 Oct 08 22:33:52 crc kubenswrapper[4739]: I1008 22:33:52.343608 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdrxq" event={"ID":"70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd","Type":"ContainerDied","Data":"63a0476c7d88447a92314b52215fdd25104ceb323c7d4dac4afc199b63981b9c"} Oct 08 22:33:53 crc kubenswrapper[4739]: I1008 22:33:53.356314 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdrxq" 
event={"ID":"70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd","Type":"ContainerStarted","Data":"728148e59b4f1beba5ec8102d7fce08f597f9b40b0c9bde4b278ce4a5a13f936"} Oct 08 22:33:53 crc kubenswrapper[4739]: I1008 22:33:53.386298 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xdrxq" podStartSLOduration=1.93560101 podStartE2EDuration="4.386268832s" podCreationTimestamp="2025-10-08 22:33:49 +0000 UTC" firstStartedPulling="2025-10-08 22:33:50.319072218 +0000 UTC m=+2730.144457978" lastFinishedPulling="2025-10-08 22:33:52.76974005 +0000 UTC m=+2732.595125800" observedRunningTime="2025-10-08 22:33:53.37561865 +0000 UTC m=+2733.201004400" watchObservedRunningTime="2025-10-08 22:33:53.386268832 +0000 UTC m=+2733.211654602" Oct 08 22:33:59 crc kubenswrapper[4739]: I1008 22:33:59.502701 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xdrxq" Oct 08 22:33:59 crc kubenswrapper[4739]: I1008 22:33:59.503434 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xdrxq" Oct 08 22:33:59 crc kubenswrapper[4739]: I1008 22:33:59.571232 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xdrxq" Oct 08 22:34:00 crc kubenswrapper[4739]: I1008 22:34:00.478097 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xdrxq" Oct 08 22:34:00 crc kubenswrapper[4739]: I1008 22:34:00.529628 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xdrxq"] Oct 08 22:34:02 crc kubenswrapper[4739]: I1008 22:34:02.451903 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xdrxq" podUID="70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd" containerName="registry-server" 
containerID="cri-o://728148e59b4f1beba5ec8102d7fce08f597f9b40b0c9bde4b278ce4a5a13f936" gracePeriod=2 Oct 08 22:34:03 crc kubenswrapper[4739]: I1008 22:34:03.465766 4739 generic.go:334] "Generic (PLEG): container finished" podID="70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd" containerID="728148e59b4f1beba5ec8102d7fce08f597f9b40b0c9bde4b278ce4a5a13f936" exitCode=0 Oct 08 22:34:03 crc kubenswrapper[4739]: I1008 22:34:03.465857 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdrxq" event={"ID":"70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd","Type":"ContainerDied","Data":"728148e59b4f1beba5ec8102d7fce08f597f9b40b0c9bde4b278ce4a5a13f936"} Oct 08 22:34:03 crc kubenswrapper[4739]: I1008 22:34:03.466232 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdrxq" event={"ID":"70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd","Type":"ContainerDied","Data":"7a9e3b14787726d76c06027c56acda7349d84d0dc91b914d8309e7a6a08b0719"} Oct 08 22:34:03 crc kubenswrapper[4739]: I1008 22:34:03.466271 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a9e3b14787726d76c06027c56acda7349d84d0dc91b914d8309e7a6a08b0719" Oct 08 22:34:03 crc kubenswrapper[4739]: I1008 22:34:03.543118 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xdrxq" Oct 08 22:34:03 crc kubenswrapper[4739]: I1008 22:34:03.573551 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dm5p5\" (UniqueName: \"kubernetes.io/projected/70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd-kube-api-access-dm5p5\") pod \"70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd\" (UID: \"70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd\") " Oct 08 22:34:03 crc kubenswrapper[4739]: I1008 22:34:03.574202 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd-utilities\") pod \"70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd\" (UID: \"70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd\") " Oct 08 22:34:03 crc kubenswrapper[4739]: I1008 22:34:03.574584 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd-catalog-content\") pod \"70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd\" (UID: \"70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd\") " Oct 08 22:34:03 crc kubenswrapper[4739]: I1008 22:34:03.576343 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd-utilities" (OuterVolumeSpecName: "utilities") pod "70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd" (UID: "70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:34:03 crc kubenswrapper[4739]: I1008 22:34:03.581098 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:34:03 crc kubenswrapper[4739]: I1008 22:34:03.589738 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd-kube-api-access-dm5p5" (OuterVolumeSpecName: "kube-api-access-dm5p5") pod "70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd" (UID: "70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd"). InnerVolumeSpecName "kube-api-access-dm5p5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:34:03 crc kubenswrapper[4739]: I1008 22:34:03.617386 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd" (UID: "70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:34:03 crc kubenswrapper[4739]: I1008 22:34:03.682390 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dm5p5\" (UniqueName: \"kubernetes.io/projected/70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd-kube-api-access-dm5p5\") on node \"crc\" DevicePath \"\"" Oct 08 22:34:03 crc kubenswrapper[4739]: I1008 22:34:03.682425 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:34:04 crc kubenswrapper[4739]: I1008 22:34:04.474653 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xdrxq" Oct 08 22:34:04 crc kubenswrapper[4739]: I1008 22:34:04.507514 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xdrxq"] Oct 08 22:34:04 crc kubenswrapper[4739]: I1008 22:34:04.515250 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xdrxq"] Oct 08 22:34:05 crc kubenswrapper[4739]: I1008 22:34:05.835347 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd" path="/var/lib/kubelet/pods/70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd/volumes" Oct 08 22:35:11 crc kubenswrapper[4739]: I1008 22:35:11.039560 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rkmbg"] Oct 08 22:35:11 crc kubenswrapper[4739]: E1008 22:35:11.040745 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd" containerName="extract-content" Oct 08 22:35:11 crc kubenswrapper[4739]: I1008 22:35:11.040767 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd" containerName="extract-content" Oct 08 22:35:11 crc kubenswrapper[4739]: E1008 22:35:11.040786 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd" containerName="extract-utilities" Oct 08 22:35:11 crc kubenswrapper[4739]: I1008 22:35:11.040796 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd" containerName="extract-utilities" Oct 08 22:35:11 crc kubenswrapper[4739]: E1008 22:35:11.040813 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd" containerName="registry-server" Oct 08 22:35:11 crc kubenswrapper[4739]: I1008 22:35:11.040821 4739 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd" containerName="registry-server" Oct 08 22:35:11 crc kubenswrapper[4739]: I1008 22:35:11.041064 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="70bd7e07-bcfe-4d07-bd52-7a5a8bfc26fd" containerName="registry-server" Oct 08 22:35:11 crc kubenswrapper[4739]: I1008 22:35:11.042889 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rkmbg" Oct 08 22:35:11 crc kubenswrapper[4739]: I1008 22:35:11.049293 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rkmbg"] Oct 08 22:35:11 crc kubenswrapper[4739]: I1008 22:35:11.208730 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bb4x\" (UniqueName: \"kubernetes.io/projected/25e21c40-6ddd-42ad-a970-0aee42b3dfae-kube-api-access-9bb4x\") pod \"redhat-marketplace-rkmbg\" (UID: \"25e21c40-6ddd-42ad-a970-0aee42b3dfae\") " pod="openshift-marketplace/redhat-marketplace-rkmbg" Oct 08 22:35:11 crc kubenswrapper[4739]: I1008 22:35:11.210295 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25e21c40-6ddd-42ad-a970-0aee42b3dfae-utilities\") pod \"redhat-marketplace-rkmbg\" (UID: \"25e21c40-6ddd-42ad-a970-0aee42b3dfae\") " pod="openshift-marketplace/redhat-marketplace-rkmbg" Oct 08 22:35:11 crc kubenswrapper[4739]: I1008 22:35:11.210525 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25e21c40-6ddd-42ad-a970-0aee42b3dfae-catalog-content\") pod \"redhat-marketplace-rkmbg\" (UID: \"25e21c40-6ddd-42ad-a970-0aee42b3dfae\") " pod="openshift-marketplace/redhat-marketplace-rkmbg" Oct 08 22:35:11 crc kubenswrapper[4739]: I1008 22:35:11.311903 4739 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25e21c40-6ddd-42ad-a970-0aee42b3dfae-utilities\") pod \"redhat-marketplace-rkmbg\" (UID: \"25e21c40-6ddd-42ad-a970-0aee42b3dfae\") " pod="openshift-marketplace/redhat-marketplace-rkmbg" Oct 08 22:35:11 crc kubenswrapper[4739]: I1008 22:35:11.311992 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25e21c40-6ddd-42ad-a970-0aee42b3dfae-catalog-content\") pod \"redhat-marketplace-rkmbg\" (UID: \"25e21c40-6ddd-42ad-a970-0aee42b3dfae\") " pod="openshift-marketplace/redhat-marketplace-rkmbg" Oct 08 22:35:11 crc kubenswrapper[4739]: I1008 22:35:11.312085 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bb4x\" (UniqueName: \"kubernetes.io/projected/25e21c40-6ddd-42ad-a970-0aee42b3dfae-kube-api-access-9bb4x\") pod \"redhat-marketplace-rkmbg\" (UID: \"25e21c40-6ddd-42ad-a970-0aee42b3dfae\") " pod="openshift-marketplace/redhat-marketplace-rkmbg" Oct 08 22:35:11 crc kubenswrapper[4739]: I1008 22:35:11.312497 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25e21c40-6ddd-42ad-a970-0aee42b3dfae-utilities\") pod \"redhat-marketplace-rkmbg\" (UID: \"25e21c40-6ddd-42ad-a970-0aee42b3dfae\") " pod="openshift-marketplace/redhat-marketplace-rkmbg" Oct 08 22:35:11 crc kubenswrapper[4739]: I1008 22:35:11.312717 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25e21c40-6ddd-42ad-a970-0aee42b3dfae-catalog-content\") pod \"redhat-marketplace-rkmbg\" (UID: \"25e21c40-6ddd-42ad-a970-0aee42b3dfae\") " pod="openshift-marketplace/redhat-marketplace-rkmbg" Oct 08 22:35:11 crc kubenswrapper[4739]: I1008 22:35:11.330958 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9bb4x\" (UniqueName: \"kubernetes.io/projected/25e21c40-6ddd-42ad-a970-0aee42b3dfae-kube-api-access-9bb4x\") pod \"redhat-marketplace-rkmbg\" (UID: \"25e21c40-6ddd-42ad-a970-0aee42b3dfae\") " pod="openshift-marketplace/redhat-marketplace-rkmbg" Oct 08 22:35:11 crc kubenswrapper[4739]: I1008 22:35:11.419211 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rkmbg" Oct 08 22:35:11 crc kubenswrapper[4739]: I1008 22:35:11.890889 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rkmbg"] Oct 08 22:35:12 crc kubenswrapper[4739]: I1008 22:35:12.351391 4739 generic.go:334] "Generic (PLEG): container finished" podID="25e21c40-6ddd-42ad-a970-0aee42b3dfae" containerID="c3985e4dc58f06a38ee080e2bce0e52914e296bb5d13795a35bd525057fb794a" exitCode=0 Oct 08 22:35:12 crc kubenswrapper[4739]: I1008 22:35:12.351434 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkmbg" event={"ID":"25e21c40-6ddd-42ad-a970-0aee42b3dfae","Type":"ContainerDied","Data":"c3985e4dc58f06a38ee080e2bce0e52914e296bb5d13795a35bd525057fb794a"} Oct 08 22:35:12 crc kubenswrapper[4739]: I1008 22:35:12.351763 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkmbg" event={"ID":"25e21c40-6ddd-42ad-a970-0aee42b3dfae","Type":"ContainerStarted","Data":"e02a8e36234f14a01aee1992f2115879e0f19748957b6a8dbade4a05f8456afa"} Oct 08 22:35:14 crc kubenswrapper[4739]: I1008 22:35:14.378662 4739 generic.go:334] "Generic (PLEG): container finished" podID="25e21c40-6ddd-42ad-a970-0aee42b3dfae" containerID="dce2a955185788b290b13293c25c15298a41a77f13ee65cc40b062d7f025cbac" exitCode=0 Oct 08 22:35:14 crc kubenswrapper[4739]: I1008 22:35:14.378728 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkmbg" 
event={"ID":"25e21c40-6ddd-42ad-a970-0aee42b3dfae","Type":"ContainerDied","Data":"dce2a955185788b290b13293c25c15298a41a77f13ee65cc40b062d7f025cbac"} Oct 08 22:35:15 crc kubenswrapper[4739]: I1008 22:35:15.396235 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkmbg" event={"ID":"25e21c40-6ddd-42ad-a970-0aee42b3dfae","Type":"ContainerStarted","Data":"d3a1305c9fe1dde9165ea0e849f0bf55ffcf3c0fba0a09ab5e5ac5add089ade7"} Oct 08 22:35:15 crc kubenswrapper[4739]: I1008 22:35:15.431222 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rkmbg" podStartSLOduration=1.979926718 podStartE2EDuration="4.431202136s" podCreationTimestamp="2025-10-08 22:35:11 +0000 UTC" firstStartedPulling="2025-10-08 22:35:12.354661923 +0000 UTC m=+2812.180047713" lastFinishedPulling="2025-10-08 22:35:14.805937341 +0000 UTC m=+2814.631323131" observedRunningTime="2025-10-08 22:35:15.428199862 +0000 UTC m=+2815.253585622" watchObservedRunningTime="2025-10-08 22:35:15.431202136 +0000 UTC m=+2815.256587886" Oct 08 22:35:21 crc kubenswrapper[4739]: I1008 22:35:21.419953 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rkmbg" Oct 08 22:35:21 crc kubenswrapper[4739]: I1008 22:35:21.420709 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rkmbg" Oct 08 22:35:21 crc kubenswrapper[4739]: I1008 22:35:21.491050 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rkmbg" Oct 08 22:35:21 crc kubenswrapper[4739]: I1008 22:35:21.563688 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rkmbg" Oct 08 22:35:22 crc kubenswrapper[4739]: I1008 22:35:22.018462 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-rkmbg"] Oct 08 22:35:23 crc kubenswrapper[4739]: I1008 22:35:23.491817 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rkmbg" podUID="25e21c40-6ddd-42ad-a970-0aee42b3dfae" containerName="registry-server" containerID="cri-o://d3a1305c9fe1dde9165ea0e849f0bf55ffcf3c0fba0a09ab5e5ac5add089ade7" gracePeriod=2 Oct 08 22:35:23 crc kubenswrapper[4739]: I1008 22:35:23.968098 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rkmbg" Oct 08 22:35:24 crc kubenswrapper[4739]: I1008 22:35:24.096205 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bb4x\" (UniqueName: \"kubernetes.io/projected/25e21c40-6ddd-42ad-a970-0aee42b3dfae-kube-api-access-9bb4x\") pod \"25e21c40-6ddd-42ad-a970-0aee42b3dfae\" (UID: \"25e21c40-6ddd-42ad-a970-0aee42b3dfae\") " Oct 08 22:35:24 crc kubenswrapper[4739]: I1008 22:35:24.096316 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25e21c40-6ddd-42ad-a970-0aee42b3dfae-catalog-content\") pod \"25e21c40-6ddd-42ad-a970-0aee42b3dfae\" (UID: \"25e21c40-6ddd-42ad-a970-0aee42b3dfae\") " Oct 08 22:35:24 crc kubenswrapper[4739]: I1008 22:35:24.096345 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25e21c40-6ddd-42ad-a970-0aee42b3dfae-utilities\") pod \"25e21c40-6ddd-42ad-a970-0aee42b3dfae\" (UID: \"25e21c40-6ddd-42ad-a970-0aee42b3dfae\") " Oct 08 22:35:24 crc kubenswrapper[4739]: I1008 22:35:24.097588 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25e21c40-6ddd-42ad-a970-0aee42b3dfae-utilities" (OuterVolumeSpecName: "utilities") pod "25e21c40-6ddd-42ad-a970-0aee42b3dfae" (UID: 
"25e21c40-6ddd-42ad-a970-0aee42b3dfae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:35:24 crc kubenswrapper[4739]: I1008 22:35:24.102773 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e21c40-6ddd-42ad-a970-0aee42b3dfae-kube-api-access-9bb4x" (OuterVolumeSpecName: "kube-api-access-9bb4x") pod "25e21c40-6ddd-42ad-a970-0aee42b3dfae" (UID: "25e21c40-6ddd-42ad-a970-0aee42b3dfae"). InnerVolumeSpecName "kube-api-access-9bb4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:35:24 crc kubenswrapper[4739]: I1008 22:35:24.125917 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25e21c40-6ddd-42ad-a970-0aee42b3dfae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "25e21c40-6ddd-42ad-a970-0aee42b3dfae" (UID: "25e21c40-6ddd-42ad-a970-0aee42b3dfae"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:35:24 crc kubenswrapper[4739]: I1008 22:35:24.198411 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25e21c40-6ddd-42ad-a970-0aee42b3dfae-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:24 crc kubenswrapper[4739]: I1008 22:35:24.198439 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25e21c40-6ddd-42ad-a970-0aee42b3dfae-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:24 crc kubenswrapper[4739]: I1008 22:35:24.198450 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bb4x\" (UniqueName: \"kubernetes.io/projected/25e21c40-6ddd-42ad-a970-0aee42b3dfae-kube-api-access-9bb4x\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:24 crc kubenswrapper[4739]: I1008 22:35:24.503995 4739 generic.go:334] "Generic (PLEG): container finished" 
podID="25e21c40-6ddd-42ad-a970-0aee42b3dfae" containerID="d3a1305c9fe1dde9165ea0e849f0bf55ffcf3c0fba0a09ab5e5ac5add089ade7" exitCode=0 Oct 08 22:35:24 crc kubenswrapper[4739]: I1008 22:35:24.504062 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkmbg" event={"ID":"25e21c40-6ddd-42ad-a970-0aee42b3dfae","Type":"ContainerDied","Data":"d3a1305c9fe1dde9165ea0e849f0bf55ffcf3c0fba0a09ab5e5ac5add089ade7"} Oct 08 22:35:24 crc kubenswrapper[4739]: I1008 22:35:24.504103 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkmbg" event={"ID":"25e21c40-6ddd-42ad-a970-0aee42b3dfae","Type":"ContainerDied","Data":"e02a8e36234f14a01aee1992f2115879e0f19748957b6a8dbade4a05f8456afa"} Oct 08 22:35:24 crc kubenswrapper[4739]: I1008 22:35:24.504133 4739 scope.go:117] "RemoveContainer" containerID="d3a1305c9fe1dde9165ea0e849f0bf55ffcf3c0fba0a09ab5e5ac5add089ade7" Oct 08 22:35:24 crc kubenswrapper[4739]: I1008 22:35:24.504342 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rkmbg" Oct 08 22:35:24 crc kubenswrapper[4739]: I1008 22:35:24.546321 4739 scope.go:117] "RemoveContainer" containerID="dce2a955185788b290b13293c25c15298a41a77f13ee65cc40b062d7f025cbac" Oct 08 22:35:24 crc kubenswrapper[4739]: I1008 22:35:24.569225 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rkmbg"] Oct 08 22:35:24 crc kubenswrapper[4739]: I1008 22:35:24.579724 4739 scope.go:117] "RemoveContainer" containerID="c3985e4dc58f06a38ee080e2bce0e52914e296bb5d13795a35bd525057fb794a" Oct 08 22:35:24 crc kubenswrapper[4739]: I1008 22:35:24.581033 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rkmbg"] Oct 08 22:35:24 crc kubenswrapper[4739]: I1008 22:35:24.626786 4739 scope.go:117] "RemoveContainer" containerID="d3a1305c9fe1dde9165ea0e849f0bf55ffcf3c0fba0a09ab5e5ac5add089ade7" Oct 08 22:35:24 crc kubenswrapper[4739]: E1008 22:35:24.627212 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3a1305c9fe1dde9165ea0e849f0bf55ffcf3c0fba0a09ab5e5ac5add089ade7\": container with ID starting with d3a1305c9fe1dde9165ea0e849f0bf55ffcf3c0fba0a09ab5e5ac5add089ade7 not found: ID does not exist" containerID="d3a1305c9fe1dde9165ea0e849f0bf55ffcf3c0fba0a09ab5e5ac5add089ade7" Oct 08 22:35:24 crc kubenswrapper[4739]: I1008 22:35:24.627251 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3a1305c9fe1dde9165ea0e849f0bf55ffcf3c0fba0a09ab5e5ac5add089ade7"} err="failed to get container status \"d3a1305c9fe1dde9165ea0e849f0bf55ffcf3c0fba0a09ab5e5ac5add089ade7\": rpc error: code = NotFound desc = could not find container \"d3a1305c9fe1dde9165ea0e849f0bf55ffcf3c0fba0a09ab5e5ac5add089ade7\": container with ID starting with d3a1305c9fe1dde9165ea0e849f0bf55ffcf3c0fba0a09ab5e5ac5add089ade7 not found: 
ID does not exist" Oct 08 22:35:24 crc kubenswrapper[4739]: I1008 22:35:24.627280 4739 scope.go:117] "RemoveContainer" containerID="dce2a955185788b290b13293c25c15298a41a77f13ee65cc40b062d7f025cbac" Oct 08 22:35:24 crc kubenswrapper[4739]: E1008 22:35:24.627782 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dce2a955185788b290b13293c25c15298a41a77f13ee65cc40b062d7f025cbac\": container with ID starting with dce2a955185788b290b13293c25c15298a41a77f13ee65cc40b062d7f025cbac not found: ID does not exist" containerID="dce2a955185788b290b13293c25c15298a41a77f13ee65cc40b062d7f025cbac" Oct 08 22:35:24 crc kubenswrapper[4739]: I1008 22:35:24.627812 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dce2a955185788b290b13293c25c15298a41a77f13ee65cc40b062d7f025cbac"} err="failed to get container status \"dce2a955185788b290b13293c25c15298a41a77f13ee65cc40b062d7f025cbac\": rpc error: code = NotFound desc = could not find container \"dce2a955185788b290b13293c25c15298a41a77f13ee65cc40b062d7f025cbac\": container with ID starting with dce2a955185788b290b13293c25c15298a41a77f13ee65cc40b062d7f025cbac not found: ID does not exist" Oct 08 22:35:24 crc kubenswrapper[4739]: I1008 22:35:24.627832 4739 scope.go:117] "RemoveContainer" containerID="c3985e4dc58f06a38ee080e2bce0e52914e296bb5d13795a35bd525057fb794a" Oct 08 22:35:24 crc kubenswrapper[4739]: E1008 22:35:24.628204 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3985e4dc58f06a38ee080e2bce0e52914e296bb5d13795a35bd525057fb794a\": container with ID starting with c3985e4dc58f06a38ee080e2bce0e52914e296bb5d13795a35bd525057fb794a not found: ID does not exist" containerID="c3985e4dc58f06a38ee080e2bce0e52914e296bb5d13795a35bd525057fb794a" Oct 08 22:35:24 crc kubenswrapper[4739]: I1008 22:35:24.628238 4739 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3985e4dc58f06a38ee080e2bce0e52914e296bb5d13795a35bd525057fb794a"} err="failed to get container status \"c3985e4dc58f06a38ee080e2bce0e52914e296bb5d13795a35bd525057fb794a\": rpc error: code = NotFound desc = could not find container \"c3985e4dc58f06a38ee080e2bce0e52914e296bb5d13795a35bd525057fb794a\": container with ID starting with c3985e4dc58f06a38ee080e2bce0e52914e296bb5d13795a35bd525057fb794a not found: ID does not exist" Oct 08 22:35:25 crc kubenswrapper[4739]: I1008 22:35:25.835332 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e21c40-6ddd-42ad-a970-0aee42b3dfae" path="/var/lib/kubelet/pods/25e21c40-6ddd-42ad-a970-0aee42b3dfae/volumes" Oct 08 22:35:51 crc kubenswrapper[4739]: I1008 22:35:51.766936 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:35:51 crc kubenswrapper[4739]: I1008 22:35:51.767956 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:36:21 crc kubenswrapper[4739]: I1008 22:36:21.766331 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:36:21 crc kubenswrapper[4739]: I1008 22:36:21.767051 4739 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:36:39 crc kubenswrapper[4739]: I1008 22:36:39.414695 4739 generic.go:334] "Generic (PLEG): container finished" podID="02cc9be0-080a-4ef8-a438-18607a5c7da4" containerID="02736ab9c87f2348cafc664ebe866f8bcefa3fc3cab9c19e5b1d8b7af8ff765b" exitCode=0 Oct 08 22:36:39 crc kubenswrapper[4739]: I1008 22:36:39.414814 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" event={"ID":"02cc9be0-080a-4ef8-a438-18607a5c7da4","Type":"ContainerDied","Data":"02736ab9c87f2348cafc664ebe866f8bcefa3fc3cab9c19e5b1d8b7af8ff765b"} Oct 08 22:36:40 crc kubenswrapper[4739]: I1008 22:36:40.867478 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" Oct 08 22:36:40 crc kubenswrapper[4739]: I1008 22:36:40.965186 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78n6g\" (UniqueName: \"kubernetes.io/projected/02cc9be0-080a-4ef8-a438-18607a5c7da4-kube-api-access-78n6g\") pod \"02cc9be0-080a-4ef8-a438-18607a5c7da4\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " Oct 08 22:36:40 crc kubenswrapper[4739]: I1008 22:36:40.965271 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-extra-config-0\") pod \"02cc9be0-080a-4ef8-a438-18607a5c7da4\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " Oct 08 22:36:40 crc kubenswrapper[4739]: I1008 22:36:40.966093 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-combined-ca-bundle\") pod \"02cc9be0-080a-4ef8-a438-18607a5c7da4\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " Oct 08 22:36:40 crc kubenswrapper[4739]: I1008 22:36:40.966186 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-cell1-compute-config-0\") pod \"02cc9be0-080a-4ef8-a438-18607a5c7da4\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " Oct 08 22:36:40 crc kubenswrapper[4739]: I1008 22:36:40.966218 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-ssh-key\") pod \"02cc9be0-080a-4ef8-a438-18607a5c7da4\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " Oct 08 22:36:40 crc kubenswrapper[4739]: I1008 22:36:40.966321 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-cell1-compute-config-1\") pod \"02cc9be0-080a-4ef8-a438-18607a5c7da4\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " Oct 08 22:36:40 crc kubenswrapper[4739]: I1008 22:36:40.966364 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-inventory\") pod \"02cc9be0-080a-4ef8-a438-18607a5c7da4\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " Oct 08 22:36:40 crc kubenswrapper[4739]: I1008 22:36:40.966405 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-migration-ssh-key-0\") pod \"02cc9be0-080a-4ef8-a438-18607a5c7da4\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " Oct 08 
22:36:40 crc kubenswrapper[4739]: I1008 22:36:40.966441 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-migration-ssh-key-1\") pod \"02cc9be0-080a-4ef8-a438-18607a5c7da4\" (UID: \"02cc9be0-080a-4ef8-a438-18607a5c7da4\") " Oct 08 22:36:40 crc kubenswrapper[4739]: I1008 22:36:40.972328 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "02cc9be0-080a-4ef8-a438-18607a5c7da4" (UID: "02cc9be0-080a-4ef8-a438-18607a5c7da4"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:36:40 crc kubenswrapper[4739]: I1008 22:36:40.972900 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02cc9be0-080a-4ef8-a438-18607a5c7da4-kube-api-access-78n6g" (OuterVolumeSpecName: "kube-api-access-78n6g") pod "02cc9be0-080a-4ef8-a438-18607a5c7da4" (UID: "02cc9be0-080a-4ef8-a438-18607a5c7da4"). InnerVolumeSpecName "kube-api-access-78n6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:36:40 crc kubenswrapper[4739]: I1008 22:36:40.999289 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "02cc9be0-080a-4ef8-a438-18607a5c7da4" (UID: "02cc9be0-080a-4ef8-a438-18607a5c7da4"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.002327 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-inventory" (OuterVolumeSpecName: "inventory") pod "02cc9be0-080a-4ef8-a438-18607a5c7da4" (UID: "02cc9be0-080a-4ef8-a438-18607a5c7da4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.003082 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "02cc9be0-080a-4ef8-a438-18607a5c7da4" (UID: "02cc9be0-080a-4ef8-a438-18607a5c7da4"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.008059 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "02cc9be0-080a-4ef8-a438-18607a5c7da4" (UID: "02cc9be0-080a-4ef8-a438-18607a5c7da4"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.011394 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "02cc9be0-080a-4ef8-a438-18607a5c7da4" (UID: "02cc9be0-080a-4ef8-a438-18607a5c7da4"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.013196 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "02cc9be0-080a-4ef8-a438-18607a5c7da4" (UID: "02cc9be0-080a-4ef8-a438-18607a5c7da4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.027193 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "02cc9be0-080a-4ef8-a438-18607a5c7da4" (UID: "02cc9be0-080a-4ef8-a438-18607a5c7da4"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.069357 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78n6g\" (UniqueName: \"kubernetes.io/projected/02cc9be0-080a-4ef8-a438-18607a5c7da4-kube-api-access-78n6g\") on node \"crc\" DevicePath \"\"" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.069388 4739 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.069405 4739 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.069417 4739 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.069429 4739 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.069443 4739 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.069454 4739 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.069468 4739 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.069479 4739 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/02cc9be0-080a-4ef8-a438-18607a5c7da4-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.440327 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" event={"ID":"02cc9be0-080a-4ef8-a438-18607a5c7da4","Type":"ContainerDied","Data":"0bbe6c4288c4294ced634cb1c8fae7acac366c5281435e634380d6feafff5f88"} Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.440367 4739 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="0bbe6c4288c4294ced634cb1c8fae7acac366c5281435e634380d6feafff5f88" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.440411 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-s2cpg" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.566024 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd"] Oct 08 22:36:41 crc kubenswrapper[4739]: E1008 22:36:41.566439 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25e21c40-6ddd-42ad-a970-0aee42b3dfae" containerName="extract-utilities" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.566456 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="25e21c40-6ddd-42ad-a970-0aee42b3dfae" containerName="extract-utilities" Oct 08 22:36:41 crc kubenswrapper[4739]: E1008 22:36:41.566473 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25e21c40-6ddd-42ad-a970-0aee42b3dfae" containerName="registry-server" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.566480 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="25e21c40-6ddd-42ad-a970-0aee42b3dfae" containerName="registry-server" Oct 08 22:36:41 crc kubenswrapper[4739]: E1008 22:36:41.566490 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25e21c40-6ddd-42ad-a970-0aee42b3dfae" containerName="extract-content" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.566496 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="25e21c40-6ddd-42ad-a970-0aee42b3dfae" containerName="extract-content" Oct 08 22:36:41 crc kubenswrapper[4739]: E1008 22:36:41.566506 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02cc9be0-080a-4ef8-a438-18607a5c7da4" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.566514 4739 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="02cc9be0-080a-4ef8-a438-18607a5c7da4" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.566698 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="25e21c40-6ddd-42ad-a970-0aee42b3dfae" containerName="registry-server" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.566729 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="02cc9be0-080a-4ef8-a438-18607a5c7da4" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.567432 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.570048 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.570310 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl5kv" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.570502 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.570849 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.570853 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.604936 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd"] Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.700980 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-br4sd\" (UID: \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.701030 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-br4sd\" (UID: \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.701066 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-br4sd\" (UID: \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.701184 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-br4sd\" (UID: \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.701217 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8h9p\" (UniqueName: 
\"kubernetes.io/projected/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-kube-api-access-j8h9p\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-br4sd\" (UID: \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.701658 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-br4sd\" (UID: \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.701967 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-br4sd\" (UID: \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.803854 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-br4sd\" (UID: \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.803925 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-br4sd\" (UID: 
\"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.803975 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-br4sd\" (UID: \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.804004 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-br4sd\" (UID: \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.804050 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-br4sd\" (UID: \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.804162 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-br4sd\" (UID: \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.804193 4739 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8h9p\" (UniqueName: \"kubernetes.io/projected/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-kube-api-access-j8h9p\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-br4sd\" (UID: \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.810547 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-br4sd\" (UID: \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.811818 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-br4sd\" (UID: \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.812015 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-br4sd\" (UID: \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.820258 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-ceilometer-compute-config-data-0\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-br4sd\" (UID: \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.820833 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8h9p\" (UniqueName: \"kubernetes.io/projected/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-kube-api-access-j8h9p\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-br4sd\" (UID: \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.825299 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-br4sd\" (UID: \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.825603 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-br4sd\" (UID: \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd" Oct 08 22:36:41 crc kubenswrapper[4739]: I1008 22:36:41.898528 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd" Oct 08 22:36:42 crc kubenswrapper[4739]: I1008 22:36:42.449137 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd"] Oct 08 22:36:43 crc kubenswrapper[4739]: I1008 22:36:43.465438 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd" event={"ID":"76b5c31e-7a34-42b9-9ad1-4f21fc560df3","Type":"ContainerStarted","Data":"2cff2229586c1f5fc2dfbf252f39e09dae770e6409214fafe1111f3dabbf507b"} Oct 08 22:36:43 crc kubenswrapper[4739]: I1008 22:36:43.466042 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd" event={"ID":"76b5c31e-7a34-42b9-9ad1-4f21fc560df3","Type":"ContainerStarted","Data":"95f83616c4bc5281e0ce1af4fec35396a6e23788e705e76e0c348ccc554feadc"} Oct 08 22:36:43 crc kubenswrapper[4739]: I1008 22:36:43.495123 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd" podStartSLOduration=1.787565437 podStartE2EDuration="2.495096243s" podCreationTimestamp="2025-10-08 22:36:41 +0000 UTC" firstStartedPulling="2025-10-08 22:36:42.45866372 +0000 UTC m=+2902.284049480" lastFinishedPulling="2025-10-08 22:36:43.166194526 +0000 UTC m=+2902.991580286" observedRunningTime="2025-10-08 22:36:43.485246681 +0000 UTC m=+2903.310632471" watchObservedRunningTime="2025-10-08 22:36:43.495096243 +0000 UTC m=+2903.320482003" Oct 08 22:36:51 crc kubenswrapper[4739]: I1008 22:36:51.768711 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:36:51 crc kubenswrapper[4739]: 
I1008 22:36:51.769573 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:36:51 crc kubenswrapper[4739]: I1008 22:36:51.769646 4739 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" Oct 08 22:36:51 crc kubenswrapper[4739]: I1008 22:36:51.770956 4739 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b8557085ba70a8abc73279fad5e64c6b452e36ad1a22da0dfbf016f1eef65e90"} pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 22:36:51 crc kubenswrapper[4739]: I1008 22:36:51.771069 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" containerID="cri-o://b8557085ba70a8abc73279fad5e64c6b452e36ad1a22da0dfbf016f1eef65e90" gracePeriod=600 Oct 08 22:36:52 crc kubenswrapper[4739]: I1008 22:36:52.587447 4739 generic.go:334] "Generic (PLEG): container finished" podID="9707b708-016c-4e06-86db-0332e2ca37db" containerID="b8557085ba70a8abc73279fad5e64c6b452e36ad1a22da0dfbf016f1eef65e90" exitCode=0 Oct 08 22:36:52 crc kubenswrapper[4739]: I1008 22:36:52.587919 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerDied","Data":"b8557085ba70a8abc73279fad5e64c6b452e36ad1a22da0dfbf016f1eef65e90"} Oct 08 22:36:52 crc 
kubenswrapper[4739]: I1008 22:36:52.588397 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerStarted","Data":"ddb1d63bba56e73b176d2b1e90888b358b74bba2bf444f71cf0db8fe5e95ed26"} Oct 08 22:36:52 crc kubenswrapper[4739]: I1008 22:36:52.588455 4739 scope.go:117] "RemoveContainer" containerID="f5309d0370559d2fbd4ae1669f92b84d80aaef9eaebaff357d1aa14a7ff31a5d" Oct 08 22:37:11 crc kubenswrapper[4739]: I1008 22:37:11.143633 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5zjwv"] Oct 08 22:37:11 crc kubenswrapper[4739]: I1008 22:37:11.146361 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5zjwv" Oct 08 22:37:11 crc kubenswrapper[4739]: I1008 22:37:11.154445 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5zjwv"] Oct 08 22:37:11 crc kubenswrapper[4739]: I1008 22:37:11.178771 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8s5n\" (UniqueName: \"kubernetes.io/projected/f58e7e86-c39e-4557-b0f8-71e89c97414b-kube-api-access-j8s5n\") pod \"community-operators-5zjwv\" (UID: \"f58e7e86-c39e-4557-b0f8-71e89c97414b\") " pod="openshift-marketplace/community-operators-5zjwv" Oct 08 22:37:11 crc kubenswrapper[4739]: I1008 22:37:11.178851 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f58e7e86-c39e-4557-b0f8-71e89c97414b-utilities\") pod \"community-operators-5zjwv\" (UID: \"f58e7e86-c39e-4557-b0f8-71e89c97414b\") " pod="openshift-marketplace/community-operators-5zjwv" Oct 08 22:37:11 crc kubenswrapper[4739]: I1008 22:37:11.179039 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f58e7e86-c39e-4557-b0f8-71e89c97414b-catalog-content\") pod \"community-operators-5zjwv\" (UID: \"f58e7e86-c39e-4557-b0f8-71e89c97414b\") " pod="openshift-marketplace/community-operators-5zjwv" Oct 08 22:37:11 crc kubenswrapper[4739]: I1008 22:37:11.280760 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8s5n\" (UniqueName: \"kubernetes.io/projected/f58e7e86-c39e-4557-b0f8-71e89c97414b-kube-api-access-j8s5n\") pod \"community-operators-5zjwv\" (UID: \"f58e7e86-c39e-4557-b0f8-71e89c97414b\") " pod="openshift-marketplace/community-operators-5zjwv" Oct 08 22:37:11 crc kubenswrapper[4739]: I1008 22:37:11.280833 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f58e7e86-c39e-4557-b0f8-71e89c97414b-utilities\") pod \"community-operators-5zjwv\" (UID: \"f58e7e86-c39e-4557-b0f8-71e89c97414b\") " pod="openshift-marketplace/community-operators-5zjwv" Oct 08 22:37:11 crc kubenswrapper[4739]: I1008 22:37:11.280895 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f58e7e86-c39e-4557-b0f8-71e89c97414b-catalog-content\") pod \"community-operators-5zjwv\" (UID: \"f58e7e86-c39e-4557-b0f8-71e89c97414b\") " pod="openshift-marketplace/community-operators-5zjwv" Oct 08 22:37:11 crc kubenswrapper[4739]: I1008 22:37:11.281617 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f58e7e86-c39e-4557-b0f8-71e89c97414b-catalog-content\") pod \"community-operators-5zjwv\" (UID: \"f58e7e86-c39e-4557-b0f8-71e89c97414b\") " pod="openshift-marketplace/community-operators-5zjwv" Oct 08 22:37:11 crc kubenswrapper[4739]: I1008 22:37:11.281743 4739 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f58e7e86-c39e-4557-b0f8-71e89c97414b-utilities\") pod \"community-operators-5zjwv\" (UID: \"f58e7e86-c39e-4557-b0f8-71e89c97414b\") " pod="openshift-marketplace/community-operators-5zjwv" Oct 08 22:37:11 crc kubenswrapper[4739]: I1008 22:37:11.298666 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8s5n\" (UniqueName: \"kubernetes.io/projected/f58e7e86-c39e-4557-b0f8-71e89c97414b-kube-api-access-j8s5n\") pod \"community-operators-5zjwv\" (UID: \"f58e7e86-c39e-4557-b0f8-71e89c97414b\") " pod="openshift-marketplace/community-operators-5zjwv" Oct 08 22:37:11 crc kubenswrapper[4739]: I1008 22:37:11.482639 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5zjwv" Oct 08 22:37:12 crc kubenswrapper[4739]: I1008 22:37:12.057327 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5zjwv"] Oct 08 22:37:12 crc kubenswrapper[4739]: I1008 22:37:12.835420 4739 generic.go:334] "Generic (PLEG): container finished" podID="f58e7e86-c39e-4557-b0f8-71e89c97414b" containerID="21d761f707e4f5ed575eadd6db2c4f4e4e737189dd4fe47a0279a2030a60b273" exitCode=0 Oct 08 22:37:12 crc kubenswrapper[4739]: I1008 22:37:12.835662 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zjwv" event={"ID":"f58e7e86-c39e-4557-b0f8-71e89c97414b","Type":"ContainerDied","Data":"21d761f707e4f5ed575eadd6db2c4f4e4e737189dd4fe47a0279a2030a60b273"} Oct 08 22:37:12 crc kubenswrapper[4739]: I1008 22:37:12.838106 4739 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 22:37:12 crc kubenswrapper[4739]: I1008 22:37:12.838108 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zjwv" 
event={"ID":"f58e7e86-c39e-4557-b0f8-71e89c97414b","Type":"ContainerStarted","Data":"0ab930b21fbc7b6175793649a8424ba1fbfa47ea421d8fa21f7e5349a0c11792"} Oct 08 22:37:13 crc kubenswrapper[4739]: I1008 22:37:13.845980 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zjwv" event={"ID":"f58e7e86-c39e-4557-b0f8-71e89c97414b","Type":"ContainerStarted","Data":"7235d6a9bd309b68f4fddc3a5d8c81237663a0e0482386eb18c0d876f85d1e92"} Oct 08 22:37:14 crc kubenswrapper[4739]: I1008 22:37:14.857035 4739 generic.go:334] "Generic (PLEG): container finished" podID="f58e7e86-c39e-4557-b0f8-71e89c97414b" containerID="7235d6a9bd309b68f4fddc3a5d8c81237663a0e0482386eb18c0d876f85d1e92" exitCode=0 Oct 08 22:37:14 crc kubenswrapper[4739]: I1008 22:37:14.857078 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zjwv" event={"ID":"f58e7e86-c39e-4557-b0f8-71e89c97414b","Type":"ContainerDied","Data":"7235d6a9bd309b68f4fddc3a5d8c81237663a0e0482386eb18c0d876f85d1e92"} Oct 08 22:37:15 crc kubenswrapper[4739]: I1008 22:37:15.868883 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zjwv" event={"ID":"f58e7e86-c39e-4557-b0f8-71e89c97414b","Type":"ContainerStarted","Data":"4902a6f2920d305ccbbc6df969661f468dbc5c386cd77d840dcf72dea2b80575"} Oct 08 22:37:15 crc kubenswrapper[4739]: I1008 22:37:15.896387 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5zjwv" podStartSLOduration=2.462280348 podStartE2EDuration="4.896367673s" podCreationTimestamp="2025-10-08 22:37:11 +0000 UTC" firstStartedPulling="2025-10-08 22:37:12.837905265 +0000 UTC m=+2932.663291015" lastFinishedPulling="2025-10-08 22:37:15.27199259 +0000 UTC m=+2935.097378340" observedRunningTime="2025-10-08 22:37:15.893781449 +0000 UTC m=+2935.719167189" watchObservedRunningTime="2025-10-08 22:37:15.896367673 +0000 UTC 
m=+2935.721753443" Oct 08 22:37:21 crc kubenswrapper[4739]: I1008 22:37:21.483474 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5zjwv" Oct 08 22:37:21 crc kubenswrapper[4739]: I1008 22:37:21.484127 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5zjwv" Oct 08 22:37:21 crc kubenswrapper[4739]: I1008 22:37:21.574387 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5zjwv" Oct 08 22:37:21 crc kubenswrapper[4739]: I1008 22:37:21.996356 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5zjwv" Oct 08 22:37:22 crc kubenswrapper[4739]: I1008 22:37:22.062963 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5zjwv"] Oct 08 22:37:23 crc kubenswrapper[4739]: I1008 22:37:23.950261 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5zjwv" podUID="f58e7e86-c39e-4557-b0f8-71e89c97414b" containerName="registry-server" containerID="cri-o://4902a6f2920d305ccbbc6df969661f468dbc5c386cd77d840dcf72dea2b80575" gracePeriod=2 Oct 08 22:37:24 crc kubenswrapper[4739]: I1008 22:37:24.352544 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5zjwv" Oct 08 22:37:24 crc kubenswrapper[4739]: I1008 22:37:24.460286 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f58e7e86-c39e-4557-b0f8-71e89c97414b-catalog-content\") pod \"f58e7e86-c39e-4557-b0f8-71e89c97414b\" (UID: \"f58e7e86-c39e-4557-b0f8-71e89c97414b\") " Oct 08 22:37:24 crc kubenswrapper[4739]: I1008 22:37:24.460408 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8s5n\" (UniqueName: \"kubernetes.io/projected/f58e7e86-c39e-4557-b0f8-71e89c97414b-kube-api-access-j8s5n\") pod \"f58e7e86-c39e-4557-b0f8-71e89c97414b\" (UID: \"f58e7e86-c39e-4557-b0f8-71e89c97414b\") " Oct 08 22:37:24 crc kubenswrapper[4739]: I1008 22:37:24.460505 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f58e7e86-c39e-4557-b0f8-71e89c97414b-utilities\") pod \"f58e7e86-c39e-4557-b0f8-71e89c97414b\" (UID: \"f58e7e86-c39e-4557-b0f8-71e89c97414b\") " Oct 08 22:37:24 crc kubenswrapper[4739]: I1008 22:37:24.461176 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f58e7e86-c39e-4557-b0f8-71e89c97414b-utilities" (OuterVolumeSpecName: "utilities") pod "f58e7e86-c39e-4557-b0f8-71e89c97414b" (UID: "f58e7e86-c39e-4557-b0f8-71e89c97414b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:37:24 crc kubenswrapper[4739]: I1008 22:37:24.466595 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f58e7e86-c39e-4557-b0f8-71e89c97414b-kube-api-access-j8s5n" (OuterVolumeSpecName: "kube-api-access-j8s5n") pod "f58e7e86-c39e-4557-b0f8-71e89c97414b" (UID: "f58e7e86-c39e-4557-b0f8-71e89c97414b"). InnerVolumeSpecName "kube-api-access-j8s5n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:37:24 crc kubenswrapper[4739]: I1008 22:37:24.522862 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f58e7e86-c39e-4557-b0f8-71e89c97414b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f58e7e86-c39e-4557-b0f8-71e89c97414b" (UID: "f58e7e86-c39e-4557-b0f8-71e89c97414b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:37:24 crc kubenswrapper[4739]: I1008 22:37:24.562735 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f58e7e86-c39e-4557-b0f8-71e89c97414b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:37:24 crc kubenswrapper[4739]: I1008 22:37:24.562769 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8s5n\" (UniqueName: \"kubernetes.io/projected/f58e7e86-c39e-4557-b0f8-71e89c97414b-kube-api-access-j8s5n\") on node \"crc\" DevicePath \"\"" Oct 08 22:37:24 crc kubenswrapper[4739]: I1008 22:37:24.562784 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f58e7e86-c39e-4557-b0f8-71e89c97414b-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:37:24 crc kubenswrapper[4739]: I1008 22:37:24.961632 4739 generic.go:334] "Generic (PLEG): container finished" podID="f58e7e86-c39e-4557-b0f8-71e89c97414b" containerID="4902a6f2920d305ccbbc6df969661f468dbc5c386cd77d840dcf72dea2b80575" exitCode=0 Oct 08 22:37:24 crc kubenswrapper[4739]: I1008 22:37:24.961700 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zjwv" event={"ID":"f58e7e86-c39e-4557-b0f8-71e89c97414b","Type":"ContainerDied","Data":"4902a6f2920d305ccbbc6df969661f468dbc5c386cd77d840dcf72dea2b80575"} Oct 08 22:37:24 crc kubenswrapper[4739]: I1008 22:37:24.961739 4739 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-5zjwv" event={"ID":"f58e7e86-c39e-4557-b0f8-71e89c97414b","Type":"ContainerDied","Data":"0ab930b21fbc7b6175793649a8424ba1fbfa47ea421d8fa21f7e5349a0c11792"} Oct 08 22:37:24 crc kubenswrapper[4739]: I1008 22:37:24.961771 4739 scope.go:117] "RemoveContainer" containerID="4902a6f2920d305ccbbc6df969661f468dbc5c386cd77d840dcf72dea2b80575" Oct 08 22:37:24 crc kubenswrapper[4739]: I1008 22:37:24.962003 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5zjwv" Oct 08 22:37:25 crc kubenswrapper[4739]: I1008 22:37:25.001434 4739 scope.go:117] "RemoveContainer" containerID="7235d6a9bd309b68f4fddc3a5d8c81237663a0e0482386eb18c0d876f85d1e92" Oct 08 22:37:25 crc kubenswrapper[4739]: I1008 22:37:25.045947 4739 scope.go:117] "RemoveContainer" containerID="21d761f707e4f5ed575eadd6db2c4f4e4e737189dd4fe47a0279a2030a60b273" Oct 08 22:37:25 crc kubenswrapper[4739]: I1008 22:37:25.051330 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5zjwv"] Oct 08 22:37:25 crc kubenswrapper[4739]: I1008 22:37:25.069489 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5zjwv"] Oct 08 22:37:25 crc kubenswrapper[4739]: I1008 22:37:25.119227 4739 scope.go:117] "RemoveContainer" containerID="4902a6f2920d305ccbbc6df969661f468dbc5c386cd77d840dcf72dea2b80575" Oct 08 22:37:25 crc kubenswrapper[4739]: E1008 22:37:25.120603 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4902a6f2920d305ccbbc6df969661f468dbc5c386cd77d840dcf72dea2b80575\": container with ID starting with 4902a6f2920d305ccbbc6df969661f468dbc5c386cd77d840dcf72dea2b80575 not found: ID does not exist" containerID="4902a6f2920d305ccbbc6df969661f468dbc5c386cd77d840dcf72dea2b80575" Oct 08 22:37:25 crc kubenswrapper[4739]: I1008 
22:37:25.120653 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4902a6f2920d305ccbbc6df969661f468dbc5c386cd77d840dcf72dea2b80575"} err="failed to get container status \"4902a6f2920d305ccbbc6df969661f468dbc5c386cd77d840dcf72dea2b80575\": rpc error: code = NotFound desc = could not find container \"4902a6f2920d305ccbbc6df969661f468dbc5c386cd77d840dcf72dea2b80575\": container with ID starting with 4902a6f2920d305ccbbc6df969661f468dbc5c386cd77d840dcf72dea2b80575 not found: ID does not exist" Oct 08 22:37:25 crc kubenswrapper[4739]: I1008 22:37:25.120687 4739 scope.go:117] "RemoveContainer" containerID="7235d6a9bd309b68f4fddc3a5d8c81237663a0e0482386eb18c0d876f85d1e92" Oct 08 22:37:25 crc kubenswrapper[4739]: E1008 22:37:25.121342 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7235d6a9bd309b68f4fddc3a5d8c81237663a0e0482386eb18c0d876f85d1e92\": container with ID starting with 7235d6a9bd309b68f4fddc3a5d8c81237663a0e0482386eb18c0d876f85d1e92 not found: ID does not exist" containerID="7235d6a9bd309b68f4fddc3a5d8c81237663a0e0482386eb18c0d876f85d1e92" Oct 08 22:37:25 crc kubenswrapper[4739]: I1008 22:37:25.121375 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7235d6a9bd309b68f4fddc3a5d8c81237663a0e0482386eb18c0d876f85d1e92"} err="failed to get container status \"7235d6a9bd309b68f4fddc3a5d8c81237663a0e0482386eb18c0d876f85d1e92\": rpc error: code = NotFound desc = could not find container \"7235d6a9bd309b68f4fddc3a5d8c81237663a0e0482386eb18c0d876f85d1e92\": container with ID starting with 7235d6a9bd309b68f4fddc3a5d8c81237663a0e0482386eb18c0d876f85d1e92 not found: ID does not exist" Oct 08 22:37:25 crc kubenswrapper[4739]: I1008 22:37:25.121399 4739 scope.go:117] "RemoveContainer" containerID="21d761f707e4f5ed575eadd6db2c4f4e4e737189dd4fe47a0279a2030a60b273" Oct 08 22:37:25 crc 
kubenswrapper[4739]: E1008 22:37:25.130119 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21d761f707e4f5ed575eadd6db2c4f4e4e737189dd4fe47a0279a2030a60b273\": container with ID starting with 21d761f707e4f5ed575eadd6db2c4f4e4e737189dd4fe47a0279a2030a60b273 not found: ID does not exist" containerID="21d761f707e4f5ed575eadd6db2c4f4e4e737189dd4fe47a0279a2030a60b273" Oct 08 22:37:25 crc kubenswrapper[4739]: I1008 22:37:25.130167 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21d761f707e4f5ed575eadd6db2c4f4e4e737189dd4fe47a0279a2030a60b273"} err="failed to get container status \"21d761f707e4f5ed575eadd6db2c4f4e4e737189dd4fe47a0279a2030a60b273\": rpc error: code = NotFound desc = could not find container \"21d761f707e4f5ed575eadd6db2c4f4e4e737189dd4fe47a0279a2030a60b273\": container with ID starting with 21d761f707e4f5ed575eadd6db2c4f4e4e737189dd4fe47a0279a2030a60b273 not found: ID does not exist" Oct 08 22:37:25 crc kubenswrapper[4739]: I1008 22:37:25.832805 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f58e7e86-c39e-4557-b0f8-71e89c97414b" path="/var/lib/kubelet/pods/f58e7e86-c39e-4557-b0f8-71e89c97414b/volumes" Oct 08 22:39:17 crc kubenswrapper[4739]: I1008 22:39:17.280296 4739 generic.go:334] "Generic (PLEG): container finished" podID="76b5c31e-7a34-42b9-9ad1-4f21fc560df3" containerID="2cff2229586c1f5fc2dfbf252f39e09dae770e6409214fafe1111f3dabbf507b" exitCode=0 Oct 08 22:39:17 crc kubenswrapper[4739]: I1008 22:39:17.280412 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd" event={"ID":"76b5c31e-7a34-42b9-9ad1-4f21fc560df3","Type":"ContainerDied","Data":"2cff2229586c1f5fc2dfbf252f39e09dae770e6409214fafe1111f3dabbf507b"} Oct 08 22:39:18 crc kubenswrapper[4739]: I1008 22:39:18.764745 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd" Oct 08 22:39:18 crc kubenswrapper[4739]: I1008 22:39:18.931086 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-ssh-key\") pod \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\" (UID: \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\") " Oct 08 22:39:18 crc kubenswrapper[4739]: I1008 22:39:18.931269 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-inventory\") pod \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\" (UID: \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\") " Oct 08 22:39:18 crc kubenswrapper[4739]: I1008 22:39:18.931330 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-ceilometer-compute-config-data-1\") pod \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\" (UID: \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\") " Oct 08 22:39:18 crc kubenswrapper[4739]: I1008 22:39:18.931419 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-ceilometer-compute-config-data-2\") pod \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\" (UID: \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\") " Oct 08 22:39:18 crc kubenswrapper[4739]: I1008 22:39:18.931468 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-ceilometer-compute-config-data-0\") pod \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\" (UID: \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\") " Oct 08 22:39:18 crc kubenswrapper[4739]: I1008 22:39:18.931509 4739 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8h9p\" (UniqueName: \"kubernetes.io/projected/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-kube-api-access-j8h9p\") pod \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\" (UID: \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\") " Oct 08 22:39:18 crc kubenswrapper[4739]: I1008 22:39:18.931556 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-telemetry-combined-ca-bundle\") pod \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\" (UID: \"76b5c31e-7a34-42b9-9ad1-4f21fc560df3\") " Oct 08 22:39:18 crc kubenswrapper[4739]: I1008 22:39:18.937798 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-kube-api-access-j8h9p" (OuterVolumeSpecName: "kube-api-access-j8h9p") pod "76b5c31e-7a34-42b9-9ad1-4f21fc560df3" (UID: "76b5c31e-7a34-42b9-9ad1-4f21fc560df3"). InnerVolumeSpecName "kube-api-access-j8h9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:39:18 crc kubenswrapper[4739]: I1008 22:39:18.938608 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "76b5c31e-7a34-42b9-9ad1-4f21fc560df3" (UID: "76b5c31e-7a34-42b9-9ad1-4f21fc560df3"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:39:18 crc kubenswrapper[4739]: I1008 22:39:18.962478 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "76b5c31e-7a34-42b9-9ad1-4f21fc560df3" (UID: "76b5c31e-7a34-42b9-9ad1-4f21fc560df3"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:39:18 crc kubenswrapper[4739]: I1008 22:39:18.964349 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "76b5c31e-7a34-42b9-9ad1-4f21fc560df3" (UID: "76b5c31e-7a34-42b9-9ad1-4f21fc560df3"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:39:18 crc kubenswrapper[4739]: I1008 22:39:18.967672 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "76b5c31e-7a34-42b9-9ad1-4f21fc560df3" (UID: "76b5c31e-7a34-42b9-9ad1-4f21fc560df3"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:39:18 crc kubenswrapper[4739]: I1008 22:39:18.970386 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "76b5c31e-7a34-42b9-9ad1-4f21fc560df3" (UID: "76b5c31e-7a34-42b9-9ad1-4f21fc560df3"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:39:18 crc kubenswrapper[4739]: I1008 22:39:18.980635 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-inventory" (OuterVolumeSpecName: "inventory") pod "76b5c31e-7a34-42b9-9ad1-4f21fc560df3" (UID: "76b5c31e-7a34-42b9-9ad1-4f21fc560df3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:39:19 crc kubenswrapper[4739]: I1008 22:39:19.034006 4739 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-inventory\") on node \"crc\" DevicePath \"\"" Oct 08 22:39:19 crc kubenswrapper[4739]: I1008 22:39:19.034053 4739 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 08 22:39:19 crc kubenswrapper[4739]: I1008 22:39:19.034079 4739 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 08 22:39:19 crc kubenswrapper[4739]: I1008 22:39:19.034101 4739 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 08 22:39:19 crc kubenswrapper[4739]: I1008 22:39:19.034120 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8h9p\" (UniqueName: \"kubernetes.io/projected/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-kube-api-access-j8h9p\") on node \"crc\" DevicePath \"\"" Oct 08 22:39:19 crc kubenswrapper[4739]: I1008 22:39:19.034165 4739 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:39:19 crc kubenswrapper[4739]: I1008 22:39:19.034182 4739 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/76b5c31e-7a34-42b9-9ad1-4f21fc560df3-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 22:39:19 crc kubenswrapper[4739]: I1008 22:39:19.308528 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd" event={"ID":"76b5c31e-7a34-42b9-9ad1-4f21fc560df3","Type":"ContainerDied","Data":"95f83616c4bc5281e0ce1af4fec35396a6e23788e705e76e0c348ccc554feadc"} Oct 08 22:39:19 crc kubenswrapper[4739]: I1008 22:39:19.308979 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95f83616c4bc5281e0ce1af4fec35396a6e23788e705e76e0c348ccc554feadc" Oct 08 22:39:19 crc kubenswrapper[4739]: I1008 22:39:19.308544 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-br4sd" Oct 08 22:39:21 crc kubenswrapper[4739]: I1008 22:39:21.766312 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:39:21 crc kubenswrapper[4739]: I1008 22:39:21.766768 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:39:27 crc kubenswrapper[4739]: I1008 22:39:27.416102 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 22:39:27 crc kubenswrapper[4739]: I1008 22:39:27.417339 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="61db13e3-377c-41e4-bc39-bd2314224f6e" 
containerName="kube-state-metrics" containerID="cri-o://f517a9ff2396dacbe9e5e46cf1d8c250c09bb1061ce93f82dfeaa904b0d1276c" gracePeriod=30 Oct 08 22:39:27 crc kubenswrapper[4739]: I1008 22:39:27.496076 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:39:27 crc kubenswrapper[4739]: I1008 22:39:27.496358 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7dd83bb8-a102-4ba9-825a-1cf852094ace" containerName="ceilometer-notification-agent" containerID="cri-o://c43078c9f8b852a81b9495f9f2fee7a1febe28b674d67b78c8cb3190688e4a39" gracePeriod=30 Oct 08 22:39:27 crc kubenswrapper[4739]: I1008 22:39:27.496747 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7dd83bb8-a102-4ba9-825a-1cf852094ace" containerName="ceilometer-central-agent" containerID="cri-o://7ae8c4816ada1777da5bec945093b74d43fd7ca99457f8a563e6275c787702ee" gracePeriod=30 Oct 08 22:39:27 crc kubenswrapper[4739]: I1008 22:39:27.496792 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7dd83bb8-a102-4ba9-825a-1cf852094ace" containerName="proxy-httpd" containerID="cri-o://2d90a134032df8e6340556ed06241c73413b7ee1bb96aa326f5e479870dd118b" gracePeriod=30 Oct 08 22:39:27 crc kubenswrapper[4739]: I1008 22:39:27.496823 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7dd83bb8-a102-4ba9-825a-1cf852094ace" containerName="sg-core" containerID="cri-o://759c582505d64a41c0e926dcd73ddd37b0e8db7fd0145cf7f29ffa3d586860b3" gracePeriod=30 Oct 08 22:39:27 crc kubenswrapper[4739]: I1008 22:39:27.992335 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 22:39:28 crc kubenswrapper[4739]: I1008 22:39:28.043674 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/61db13e3-377c-41e4-bc39-bd2314224f6e-kube-state-metrics-tls-config\") pod \"61db13e3-377c-41e4-bc39-bd2314224f6e\" (UID: \"61db13e3-377c-41e4-bc39-bd2314224f6e\") " Oct 08 22:39:28 crc kubenswrapper[4739]: I1008 22:39:28.043950 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd2p2\" (UniqueName: \"kubernetes.io/projected/61db13e3-377c-41e4-bc39-bd2314224f6e-kube-api-access-zd2p2\") pod \"61db13e3-377c-41e4-bc39-bd2314224f6e\" (UID: \"61db13e3-377c-41e4-bc39-bd2314224f6e\") " Oct 08 22:39:28 crc kubenswrapper[4739]: I1008 22:39:28.044065 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/61db13e3-377c-41e4-bc39-bd2314224f6e-kube-state-metrics-tls-certs\") pod \"61db13e3-377c-41e4-bc39-bd2314224f6e\" (UID: \"61db13e3-377c-41e4-bc39-bd2314224f6e\") " Oct 08 22:39:28 crc kubenswrapper[4739]: I1008 22:39:28.044105 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61db13e3-377c-41e4-bc39-bd2314224f6e-combined-ca-bundle\") pod \"61db13e3-377c-41e4-bc39-bd2314224f6e\" (UID: \"61db13e3-377c-41e4-bc39-bd2314224f6e\") " Oct 08 22:39:28 crc kubenswrapper[4739]: I1008 22:39:28.056582 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61db13e3-377c-41e4-bc39-bd2314224f6e-kube-api-access-zd2p2" (OuterVolumeSpecName: "kube-api-access-zd2p2") pod "61db13e3-377c-41e4-bc39-bd2314224f6e" (UID: "61db13e3-377c-41e4-bc39-bd2314224f6e"). InnerVolumeSpecName "kube-api-access-zd2p2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:39:28 crc kubenswrapper[4739]: I1008 22:39:28.097360 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61db13e3-377c-41e4-bc39-bd2314224f6e-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "61db13e3-377c-41e4-bc39-bd2314224f6e" (UID: "61db13e3-377c-41e4-bc39-bd2314224f6e"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:39:28 crc kubenswrapper[4739]: I1008 22:39:28.118344 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61db13e3-377c-41e4-bc39-bd2314224f6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61db13e3-377c-41e4-bc39-bd2314224f6e" (UID: "61db13e3-377c-41e4-bc39-bd2314224f6e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:39:28 crc kubenswrapper[4739]: I1008 22:39:28.129977 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61db13e3-377c-41e4-bc39-bd2314224f6e-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "61db13e3-377c-41e4-bc39-bd2314224f6e" (UID: "61db13e3-377c-41e4-bc39-bd2314224f6e"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:39:28 crc kubenswrapper[4739]: I1008 22:39:28.146876 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd2p2\" (UniqueName: \"kubernetes.io/projected/61db13e3-377c-41e4-bc39-bd2314224f6e-kube-api-access-zd2p2\") on node \"crc\" DevicePath \"\"" Oct 08 22:39:28 crc kubenswrapper[4739]: I1008 22:39:28.146920 4739 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/61db13e3-377c-41e4-bc39-bd2314224f6e-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:39:28 crc kubenswrapper[4739]: I1008 22:39:28.146935 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61db13e3-377c-41e4-bc39-bd2314224f6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:39:28 crc kubenswrapper[4739]: I1008 22:39:28.146946 4739 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/61db13e3-377c-41e4-bc39-bd2314224f6e-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:39:28 crc kubenswrapper[4739]: I1008 22:39:28.430376 4739 generic.go:334] "Generic (PLEG): container finished" podID="61db13e3-377c-41e4-bc39-bd2314224f6e" containerID="f517a9ff2396dacbe9e5e46cf1d8c250c09bb1061ce93f82dfeaa904b0d1276c" exitCode=2 Oct 08 22:39:28 crc kubenswrapper[4739]: I1008 22:39:28.430481 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"61db13e3-377c-41e4-bc39-bd2314224f6e","Type":"ContainerDied","Data":"f517a9ff2396dacbe9e5e46cf1d8c250c09bb1061ce93f82dfeaa904b0d1276c"} Oct 08 22:39:28 crc kubenswrapper[4739]: I1008 22:39:28.430510 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"61db13e3-377c-41e4-bc39-bd2314224f6e","Type":"ContainerDied","Data":"7794d91ad353a46a7851ae634f1ba79730d64fced2b03ad256911bba7563a815"} Oct 08 22:39:28 crc kubenswrapper[4739]: I1008 22:39:28.430527 4739 scope.go:117] "RemoveContainer" containerID="f517a9ff2396dacbe9e5e46cf1d8c250c09bb1061ce93f82dfeaa904b0d1276c" Oct 08 22:39:28 crc kubenswrapper[4739]: I1008 22:39:28.430652 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 22:39:28 crc kubenswrapper[4739]: I1008 22:39:28.445634 4739 generic.go:334] "Generic (PLEG): container finished" podID="7dd83bb8-a102-4ba9-825a-1cf852094ace" containerID="7ae8c4816ada1777da5bec945093b74d43fd7ca99457f8a563e6275c787702ee" exitCode=0 Oct 08 22:39:28 crc kubenswrapper[4739]: I1008 22:39:28.445681 4739 generic.go:334] "Generic (PLEG): container finished" podID="7dd83bb8-a102-4ba9-825a-1cf852094ace" containerID="2d90a134032df8e6340556ed06241c73413b7ee1bb96aa326f5e479870dd118b" exitCode=0 Oct 08 22:39:28 crc kubenswrapper[4739]: I1008 22:39:28.445689 4739 generic.go:334] "Generic (PLEG): container finished" podID="7dd83bb8-a102-4ba9-825a-1cf852094ace" containerID="759c582505d64a41c0e926dcd73ddd37b0e8db7fd0145cf7f29ffa3d586860b3" exitCode=2 Oct 08 22:39:28 crc kubenswrapper[4739]: I1008 22:39:28.445710 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7dd83bb8-a102-4ba9-825a-1cf852094ace","Type":"ContainerDied","Data":"7ae8c4816ada1777da5bec945093b74d43fd7ca99457f8a563e6275c787702ee"} Oct 08 22:39:28 crc kubenswrapper[4739]: I1008 22:39:28.445742 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7dd83bb8-a102-4ba9-825a-1cf852094ace","Type":"ContainerDied","Data":"2d90a134032df8e6340556ed06241c73413b7ee1bb96aa326f5e479870dd118b"} Oct 08 22:39:28 crc kubenswrapper[4739]: I1008 22:39:28.445756 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"7dd83bb8-a102-4ba9-825a-1cf852094ace","Type":"ContainerDied","Data":"759c582505d64a41c0e926dcd73ddd37b0e8db7fd0145cf7f29ffa3d586860b3"} Oct 08 22:39:28 crc kubenswrapper[4739]: I1008 22:39:28.464691 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 22:39:28 crc kubenswrapper[4739]: I1008 22:39:28.467040 4739 scope.go:117] "RemoveContainer" containerID="f517a9ff2396dacbe9e5e46cf1d8c250c09bb1061ce93f82dfeaa904b0d1276c" Oct 08 22:39:28 crc kubenswrapper[4739]: E1008 22:39:28.467691 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f517a9ff2396dacbe9e5e46cf1d8c250c09bb1061ce93f82dfeaa904b0d1276c\": container with ID starting with f517a9ff2396dacbe9e5e46cf1d8c250c09bb1061ce93f82dfeaa904b0d1276c not found: ID does not exist" containerID="f517a9ff2396dacbe9e5e46cf1d8c250c09bb1061ce93f82dfeaa904b0d1276c" Oct 08 22:39:28 crc kubenswrapper[4739]: I1008 22:39:28.467724 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f517a9ff2396dacbe9e5e46cf1d8c250c09bb1061ce93f82dfeaa904b0d1276c"} err="failed to get container status \"f517a9ff2396dacbe9e5e46cf1d8c250c09bb1061ce93f82dfeaa904b0d1276c\": rpc error: code = NotFound desc = could not find container \"f517a9ff2396dacbe9e5e46cf1d8c250c09bb1061ce93f82dfeaa904b0d1276c\": container with ID starting with f517a9ff2396dacbe9e5e46cf1d8c250c09bb1061ce93f82dfeaa904b0d1276c not found: ID does not exist" Oct 08 22:39:28 crc kubenswrapper[4739]: I1008 22:39:28.472591 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.068087 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x"] Oct 08 22:39:29 crc kubenswrapper[4739]: E1008 22:39:29.068776 
4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f58e7e86-c39e-4557-b0f8-71e89c97414b" containerName="extract-utilities" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.068794 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="f58e7e86-c39e-4557-b0f8-71e89c97414b" containerName="extract-utilities" Oct 08 22:39:29 crc kubenswrapper[4739]: E1008 22:39:29.068825 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61db13e3-377c-41e4-bc39-bd2314224f6e" containerName="kube-state-metrics" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.068832 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="61db13e3-377c-41e4-bc39-bd2314224f6e" containerName="kube-state-metrics" Oct 08 22:39:29 crc kubenswrapper[4739]: E1008 22:39:29.068846 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76b5c31e-7a34-42b9-9ad1-4f21fc560df3" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.068856 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="76b5c31e-7a34-42b9-9ad1-4f21fc560df3" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 08 22:39:29 crc kubenswrapper[4739]: E1008 22:39:29.068867 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f58e7e86-c39e-4557-b0f8-71e89c97414b" containerName="registry-server" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.068872 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="f58e7e86-c39e-4557-b0f8-71e89c97414b" containerName="registry-server" Oct 08 22:39:29 crc kubenswrapper[4739]: E1008 22:39:29.068885 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f58e7e86-c39e-4557-b0f8-71e89c97414b" containerName="extract-content" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.068892 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="f58e7e86-c39e-4557-b0f8-71e89c97414b" containerName="extract-content" Oct 08 
22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.069074 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="61db13e3-377c-41e4-bc39-bd2314224f6e" containerName="kube-state-metrics" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.069095 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="f58e7e86-c39e-4557-b0f8-71e89c97414b" containerName="registry-server" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.069115 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="76b5c31e-7a34-42b9-9ad1-4f21fc560df3" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.070495 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.075399 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.097525 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x"] Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.178367 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/722cc9ee-6835-4563-a73d-9312179a7901-util\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x\" (UID: \"722cc9ee-6835-4563-a73d-9312179a7901\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.178493 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxl6s\" (UniqueName: 
\"kubernetes.io/projected/722cc9ee-6835-4563-a73d-9312179a7901-kube-api-access-zxl6s\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x\" (UID: \"722cc9ee-6835-4563-a73d-9312179a7901\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.178551 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/722cc9ee-6835-4563-a73d-9312179a7901-bundle\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x\" (UID: \"722cc9ee-6835-4563-a73d-9312179a7901\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.216255 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.217641 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.220679 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.220851 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.228605 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.280122 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/722cc9ee-6835-4563-a73d-9312179a7901-util\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x\" (UID: \"722cc9ee-6835-4563-a73d-9312179a7901\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.280208 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdvpb\" (UniqueName: \"kubernetes.io/projected/4906d8eb-3c30-4ed9-b199-79fc2a0995ae-kube-api-access-pdvpb\") pod \"minio\" (UID: \"4906d8eb-3c30-4ed9-b199-79fc2a0995ae\") " pod="minio-dev/minio" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.280259 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-86422d18-ba92-4df9-8aa4-be9cc5351985\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-86422d18-ba92-4df9-8aa4-be9cc5351985\") pod \"minio\" (UID: \"4906d8eb-3c30-4ed9-b199-79fc2a0995ae\") " pod="minio-dev/minio" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.280355 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxl6s\" (UniqueName: 
\"kubernetes.io/projected/722cc9ee-6835-4563-a73d-9312179a7901-kube-api-access-zxl6s\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x\" (UID: \"722cc9ee-6835-4563-a73d-9312179a7901\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.280382 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/722cc9ee-6835-4563-a73d-9312179a7901-bundle\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x\" (UID: \"722cc9ee-6835-4563-a73d-9312179a7901\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.281467 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/722cc9ee-6835-4563-a73d-9312179a7901-util\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x\" (UID: \"722cc9ee-6835-4563-a73d-9312179a7901\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.281535 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/722cc9ee-6835-4563-a73d-9312179a7901-bundle\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x\" (UID: \"722cc9ee-6835-4563-a73d-9312179a7901\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.300727 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxl6s\" (UniqueName: \"kubernetes.io/projected/722cc9ee-6835-4563-a73d-9312179a7901-kube-api-access-zxl6s\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x\" (UID: 
\"722cc9ee-6835-4563-a73d-9312179a7901\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.382181 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-86422d18-ba92-4df9-8aa4-be9cc5351985\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-86422d18-ba92-4df9-8aa4-be9cc5351985\") pod \"minio\" (UID: \"4906d8eb-3c30-4ed9-b199-79fc2a0995ae\") " pod="minio-dev/minio" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.382434 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdvpb\" (UniqueName: \"kubernetes.io/projected/4906d8eb-3c30-4ed9-b199-79fc2a0995ae-kube-api-access-pdvpb\") pod \"minio\" (UID: \"4906d8eb-3c30-4ed9-b199-79fc2a0995ae\") " pod="minio-dev/minio" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.385440 4739 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.385488 4739 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-86422d18-ba92-4df9-8aa4-be9cc5351985\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-86422d18-ba92-4df9-8aa4-be9cc5351985\") pod \"minio\" (UID: \"4906d8eb-3c30-4ed9-b199-79fc2a0995ae\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/14fb286fb44bbe25199af1e192439683e7f5ea8c9a7d800d64d05ddbfbb34b13/globalmount\"" pod="minio-dev/minio" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.392888 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.404377 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdvpb\" (UniqueName: \"kubernetes.io/projected/4906d8eb-3c30-4ed9-b199-79fc2a0995ae-kube-api-access-pdvpb\") pod \"minio\" (UID: \"4906d8eb-3c30-4ed9-b199-79fc2a0995ae\") " pod="minio-dev/minio" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.432715 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-86422d18-ba92-4df9-8aa4-be9cc5351985\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-86422d18-ba92-4df9-8aa4-be9cc5351985\") pod \"minio\" (UID: \"4906d8eb-3c30-4ed9-b199-79fc2a0995ae\") " pod="minio-dev/minio" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.535916 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.832868 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61db13e3-377c-41e4-bc39-bd2314224f6e" path="/var/lib/kubelet/pods/61db13e3-377c-41e4-bc39-bd2314224f6e/volumes" Oct 08 22:39:29 crc kubenswrapper[4739]: I1008 22:39:29.861946 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x"] Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.019021 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.140911 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.203378 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dd83bb8-a102-4ba9-825a-1cf852094ace-scripts\") pod \"7dd83bb8-a102-4ba9-825a-1cf852094ace\" (UID: \"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.203520 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rfk7\" (UniqueName: \"kubernetes.io/projected/7dd83bb8-a102-4ba9-825a-1cf852094ace-kube-api-access-9rfk7\") pod \"7dd83bb8-a102-4ba9-825a-1cf852094ace\" (UID: \"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.203621 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd83bb8-a102-4ba9-825a-1cf852094ace-config-data\") pod \"7dd83bb8-a102-4ba9-825a-1cf852094ace\" (UID: \"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.203667 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7dd83bb8-a102-4ba9-825a-1cf852094ace-sg-core-conf-yaml\") pod \"7dd83bb8-a102-4ba9-825a-1cf852094ace\" (UID: \"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.203800 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dd83bb8-a102-4ba9-825a-1cf852094ace-run-httpd\") pod \"7dd83bb8-a102-4ba9-825a-1cf852094ace\" (UID: \"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.203852 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7dd83bb8-a102-4ba9-825a-1cf852094ace-log-httpd\") pod \"7dd83bb8-a102-4ba9-825a-1cf852094ace\" (UID: \"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.203892 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd83bb8-a102-4ba9-825a-1cf852094ace-combined-ca-bundle\") pod \"7dd83bb8-a102-4ba9-825a-1cf852094ace\" (UID: \"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.203980 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dd83bb8-a102-4ba9-825a-1cf852094ace-ceilometer-tls-certs\") pod \"7dd83bb8-a102-4ba9-825a-1cf852094ace\" (UID: \"7dd83bb8-a102-4ba9-825a-1cf852094ace\") " Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.208711 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dd83bb8-a102-4ba9-825a-1cf852094ace-scripts" (OuterVolumeSpecName: "scripts") pod "7dd83bb8-a102-4ba9-825a-1cf852094ace" (UID: "7dd83bb8-a102-4ba9-825a-1cf852094ace"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.209420 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dd83bb8-a102-4ba9-825a-1cf852094ace-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7dd83bb8-a102-4ba9-825a-1cf852094ace" (UID: "7dd83bb8-a102-4ba9-825a-1cf852094ace"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.209602 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dd83bb8-a102-4ba9-825a-1cf852094ace-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7dd83bb8-a102-4ba9-825a-1cf852094ace" (UID: "7dd83bb8-a102-4ba9-825a-1cf852094ace"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.215223 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dd83bb8-a102-4ba9-825a-1cf852094ace-kube-api-access-9rfk7" (OuterVolumeSpecName: "kube-api-access-9rfk7") pod "7dd83bb8-a102-4ba9-825a-1cf852094ace" (UID: "7dd83bb8-a102-4ba9-825a-1cf852094ace"). InnerVolumeSpecName "kube-api-access-9rfk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.233286 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dd83bb8-a102-4ba9-825a-1cf852094ace-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7dd83bb8-a102-4ba9-825a-1cf852094ace" (UID: "7dd83bb8-a102-4ba9-825a-1cf852094ace"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.257764 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dd83bb8-a102-4ba9-825a-1cf852094ace-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7dd83bb8-a102-4ba9-825a-1cf852094ace" (UID: "7dd83bb8-a102-4ba9-825a-1cf852094ace"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.290585 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dd83bb8-a102-4ba9-825a-1cf852094ace-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7dd83bb8-a102-4ba9-825a-1cf852094ace" (UID: "7dd83bb8-a102-4ba9-825a-1cf852094ace"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.305827 4739 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dd83bb8-a102-4ba9-825a-1cf852094ace-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.305866 4739 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dd83bb8-a102-4ba9-825a-1cf852094ace-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.305878 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd83bb8-a102-4ba9-825a-1cf852094ace-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.305887 4739 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dd83bb8-a102-4ba9-825a-1cf852094ace-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.305897 4739 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dd83bb8-a102-4ba9-825a-1cf852094ace-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.305907 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rfk7\" (UniqueName: 
\"kubernetes.io/projected/7dd83bb8-a102-4ba9-825a-1cf852094ace-kube-api-access-9rfk7\") on node \"crc\" DevicePath \"\"" Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.305917 4739 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7dd83bb8-a102-4ba9-825a-1cf852094ace-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.327746 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dd83bb8-a102-4ba9-825a-1cf852094ace-config-data" (OuterVolumeSpecName: "config-data") pod "7dd83bb8-a102-4ba9-825a-1cf852094ace" (UID: "7dd83bb8-a102-4ba9-825a-1cf852094ace"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.407549 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd83bb8-a102-4ba9-825a-1cf852094ace-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.484152 4739 generic.go:334] "Generic (PLEG): container finished" podID="7dd83bb8-a102-4ba9-825a-1cf852094ace" containerID="c43078c9f8b852a81b9495f9f2fee7a1febe28b674d67b78c8cb3190688e4a39" exitCode=0 Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.484245 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7dd83bb8-a102-4ba9-825a-1cf852094ace","Type":"ContainerDied","Data":"c43078c9f8b852a81b9495f9f2fee7a1febe28b674d67b78c8cb3190688e4a39"} Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.484280 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7dd83bb8-a102-4ba9-825a-1cf852094ace","Type":"ContainerDied","Data":"ba145cc426698f3cad2b4a6649887828549d10d6db001bb78465b4c1092870fe"} Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.484298 4739 
scope.go:117] "RemoveContainer" containerID="7ae8c4816ada1777da5bec945093b74d43fd7ca99457f8a563e6275c787702ee" Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.484406 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.487481 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"4906d8eb-3c30-4ed9-b199-79fc2a0995ae","Type":"ContainerStarted","Data":"3037e8e8437ab93c95b825f655c98412c969936aea174160f6659ef8584698c3"} Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.490261 4739 generic.go:334] "Generic (PLEG): container finished" podID="722cc9ee-6835-4563-a73d-9312179a7901" containerID="8b1825c05e39d03b8f7e3de6ec4019743329c556f7ec50e07a8262031ed37d85" exitCode=0 Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.490306 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x" event={"ID":"722cc9ee-6835-4563-a73d-9312179a7901","Type":"ContainerDied","Data":"8b1825c05e39d03b8f7e3de6ec4019743329c556f7ec50e07a8262031ed37d85"} Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.490350 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x" event={"ID":"722cc9ee-6835-4563-a73d-9312179a7901","Type":"ContainerStarted","Data":"64b9b1678fc5d297fb3f77ce45d7918121c00fc342659da34c7ee45221904e1e"} Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.542478 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.543930 4739 scope.go:117] "RemoveContainer" containerID="2d90a134032df8e6340556ed06241c73413b7ee1bb96aa326f5e479870dd118b" Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.550724 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ceilometer-0"] Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.567581 4739 scope.go:117] "RemoveContainer" containerID="759c582505d64a41c0e926dcd73ddd37b0e8db7fd0145cf7f29ffa3d586860b3" Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.685515 4739 scope.go:117] "RemoveContainer" containerID="c43078c9f8b852a81b9495f9f2fee7a1febe28b674d67b78c8cb3190688e4a39" Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.710060 4739 scope.go:117] "RemoveContainer" containerID="7ae8c4816ada1777da5bec945093b74d43fd7ca99457f8a563e6275c787702ee" Oct 08 22:39:30 crc kubenswrapper[4739]: E1008 22:39:30.711101 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ae8c4816ada1777da5bec945093b74d43fd7ca99457f8a563e6275c787702ee\": container with ID starting with 7ae8c4816ada1777da5bec945093b74d43fd7ca99457f8a563e6275c787702ee not found: ID does not exist" containerID="7ae8c4816ada1777da5bec945093b74d43fd7ca99457f8a563e6275c787702ee" Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.711141 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ae8c4816ada1777da5bec945093b74d43fd7ca99457f8a563e6275c787702ee"} err="failed to get container status \"7ae8c4816ada1777da5bec945093b74d43fd7ca99457f8a563e6275c787702ee\": rpc error: code = NotFound desc = could not find container \"7ae8c4816ada1777da5bec945093b74d43fd7ca99457f8a563e6275c787702ee\": container with ID starting with 7ae8c4816ada1777da5bec945093b74d43fd7ca99457f8a563e6275c787702ee not found: ID does not exist" Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.711191 4739 scope.go:117] "RemoveContainer" containerID="2d90a134032df8e6340556ed06241c73413b7ee1bb96aa326f5e479870dd118b" Oct 08 22:39:30 crc kubenswrapper[4739]: E1008 22:39:30.711820 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2d90a134032df8e6340556ed06241c73413b7ee1bb96aa326f5e479870dd118b\": container with ID starting with 2d90a134032df8e6340556ed06241c73413b7ee1bb96aa326f5e479870dd118b not found: ID does not exist" containerID="2d90a134032df8e6340556ed06241c73413b7ee1bb96aa326f5e479870dd118b" Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.711862 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d90a134032df8e6340556ed06241c73413b7ee1bb96aa326f5e479870dd118b"} err="failed to get container status \"2d90a134032df8e6340556ed06241c73413b7ee1bb96aa326f5e479870dd118b\": rpc error: code = NotFound desc = could not find container \"2d90a134032df8e6340556ed06241c73413b7ee1bb96aa326f5e479870dd118b\": container with ID starting with 2d90a134032df8e6340556ed06241c73413b7ee1bb96aa326f5e479870dd118b not found: ID does not exist" Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.711892 4739 scope.go:117] "RemoveContainer" containerID="759c582505d64a41c0e926dcd73ddd37b0e8db7fd0145cf7f29ffa3d586860b3" Oct 08 22:39:30 crc kubenswrapper[4739]: E1008 22:39:30.713114 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"759c582505d64a41c0e926dcd73ddd37b0e8db7fd0145cf7f29ffa3d586860b3\": container with ID starting with 759c582505d64a41c0e926dcd73ddd37b0e8db7fd0145cf7f29ffa3d586860b3 not found: ID does not exist" containerID="759c582505d64a41c0e926dcd73ddd37b0e8db7fd0145cf7f29ffa3d586860b3" Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.713208 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"759c582505d64a41c0e926dcd73ddd37b0e8db7fd0145cf7f29ffa3d586860b3"} err="failed to get container status \"759c582505d64a41c0e926dcd73ddd37b0e8db7fd0145cf7f29ffa3d586860b3\": rpc error: code = NotFound desc = could not find container \"759c582505d64a41c0e926dcd73ddd37b0e8db7fd0145cf7f29ffa3d586860b3\": container with ID 
starting with 759c582505d64a41c0e926dcd73ddd37b0e8db7fd0145cf7f29ffa3d586860b3 not found: ID does not exist" Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.713261 4739 scope.go:117] "RemoveContainer" containerID="c43078c9f8b852a81b9495f9f2fee7a1febe28b674d67b78c8cb3190688e4a39" Oct 08 22:39:30 crc kubenswrapper[4739]: E1008 22:39:30.714734 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c43078c9f8b852a81b9495f9f2fee7a1febe28b674d67b78c8cb3190688e4a39\": container with ID starting with c43078c9f8b852a81b9495f9f2fee7a1febe28b674d67b78c8cb3190688e4a39 not found: ID does not exist" containerID="c43078c9f8b852a81b9495f9f2fee7a1febe28b674d67b78c8cb3190688e4a39" Oct 08 22:39:30 crc kubenswrapper[4739]: I1008 22:39:30.714793 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c43078c9f8b852a81b9495f9f2fee7a1febe28b674d67b78c8cb3190688e4a39"} err="failed to get container status \"c43078c9f8b852a81b9495f9f2fee7a1febe28b674d67b78c8cb3190688e4a39\": rpc error: code = NotFound desc = could not find container \"c43078c9f8b852a81b9495f9f2fee7a1febe28b674d67b78c8cb3190688e4a39\": container with ID starting with c43078c9f8b852a81b9495f9f2fee7a1febe28b674d67b78c8cb3190688e4a39 not found: ID does not exist" Oct 08 22:39:31 crc kubenswrapper[4739]: I1008 22:39:31.838135 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dd83bb8-a102-4ba9-825a-1cf852094ace" path="/var/lib/kubelet/pods/7dd83bb8-a102-4ba9-825a-1cf852094ace/volumes" Oct 08 22:39:34 crc kubenswrapper[4739]: I1008 22:39:34.557296 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"4906d8eb-3c30-4ed9-b199-79fc2a0995ae","Type":"ContainerStarted","Data":"4ea0c940963b27e05d38e3ad5d48f74e368a23c114138d768e0baa050c1d0809"} Oct 08 22:39:34 crc kubenswrapper[4739]: I1008 22:39:34.560014 4739 generic.go:334] "Generic (PLEG): container 
finished" podID="722cc9ee-6835-4563-a73d-9312179a7901" containerID="9ae17d3484e5295e20fde200c96cf92b1f9308f78fbd1a2d09861350d703498d" exitCode=0 Oct 08 22:39:34 crc kubenswrapper[4739]: I1008 22:39:34.560074 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x" event={"ID":"722cc9ee-6835-4563-a73d-9312179a7901","Type":"ContainerDied","Data":"9ae17d3484e5295e20fde200c96cf92b1f9308f78fbd1a2d09861350d703498d"} Oct 08 22:39:34 crc kubenswrapper[4739]: I1008 22:39:34.581660 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=3.832744836 podStartE2EDuration="7.581639407s" podCreationTimestamp="2025-10-08 22:39:27 +0000 UTC" firstStartedPulling="2025-10-08 22:39:30.023402114 +0000 UTC m=+3069.848787864" lastFinishedPulling="2025-10-08 22:39:33.772296685 +0000 UTC m=+3073.597682435" observedRunningTime="2025-10-08 22:39:34.574511382 +0000 UTC m=+3074.399897132" watchObservedRunningTime="2025-10-08 22:39:34.581639407 +0000 UTC m=+3074.407025157" Oct 08 22:39:35 crc kubenswrapper[4739]: I1008 22:39:35.575843 4739 generic.go:334] "Generic (PLEG): container finished" podID="722cc9ee-6835-4563-a73d-9312179a7901" containerID="f1e50f4fe083c5aebdedfcc08e4b1acf4a82c35ac525f4fcf1e2894f7ab562b7" exitCode=0 Oct 08 22:39:35 crc kubenswrapper[4739]: I1008 22:39:35.575927 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x" event={"ID":"722cc9ee-6835-4563-a73d-9312179a7901","Type":"ContainerDied","Data":"f1e50f4fe083c5aebdedfcc08e4b1acf4a82c35ac525f4fcf1e2894f7ab562b7"} Oct 08 22:39:36 crc kubenswrapper[4739]: I1008 22:39:36.925286 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x" Oct 08 22:39:36 crc kubenswrapper[4739]: I1008 22:39:36.935519 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/722cc9ee-6835-4563-a73d-9312179a7901-bundle\") pod \"722cc9ee-6835-4563-a73d-9312179a7901\" (UID: \"722cc9ee-6835-4563-a73d-9312179a7901\") " Oct 08 22:39:36 crc kubenswrapper[4739]: I1008 22:39:36.935760 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/722cc9ee-6835-4563-a73d-9312179a7901-util\") pod \"722cc9ee-6835-4563-a73d-9312179a7901\" (UID: \"722cc9ee-6835-4563-a73d-9312179a7901\") " Oct 08 22:39:36 crc kubenswrapper[4739]: I1008 22:39:36.935809 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxl6s\" (UniqueName: \"kubernetes.io/projected/722cc9ee-6835-4563-a73d-9312179a7901-kube-api-access-zxl6s\") pod \"722cc9ee-6835-4563-a73d-9312179a7901\" (UID: \"722cc9ee-6835-4563-a73d-9312179a7901\") " Oct 08 22:39:36 crc kubenswrapper[4739]: I1008 22:39:36.937100 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/722cc9ee-6835-4563-a73d-9312179a7901-bundle" (OuterVolumeSpecName: "bundle") pod "722cc9ee-6835-4563-a73d-9312179a7901" (UID: "722cc9ee-6835-4563-a73d-9312179a7901"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:39:36 crc kubenswrapper[4739]: I1008 22:39:36.959453 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/722cc9ee-6835-4563-a73d-9312179a7901-kube-api-access-zxl6s" (OuterVolumeSpecName: "kube-api-access-zxl6s") pod "722cc9ee-6835-4563-a73d-9312179a7901" (UID: "722cc9ee-6835-4563-a73d-9312179a7901"). InnerVolumeSpecName "kube-api-access-zxl6s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:39:36 crc kubenswrapper[4739]: I1008 22:39:36.959639 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/722cc9ee-6835-4563-a73d-9312179a7901-util" (OuterVolumeSpecName: "util") pod "722cc9ee-6835-4563-a73d-9312179a7901" (UID: "722cc9ee-6835-4563-a73d-9312179a7901"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:39:37 crc kubenswrapper[4739]: I1008 22:39:37.038777 4739 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/722cc9ee-6835-4563-a73d-9312179a7901-util\") on node \"crc\" DevicePath \"\"" Oct 08 22:39:37 crc kubenswrapper[4739]: I1008 22:39:37.038830 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxl6s\" (UniqueName: \"kubernetes.io/projected/722cc9ee-6835-4563-a73d-9312179a7901-kube-api-access-zxl6s\") on node \"crc\" DevicePath \"\"" Oct 08 22:39:37 crc kubenswrapper[4739]: I1008 22:39:37.038852 4739 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/722cc9ee-6835-4563-a73d-9312179a7901-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:39:37 crc kubenswrapper[4739]: I1008 22:39:37.601930 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x" event={"ID":"722cc9ee-6835-4563-a73d-9312179a7901","Type":"ContainerDied","Data":"64b9b1678fc5d297fb3f77ce45d7918121c00fc342659da34c7ee45221904e1e"} Oct 08 22:39:37 crc kubenswrapper[4739]: I1008 22:39:37.602262 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64b9b1678fc5d297fb3f77ce45d7918121c00fc342659da34c7ee45221904e1e" Oct 08 22:39:37 crc kubenswrapper[4739]: I1008 22:39:37.602010 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x" Oct 08 22:39:50 crc kubenswrapper[4739]: I1008 22:39:50.717440 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 08 22:39:50 crc kubenswrapper[4739]: I1008 22:39:50.718302 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5" containerName="openstackclient" containerID="cri-o://71929e391e11a4120e9c99e893d34aa741fff3481c7a506856ced6d75d507481" gracePeriod=2 Oct 08 22:39:50 crc kubenswrapper[4739]: I1008 22:39:50.732218 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 08 22:39:50 crc kubenswrapper[4739]: I1008 22:39:50.812833 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 08 22:39:50 crc kubenswrapper[4739]: E1008 22:39:50.818091 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dd83bb8-a102-4ba9-825a-1cf852094ace" containerName="proxy-httpd" Oct 08 22:39:50 crc kubenswrapper[4739]: I1008 22:39:50.818122 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd83bb8-a102-4ba9-825a-1cf852094ace" containerName="proxy-httpd" Oct 08 22:39:50 crc kubenswrapper[4739]: E1008 22:39:50.818139 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="722cc9ee-6835-4563-a73d-9312179a7901" containerName="extract" Oct 08 22:39:50 crc kubenswrapper[4739]: I1008 22:39:50.818159 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="722cc9ee-6835-4563-a73d-9312179a7901" containerName="extract" Oct 08 22:39:50 crc kubenswrapper[4739]: E1008 22:39:50.818185 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dd83bb8-a102-4ba9-825a-1cf852094ace" containerName="sg-core" Oct 08 22:39:50 crc kubenswrapper[4739]: I1008 22:39:50.818191 4739 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7dd83bb8-a102-4ba9-825a-1cf852094ace" containerName="sg-core" Oct 08 22:39:50 crc kubenswrapper[4739]: E1008 22:39:50.818203 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dd83bb8-a102-4ba9-825a-1cf852094ace" containerName="ceilometer-notification-agent" Oct 08 22:39:50 crc kubenswrapper[4739]: I1008 22:39:50.818210 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd83bb8-a102-4ba9-825a-1cf852094ace" containerName="ceilometer-notification-agent" Oct 08 22:39:50 crc kubenswrapper[4739]: E1008 22:39:50.818231 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="722cc9ee-6835-4563-a73d-9312179a7901" containerName="pull" Oct 08 22:39:50 crc kubenswrapper[4739]: I1008 22:39:50.818237 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="722cc9ee-6835-4563-a73d-9312179a7901" containerName="pull" Oct 08 22:39:50 crc kubenswrapper[4739]: E1008 22:39:50.818248 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5" containerName="openstackclient" Oct 08 22:39:50 crc kubenswrapper[4739]: I1008 22:39:50.818254 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5" containerName="openstackclient" Oct 08 22:39:50 crc kubenswrapper[4739]: E1008 22:39:50.818266 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="722cc9ee-6835-4563-a73d-9312179a7901" containerName="util" Oct 08 22:39:50 crc kubenswrapper[4739]: I1008 22:39:50.818274 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="722cc9ee-6835-4563-a73d-9312179a7901" containerName="util" Oct 08 22:39:50 crc kubenswrapper[4739]: E1008 22:39:50.818286 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dd83bb8-a102-4ba9-825a-1cf852094ace" containerName="ceilometer-central-agent" Oct 08 22:39:50 crc kubenswrapper[4739]: I1008 22:39:50.818291 4739 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7dd83bb8-a102-4ba9-825a-1cf852094ace" containerName="ceilometer-central-agent" Oct 08 22:39:50 crc kubenswrapper[4739]: I1008 22:39:50.818530 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5" containerName="openstackclient" Oct 08 22:39:50 crc kubenswrapper[4739]: I1008 22:39:50.818544 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dd83bb8-a102-4ba9-825a-1cf852094ace" containerName="ceilometer-central-agent" Oct 08 22:39:50 crc kubenswrapper[4739]: I1008 22:39:50.819451 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="722cc9ee-6835-4563-a73d-9312179a7901" containerName="extract" Oct 08 22:39:50 crc kubenswrapper[4739]: I1008 22:39:50.819468 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dd83bb8-a102-4ba9-825a-1cf852094ace" containerName="sg-core" Oct 08 22:39:50 crc kubenswrapper[4739]: I1008 22:39:50.819489 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dd83bb8-a102-4ba9-825a-1cf852094ace" containerName="proxy-httpd" Oct 08 22:39:50 crc kubenswrapper[4739]: I1008 22:39:50.819498 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dd83bb8-a102-4ba9-825a-1cf852094ace" containerName="ceilometer-notification-agent" Oct 08 22:39:50 crc kubenswrapper[4739]: I1008 22:39:50.820268 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 08 22:39:50 crc kubenswrapper[4739]: I1008 22:39:50.826903 4739 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5" podUID="9e972dc2-2718-4dcd-a49a-9d3199e95d61" Oct 08 22:39:50 crc kubenswrapper[4739]: I1008 22:39:50.828016 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 08 22:39:50 crc kubenswrapper[4739]: I1008 22:39:50.933188 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9e972dc2-2718-4dcd-a49a-9d3199e95d61-openstack-config-secret\") pod \"openstackclient\" (UID: \"9e972dc2-2718-4dcd-a49a-9d3199e95d61\") " pod="openstack/openstackclient" Oct 08 22:39:50 crc kubenswrapper[4739]: I1008 22:39:50.933264 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9e972dc2-2718-4dcd-a49a-9d3199e95d61-openstack-config\") pod \"openstackclient\" (UID: \"9e972dc2-2718-4dcd-a49a-9d3199e95d61\") " pod="openstack/openstackclient" Oct 08 22:39:50 crc kubenswrapper[4739]: I1008 22:39:50.933305 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e972dc2-2718-4dcd-a49a-9d3199e95d61-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9e972dc2-2718-4dcd-a49a-9d3199e95d61\") " pod="openstack/openstackclient" Oct 08 22:39:50 crc kubenswrapper[4739]: I1008 22:39:50.933324 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n7dn\" (UniqueName: \"kubernetes.io/projected/9e972dc2-2718-4dcd-a49a-9d3199e95d61-kube-api-access-4n7dn\") pod \"openstackclient\" (UID: 
\"9e972dc2-2718-4dcd-a49a-9d3199e95d61\") " pod="openstack/openstackclient" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.035159 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9e972dc2-2718-4dcd-a49a-9d3199e95d61-openstack-config-secret\") pod \"openstackclient\" (UID: \"9e972dc2-2718-4dcd-a49a-9d3199e95d61\") " pod="openstack/openstackclient" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.035239 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9e972dc2-2718-4dcd-a49a-9d3199e95d61-openstack-config\") pod \"openstackclient\" (UID: \"9e972dc2-2718-4dcd-a49a-9d3199e95d61\") " pod="openstack/openstackclient" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.035287 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e972dc2-2718-4dcd-a49a-9d3199e95d61-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9e972dc2-2718-4dcd-a49a-9d3199e95d61\") " pod="openstack/openstackclient" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.035303 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n7dn\" (UniqueName: \"kubernetes.io/projected/9e972dc2-2718-4dcd-a49a-9d3199e95d61-kube-api-access-4n7dn\") pod \"openstackclient\" (UID: \"9e972dc2-2718-4dcd-a49a-9d3199e95d61\") " pod="openstack/openstackclient" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.036727 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9e972dc2-2718-4dcd-a49a-9d3199e95d61-openstack-config\") pod \"openstackclient\" (UID: \"9e972dc2-2718-4dcd-a49a-9d3199e95d61\") " pod="openstack/openstackclient" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.043499 4739 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e972dc2-2718-4dcd-a49a-9d3199e95d61-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9e972dc2-2718-4dcd-a49a-9d3199e95d61\") " pod="openstack/openstackclient" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.058053 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9e972dc2-2718-4dcd-a49a-9d3199e95d61-openstack-config-secret\") pod \"openstackclient\" (UID: \"9e972dc2-2718-4dcd-a49a-9d3199e95d61\") " pod="openstack/openstackclient" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.061588 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n7dn\" (UniqueName: \"kubernetes.io/projected/9e972dc2-2718-4dcd-a49a-9d3199e95d61-kube-api-access-4n7dn\") pod \"openstackclient\" (UID: \"9e972dc2-2718-4dcd-a49a-9d3199e95d61\") " pod="openstack/openstackclient" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.156247 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.230240 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c"] Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.232410 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.239060 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c"] Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.239372 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.345295 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwq9n\" (UniqueName: \"kubernetes.io/projected/ebd8cd89-f73e-48cc-99b9-59f14f0d9d54-kube-api-access-fwq9n\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c\" (UID: \"ebd8cd89-f73e-48cc-99b9-59f14f0d9d54\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.345373 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ebd8cd89-f73e-48cc-99b9-59f14f0d9d54-util\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c\" (UID: \"ebd8cd89-f73e-48cc-99b9-59f14f0d9d54\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.345509 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ebd8cd89-f73e-48cc-99b9-59f14f0d9d54-bundle\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c\" (UID: \"ebd8cd89-f73e-48cc-99b9-59f14f0d9d54\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c" Oct 08 22:39:51 crc kubenswrapper[4739]: 
I1008 22:39:51.448177 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwq9n\" (UniqueName: \"kubernetes.io/projected/ebd8cd89-f73e-48cc-99b9-59f14f0d9d54-kube-api-access-fwq9n\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c\" (UID: \"ebd8cd89-f73e-48cc-99b9-59f14f0d9d54\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.448603 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ebd8cd89-f73e-48cc-99b9-59f14f0d9d54-util\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c\" (UID: \"ebd8cd89-f73e-48cc-99b9-59f14f0d9d54\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.448634 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ebd8cd89-f73e-48cc-99b9-59f14f0d9d54-bundle\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c\" (UID: \"ebd8cd89-f73e-48cc-99b9-59f14f0d9d54\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.449318 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ebd8cd89-f73e-48cc-99b9-59f14f0d9d54-bundle\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c\" (UID: \"ebd8cd89-f73e-48cc-99b9-59f14f0d9d54\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.449634 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/ebd8cd89-f73e-48cc-99b9-59f14f0d9d54-util\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c\" (UID: \"ebd8cd89-f73e-48cc-99b9-59f14f0d9d54\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.483123 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwq9n\" (UniqueName: \"kubernetes.io/projected/ebd8cd89-f73e-48cc-99b9-59f14f0d9d54-kube-api-access-fwq9n\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c\" (UID: \"ebd8cd89-f73e-48cc-99b9-59f14f0d9d54\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.587405 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.770034 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.770360 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.839129 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.841583 4739 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.845609 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.845973 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-m9l58" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.846188 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.846255 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.850118 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.965984 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.966073 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.966114 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.966137 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.966198 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.966256 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq8tr\" (UniqueName: \"kubernetes.io/projected/45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a-kube-api-access-hq8tr\") pod \"alertmanager-metric-storage-0\" (UID: \"45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 22:39:51 crc kubenswrapper[4739]: I1008 22:39:51.997781 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.074813 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq8tr\" (UniqueName: \"kubernetes.io/projected/45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a-kube-api-access-hq8tr\") pod \"alertmanager-metric-storage-0\" (UID: \"45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a\") " 
pod="openstack/alertmanager-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.075246 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.075759 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.075809 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.075835 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.075876 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a\") " 
pod="openstack/alertmanager-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.076218 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.083034 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.083685 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.084055 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.086136 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 22:39:52 crc 
kubenswrapper[4739]: I1008 22:39:52.124043 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq8tr\" (UniqueName: \"kubernetes.io/projected/45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a-kube-api-access-hq8tr\") pod \"alertmanager-metric-storage-0\" (UID: \"45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a\") " pod="openstack/alertmanager-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.204592 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.289053 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c"] Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.491954 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.494158 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.502432 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.502783 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.502838 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.502982 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.502997 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.502792 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.503823 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-mb4bz" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.614846 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0bc93384-c08a-4c7f-9dc4-318126297a8b-config\") pod \"prometheus-metric-storage-0\" (UID: \"0bc93384-c08a-4c7f-9dc4-318126297a8b\") " pod="openstack/prometheus-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.615497 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/0bc93384-c08a-4c7f-9dc4-318126297a8b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"0bc93384-c08a-4c7f-9dc4-318126297a8b\") " pod="openstack/prometheus-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.615554 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-bb1dad59-bba6-404f-ac41-0184d025fc3c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bb1dad59-bba6-404f-ac41-0184d025fc3c\") pod \"prometheus-metric-storage-0\" (UID: \"0bc93384-c08a-4c7f-9dc4-318126297a8b\") " pod="openstack/prometheus-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.615588 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0bc93384-c08a-4c7f-9dc4-318126297a8b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"0bc93384-c08a-4c7f-9dc4-318126297a8b\") " pod="openstack/prometheus-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.615659 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gd75\" (UniqueName: \"kubernetes.io/projected/0bc93384-c08a-4c7f-9dc4-318126297a8b-kube-api-access-7gd75\") pod \"prometheus-metric-storage-0\" (UID: \"0bc93384-c08a-4c7f-9dc4-318126297a8b\") " pod="openstack/prometheus-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.615689 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0bc93384-c08a-4c7f-9dc4-318126297a8b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"0bc93384-c08a-4c7f-9dc4-318126297a8b\") " pod="openstack/prometheus-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.615785 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0bc93384-c08a-4c7f-9dc4-318126297a8b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"0bc93384-c08a-4c7f-9dc4-318126297a8b\") " pod="openstack/prometheus-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.615861 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0bc93384-c08a-4c7f-9dc4-318126297a8b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"0bc93384-c08a-4c7f-9dc4-318126297a8b\") " pod="openstack/prometheus-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.717239 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0bc93384-c08a-4c7f-9dc4-318126297a8b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"0bc93384-c08a-4c7f-9dc4-318126297a8b\") " pod="openstack/prometheus-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.717296 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0bc93384-c08a-4c7f-9dc4-318126297a8b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"0bc93384-c08a-4c7f-9dc4-318126297a8b\") " pod="openstack/prometheus-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.717336 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0bc93384-c08a-4c7f-9dc4-318126297a8b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"0bc93384-c08a-4c7f-9dc4-318126297a8b\") " pod="openstack/prometheus-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 
22:39:52.717376 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0bc93384-c08a-4c7f-9dc4-318126297a8b-config\") pod \"prometheus-metric-storage-0\" (UID: \"0bc93384-c08a-4c7f-9dc4-318126297a8b\") " pod="openstack/prometheus-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.717433 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0bc93384-c08a-4c7f-9dc4-318126297a8b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"0bc93384-c08a-4c7f-9dc4-318126297a8b\") " pod="openstack/prometheus-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.717460 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-bb1dad59-bba6-404f-ac41-0184d025fc3c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bb1dad59-bba6-404f-ac41-0184d025fc3c\") pod \"prometheus-metric-storage-0\" (UID: \"0bc93384-c08a-4c7f-9dc4-318126297a8b\") " pod="openstack/prometheus-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.717481 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0bc93384-c08a-4c7f-9dc4-318126297a8b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"0bc93384-c08a-4c7f-9dc4-318126297a8b\") " pod="openstack/prometheus-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.717514 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gd75\" (UniqueName: \"kubernetes.io/projected/0bc93384-c08a-4c7f-9dc4-318126297a8b-kube-api-access-7gd75\") pod \"prometheus-metric-storage-0\" (UID: \"0bc93384-c08a-4c7f-9dc4-318126297a8b\") " pod="openstack/prometheus-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 
22:39:52.719057 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0bc93384-c08a-4c7f-9dc4-318126297a8b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"0bc93384-c08a-4c7f-9dc4-318126297a8b\") " pod="openstack/prometheus-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.727276 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0bc93384-c08a-4c7f-9dc4-318126297a8b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"0bc93384-c08a-4c7f-9dc4-318126297a8b\") " pod="openstack/prometheus-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.746471 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0bc93384-c08a-4c7f-9dc4-318126297a8b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"0bc93384-c08a-4c7f-9dc4-318126297a8b\") " pod="openstack/prometheus-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.746512 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0bc93384-c08a-4c7f-9dc4-318126297a8b-config\") pod \"prometheus-metric-storage-0\" (UID: \"0bc93384-c08a-4c7f-9dc4-318126297a8b\") " pod="openstack/prometheus-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.746988 4739 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.747009 4739 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-bb1dad59-bba6-404f-ac41-0184d025fc3c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bb1dad59-bba6-404f-ac41-0184d025fc3c\") pod \"prometheus-metric-storage-0\" (UID: \"0bc93384-c08a-4c7f-9dc4-318126297a8b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/67aee272b80ac6349671e3607ee9d645e793d5fce4d087b410c30bf432366403/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.751943 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0bc93384-c08a-4c7f-9dc4-318126297a8b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"0bc93384-c08a-4c7f-9dc4-318126297a8b\") " pod="openstack/prometheus-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.754292 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0bc93384-c08a-4c7f-9dc4-318126297a8b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"0bc93384-c08a-4c7f-9dc4-318126297a8b\") " pod="openstack/prometheus-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.755133 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gd75\" (UniqueName: \"kubernetes.io/projected/0bc93384-c08a-4c7f-9dc4-318126297a8b-kube-api-access-7gd75\") pod \"prometheus-metric-storage-0\" (UID: \"0bc93384-c08a-4c7f-9dc4-318126297a8b\") " pod="openstack/prometheus-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.804456 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" 
event={"ID":"9e972dc2-2718-4dcd-a49a-9d3199e95d61","Type":"ContainerStarted","Data":"2f1490149badc7c955c2da37eaff7554cd64932c2927f8f638690b5002ecd5e1"} Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.804500 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"9e972dc2-2718-4dcd-a49a-9d3199e95d61","Type":"ContainerStarted","Data":"03bf20deeb9082724ecade1f39a8ba910d1b58e0c01a383e3c9221a7f876c35c"} Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.815476 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c" event={"ID":"ebd8cd89-f73e-48cc-99b9-59f14f0d9d54","Type":"ContainerStarted","Data":"ed4097dd2f3e20ee22cdaa330970881d22cacc66c2ea2f04cb11a47f68c1b629"} Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.815517 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c" event={"ID":"ebd8cd89-f73e-48cc-99b9-59f14f0d9d54","Type":"ContainerStarted","Data":"89627debf2a75dccbf03f9aa3da9a399b14a6f9a6eedc306bc13dbaaf2d6f37e"} Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.857600 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.857576384 podStartE2EDuration="2.857576384s" podCreationTimestamp="2025-10-08 22:39:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:39:52.848686026 +0000 UTC m=+3092.674071776" watchObservedRunningTime="2025-10-08 22:39:52.857576384 +0000 UTC m=+3092.682962134" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.877370 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-bb1dad59-bba6-404f-ac41-0184d025fc3c\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bb1dad59-bba6-404f-ac41-0184d025fc3c\") pod \"prometheus-metric-storage-0\" (UID: \"0bc93384-c08a-4c7f-9dc4-318126297a8b\") " pod="openstack/prometheus-metric-storage-0" Oct 08 22:39:52 crc kubenswrapper[4739]: I1008 22:39:52.928211 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.175564 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.301684 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-649bd47b54-74dfs"] Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.303623 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-649bd47b54-74dfs" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.316428 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.316561 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.316647 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.316747 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.339985 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-649bd47b54-74dfs"] Oct 08 22:39:53 crc 
kubenswrapper[4739]: I1008 22:39:53.340484 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.340731 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-hs7fp" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.401630 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.406452 4739 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5" podUID="9e972dc2-2718-4dcd-a49a-9d3199e95d61" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.496471 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ab6cc895-0aa3-49a5-bec3-38efa4dd348f-webhook-cert\") pod \"loki-operator-controller-manager-649bd47b54-74dfs\" (UID: \"ab6cc895-0aa3-49a5-bec3-38efa4dd348f\") " pod="openshift-operators-redhat/loki-operator-controller-manager-649bd47b54-74dfs" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.496545 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ab6cc895-0aa3-49a5-bec3-38efa4dd348f-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-649bd47b54-74dfs\" (UID: \"ab6cc895-0aa3-49a5-bec3-38efa4dd348f\") " pod="openshift-operators-redhat/loki-operator-controller-manager-649bd47b54-74dfs" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.496726 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlsvs\" (UniqueName: 
\"kubernetes.io/projected/ab6cc895-0aa3-49a5-bec3-38efa4dd348f-kube-api-access-dlsvs\") pod \"loki-operator-controller-manager-649bd47b54-74dfs\" (UID: \"ab6cc895-0aa3-49a5-bec3-38efa4dd348f\") " pod="openshift-operators-redhat/loki-operator-controller-manager-649bd47b54-74dfs" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.496748 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ab6cc895-0aa3-49a5-bec3-38efa4dd348f-apiservice-cert\") pod \"loki-operator-controller-manager-649bd47b54-74dfs\" (UID: \"ab6cc895-0aa3-49a5-bec3-38efa4dd348f\") " pod="openshift-operators-redhat/loki-operator-controller-manager-649bd47b54-74dfs" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.496802 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/ab6cc895-0aa3-49a5-bec3-38efa4dd348f-manager-config\") pod \"loki-operator-controller-manager-649bd47b54-74dfs\" (UID: \"ab6cc895-0aa3-49a5-bec3-38efa4dd348f\") " pod="openshift-operators-redhat/loki-operator-controller-manager-649bd47b54-74dfs" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.599414 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5-combined-ca-bundle\") pod \"37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5\" (UID: \"37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5\") " Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.599861 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5-openstack-config\") pod \"37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5\" (UID: \"37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5\") " Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.600311 4739 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fqt5\" (UniqueName: \"kubernetes.io/projected/37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5-kube-api-access-5fqt5\") pod \"37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5\" (UID: \"37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5\") " Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.600364 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5-openstack-config-secret\") pod \"37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5\" (UID: \"37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5\") " Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.600726 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ab6cc895-0aa3-49a5-bec3-38efa4dd348f-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-649bd47b54-74dfs\" (UID: \"ab6cc895-0aa3-49a5-bec3-38efa4dd348f\") " pod="openshift-operators-redhat/loki-operator-controller-manager-649bd47b54-74dfs" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.600893 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlsvs\" (UniqueName: \"kubernetes.io/projected/ab6cc895-0aa3-49a5-bec3-38efa4dd348f-kube-api-access-dlsvs\") pod \"loki-operator-controller-manager-649bd47b54-74dfs\" (UID: \"ab6cc895-0aa3-49a5-bec3-38efa4dd348f\") " pod="openshift-operators-redhat/loki-operator-controller-manager-649bd47b54-74dfs" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.600929 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ab6cc895-0aa3-49a5-bec3-38efa4dd348f-apiservice-cert\") pod \"loki-operator-controller-manager-649bd47b54-74dfs\" (UID: \"ab6cc895-0aa3-49a5-bec3-38efa4dd348f\") " 
pod="openshift-operators-redhat/loki-operator-controller-manager-649bd47b54-74dfs" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.600973 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/ab6cc895-0aa3-49a5-bec3-38efa4dd348f-manager-config\") pod \"loki-operator-controller-manager-649bd47b54-74dfs\" (UID: \"ab6cc895-0aa3-49a5-bec3-38efa4dd348f\") " pod="openshift-operators-redhat/loki-operator-controller-manager-649bd47b54-74dfs" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.601031 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ab6cc895-0aa3-49a5-bec3-38efa4dd348f-webhook-cert\") pod \"loki-operator-controller-manager-649bd47b54-74dfs\" (UID: \"ab6cc895-0aa3-49a5-bec3-38efa4dd348f\") " pod="openshift-operators-redhat/loki-operator-controller-manager-649bd47b54-74dfs" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.602895 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/ab6cc895-0aa3-49a5-bec3-38efa4dd348f-manager-config\") pod \"loki-operator-controller-manager-649bd47b54-74dfs\" (UID: \"ab6cc895-0aa3-49a5-bec3-38efa4dd348f\") " pod="openshift-operators-redhat/loki-operator-controller-manager-649bd47b54-74dfs" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.616312 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ab6cc895-0aa3-49a5-bec3-38efa4dd348f-apiservice-cert\") pod \"loki-operator-controller-manager-649bd47b54-74dfs\" (UID: \"ab6cc895-0aa3-49a5-bec3-38efa4dd348f\") " pod="openshift-operators-redhat/loki-operator-controller-manager-649bd47b54-74dfs" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.616501 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5-kube-api-access-5fqt5" (OuterVolumeSpecName: "kube-api-access-5fqt5") pod "37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5" (UID: "37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5"). InnerVolumeSpecName "kube-api-access-5fqt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.632435 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ab6cc895-0aa3-49a5-bec3-38efa4dd348f-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-649bd47b54-74dfs\" (UID: \"ab6cc895-0aa3-49a5-bec3-38efa4dd348f\") " pod="openshift-operators-redhat/loki-operator-controller-manager-649bd47b54-74dfs" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.651607 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlsvs\" (UniqueName: \"kubernetes.io/projected/ab6cc895-0aa3-49a5-bec3-38efa4dd348f-kube-api-access-dlsvs\") pod \"loki-operator-controller-manager-649bd47b54-74dfs\" (UID: \"ab6cc895-0aa3-49a5-bec3-38efa4dd348f\") " pod="openshift-operators-redhat/loki-operator-controller-manager-649bd47b54-74dfs" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.692052 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ab6cc895-0aa3-49a5-bec3-38efa4dd348f-webhook-cert\") pod \"loki-operator-controller-manager-649bd47b54-74dfs\" (UID: \"ab6cc895-0aa3-49a5-bec3-38efa4dd348f\") " pod="openshift-operators-redhat/loki-operator-controller-manager-649bd47b54-74dfs" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.702956 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fqt5\" (UniqueName: \"kubernetes.io/projected/37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5-kube-api-access-5fqt5\") on node \"crc\" DevicePath \"\"" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 
22:39:53.704384 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5" (UID: "37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.722444 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5" (UID: "37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.725210 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5" (UID: "37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.738784 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-649bd47b54-74dfs" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.804250 4739 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.804280 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.804291 4739 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.815714 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-create-6svpl"] Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.825363 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-create-6svpl" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.863086 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5" path="/var/lib/kubelet/pods/37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5/volumes" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.864065 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-6svpl"] Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.876891 4739 generic.go:334] "Generic (PLEG): container finished" podID="37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5" containerID="71929e391e11a4120e9c99e893d34aa741fff3481c7a506856ced6d75d507481" exitCode=137 Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.877001 4739 scope.go:117] "RemoveContainer" containerID="71929e391e11a4120e9c99e893d34aa741fff3481c7a506856ced6d75d507481" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.877126 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.882296 4739 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5" podUID="9e972dc2-2718-4dcd-a49a-9d3199e95d61" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.897252 4739 generic.go:334] "Generic (PLEG): container finished" podID="ebd8cd89-f73e-48cc-99b9-59f14f0d9d54" containerID="ed4097dd2f3e20ee22cdaa330970881d22cacc66c2ea2f04cb11a47f68c1b629" exitCode=0 Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.897336 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c" event={"ID":"ebd8cd89-f73e-48cc-99b9-59f14f0d9d54","Type":"ContainerDied","Data":"ed4097dd2f3e20ee22cdaa330970881d22cacc66c2ea2f04cb11a47f68c1b629"} Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.897689 4739 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="37d7ddcb-9e6e-4ec1-b97c-ff1e7b6fb6b5" podUID="9e972dc2-2718-4dcd-a49a-9d3199e95d61" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.899640 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a","Type":"ContainerStarted","Data":"ead467d1fce34e734a0780506f5e7bed9f1580d40161b0bdfa6fab43816347f3"} Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.907104 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.923426 4739 scope.go:117] "RemoveContainer" containerID="71929e391e11a4120e9c99e893d34aa741fff3481c7a506856ced6d75d507481" Oct 08 22:39:53 crc kubenswrapper[4739]: E1008 22:39:53.926696 4739 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"71929e391e11a4120e9c99e893d34aa741fff3481c7a506856ced6d75d507481\": container with ID starting with 71929e391e11a4120e9c99e893d34aa741fff3481c7a506856ced6d75d507481 not found: ID does not exist" containerID="71929e391e11a4120e9c99e893d34aa741fff3481c7a506856ced6d75d507481" Oct 08 22:39:53 crc kubenswrapper[4739]: I1008 22:39:53.926741 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71929e391e11a4120e9c99e893d34aa741fff3481c7a506856ced6d75d507481"} err="failed to get container status \"71929e391e11a4120e9c99e893d34aa741fff3481c7a506856ced6d75d507481\": rpc error: code = NotFound desc = could not find container \"71929e391e11a4120e9c99e893d34aa741fff3481c7a506856ced6d75d507481\": container with ID starting with 71929e391e11a4120e9c99e893d34aa741fff3481c7a506856ced6d75d507481 not found: ID does not exist" Oct 08 22:39:54 crc kubenswrapper[4739]: I1008 22:39:54.007434 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4v4s\" (UniqueName: \"kubernetes.io/projected/b2820d1f-6f24-4b4b-bde2-6474a45fdbff-kube-api-access-c4v4s\") pod \"cloudkitty-db-create-6svpl\" (UID: \"b2820d1f-6f24-4b4b-bde2-6474a45fdbff\") " pod="openstack/cloudkitty-db-create-6svpl" Oct 08 22:39:54 crc kubenswrapper[4739]: I1008 22:39:54.108689 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4v4s\" (UniqueName: \"kubernetes.io/projected/b2820d1f-6f24-4b4b-bde2-6474a45fdbff-kube-api-access-c4v4s\") pod \"cloudkitty-db-create-6svpl\" (UID: \"b2820d1f-6f24-4b4b-bde2-6474a45fdbff\") " pod="openstack/cloudkitty-db-create-6svpl" Oct 08 22:39:54 crc kubenswrapper[4739]: I1008 22:39:54.156252 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4v4s\" (UniqueName: 
\"kubernetes.io/projected/b2820d1f-6f24-4b4b-bde2-6474a45fdbff-kube-api-access-c4v4s\") pod \"cloudkitty-db-create-6svpl\" (UID: \"b2820d1f-6f24-4b4b-bde2-6474a45fdbff\") " pod="openstack/cloudkitty-db-create-6svpl" Oct 08 22:39:54 crc kubenswrapper[4739]: I1008 22:39:54.162297 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-6svpl" Oct 08 22:39:54 crc kubenswrapper[4739]: I1008 22:39:54.302797 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-649bd47b54-74dfs"] Oct 08 22:39:54 crc kubenswrapper[4739]: I1008 22:39:54.596181 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-6svpl"] Oct 08 22:39:54 crc kubenswrapper[4739]: I1008 22:39:54.923200 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0bc93384-c08a-4c7f-9dc4-318126297a8b","Type":"ContainerStarted","Data":"f2b4b4fe8577ff95cb97dc8e6f469d54c87060c977d7fe599e8de8187c7da080"} Oct 08 22:39:54 crc kubenswrapper[4739]: I1008 22:39:54.925008 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-6svpl" event={"ID":"b2820d1f-6f24-4b4b-bde2-6474a45fdbff","Type":"ContainerStarted","Data":"6ffd8f1b5a09be508d12a9646598dccf97d81b460cfca0688e96f57ecc0ba96e"} Oct 08 22:39:54 crc kubenswrapper[4739]: I1008 22:39:54.928185 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-649bd47b54-74dfs" event={"ID":"ab6cc895-0aa3-49a5-bec3-38efa4dd348f","Type":"ContainerStarted","Data":"cfe9ff44b57fae54fdea10bc1e58cfa91ce6bc1c3bb4c3d5faecc15d1756c93e"} Oct 08 22:39:55 crc kubenswrapper[4739]: I1008 22:39:55.943281 4739 generic.go:334] "Generic (PLEG): container finished" podID="ebd8cd89-f73e-48cc-99b9-59f14f0d9d54" containerID="f180b6937151bb79b1164d476402d686df9f14e9522ab7a6b9ef137bee1cb0a8" exitCode=0 
Oct 08 22:39:55 crc kubenswrapper[4739]: I1008 22:39:55.943877 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c" event={"ID":"ebd8cd89-f73e-48cc-99b9-59f14f0d9d54","Type":"ContainerDied","Data":"f180b6937151bb79b1164d476402d686df9f14e9522ab7a6b9ef137bee1cb0a8"} Oct 08 22:39:55 crc kubenswrapper[4739]: I1008 22:39:55.950933 4739 generic.go:334] "Generic (PLEG): container finished" podID="b2820d1f-6f24-4b4b-bde2-6474a45fdbff" containerID="1687f0c5c9ebf819674a04069568e88f62ea5dd74b1a42fec37aa5cf9c2a9156" exitCode=0 Oct 08 22:39:55 crc kubenswrapper[4739]: I1008 22:39:55.950988 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-6svpl" event={"ID":"b2820d1f-6f24-4b4b-bde2-6474a45fdbff","Type":"ContainerDied","Data":"1687f0c5c9ebf819674a04069568e88f62ea5dd74b1a42fec37aa5cf9c2a9156"} Oct 08 22:39:56 crc kubenswrapper[4739]: I1008 22:39:56.965965 4739 generic.go:334] "Generic (PLEG): container finished" podID="ebd8cd89-f73e-48cc-99b9-59f14f0d9d54" containerID="4fa5bceab945dfd65c38444e0fbcaec0935c117e4fd63e39a47896f86a754f2c" exitCode=0 Oct 08 22:39:56 crc kubenswrapper[4739]: I1008 22:39:56.966194 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c" event={"ID":"ebd8cd89-f73e-48cc-99b9-59f14f0d9d54","Type":"ContainerDied","Data":"4fa5bceab945dfd65c38444e0fbcaec0935c117e4fd63e39a47896f86a754f2c"} Oct 08 22:39:57 crc kubenswrapper[4739]: I1008 22:39:57.380880 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-create-6svpl" Oct 08 22:39:57 crc kubenswrapper[4739]: I1008 22:39:57.404619 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4v4s\" (UniqueName: \"kubernetes.io/projected/b2820d1f-6f24-4b4b-bde2-6474a45fdbff-kube-api-access-c4v4s\") pod \"b2820d1f-6f24-4b4b-bde2-6474a45fdbff\" (UID: \"b2820d1f-6f24-4b4b-bde2-6474a45fdbff\") " Oct 08 22:39:57 crc kubenswrapper[4739]: I1008 22:39:57.411757 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2820d1f-6f24-4b4b-bde2-6474a45fdbff-kube-api-access-c4v4s" (OuterVolumeSpecName: "kube-api-access-c4v4s") pod "b2820d1f-6f24-4b4b-bde2-6474a45fdbff" (UID: "b2820d1f-6f24-4b4b-bde2-6474a45fdbff"). InnerVolumeSpecName "kube-api-access-c4v4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:39:57 crc kubenswrapper[4739]: I1008 22:39:57.506205 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4v4s\" (UniqueName: \"kubernetes.io/projected/b2820d1f-6f24-4b4b-bde2-6474a45fdbff-kube-api-access-c4v4s\") on node \"crc\" DevicePath \"\"" Oct 08 22:39:57 crc kubenswrapper[4739]: I1008 22:39:57.978815 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-create-6svpl" Oct 08 22:39:57 crc kubenswrapper[4739]: I1008 22:39:57.978874 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-6svpl" event={"ID":"b2820d1f-6f24-4b4b-bde2-6474a45fdbff","Type":"ContainerDied","Data":"6ffd8f1b5a09be508d12a9646598dccf97d81b460cfca0688e96f57ecc0ba96e"} Oct 08 22:39:57 crc kubenswrapper[4739]: I1008 22:39:57.978906 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ffd8f1b5a09be508d12a9646598dccf97d81b460cfca0688e96f57ecc0ba96e" Oct 08 22:39:58 crc kubenswrapper[4739]: I1008 22:39:58.303218 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:39:58 crc kubenswrapper[4739]: E1008 22:39:58.303976 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2820d1f-6f24-4b4b-bde2-6474a45fdbff" containerName="mariadb-database-create" Oct 08 22:39:58 crc kubenswrapper[4739]: I1008 22:39:58.303993 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2820d1f-6f24-4b4b-bde2-6474a45fdbff" containerName="mariadb-database-create" Oct 08 22:39:58 crc kubenswrapper[4739]: I1008 22:39:58.304220 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2820d1f-6f24-4b4b-bde2-6474a45fdbff" containerName="mariadb-database-create" Oct 08 22:39:58 crc kubenswrapper[4739]: I1008 22:39:58.306383 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:39:58 crc kubenswrapper[4739]: I1008 22:39:58.310705 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 22:39:58 crc kubenswrapper[4739]: I1008 22:39:58.311117 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-jcpb9" Oct 08 22:39:58 crc kubenswrapper[4739]: I1008 22:39:58.312534 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 22:39:58 crc kubenswrapper[4739]: I1008 22:39:58.322304 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:39:58 crc kubenswrapper[4739]: I1008 22:39:58.328576 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvbnm\" (UniqueName: \"kubernetes.io/projected/bb8e455e-24e5-465b-ae83-87d51e01eb6a-kube-api-access-mvbnm\") pod \"ceilometer-0\" (UID: \"bb8e455e-24e5-465b-ae83-87d51e01eb6a\") " pod="openstack/ceilometer-0" Oct 08 22:39:58 crc kubenswrapper[4739]: I1008 22:39:58.328765 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb8e455e-24e5-465b-ae83-87d51e01eb6a-log-httpd\") pod \"ceilometer-0\" (UID: \"bb8e455e-24e5-465b-ae83-87d51e01eb6a\") " pod="openstack/ceilometer-0" Oct 08 22:39:58 crc kubenswrapper[4739]: I1008 22:39:58.328813 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb8e455e-24e5-465b-ae83-87d51e01eb6a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bb8e455e-24e5-465b-ae83-87d51e01eb6a\") " pod="openstack/ceilometer-0" Oct 08 22:39:58 crc kubenswrapper[4739]: I1008 22:39:58.328918 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb8e455e-24e5-465b-ae83-87d51e01eb6a-scripts\") pod \"ceilometer-0\" (UID: \"bb8e455e-24e5-465b-ae83-87d51e01eb6a\") " pod="openstack/ceilometer-0" Oct 08 22:39:58 crc kubenswrapper[4739]: I1008 22:39:58.328982 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb8e455e-24e5-465b-ae83-87d51e01eb6a-run-httpd\") pod \"ceilometer-0\" (UID: \"bb8e455e-24e5-465b-ae83-87d51e01eb6a\") " pod="openstack/ceilometer-0" Oct 08 22:39:58 crc kubenswrapper[4739]: I1008 22:39:58.329010 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8e455e-24e5-465b-ae83-87d51e01eb6a-config-data\") pod \"ceilometer-0\" (UID: \"bb8e455e-24e5-465b-ae83-87d51e01eb6a\") " pod="openstack/ceilometer-0" Oct 08 22:39:58 crc kubenswrapper[4739]: I1008 22:39:58.431138 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb8e455e-24e5-465b-ae83-87d51e01eb6a-scripts\") pod \"ceilometer-0\" (UID: \"bb8e455e-24e5-465b-ae83-87d51e01eb6a\") " pod="openstack/ceilometer-0" Oct 08 22:39:58 crc kubenswrapper[4739]: I1008 22:39:58.431265 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb8e455e-24e5-465b-ae83-87d51e01eb6a-run-httpd\") pod \"ceilometer-0\" (UID: \"bb8e455e-24e5-465b-ae83-87d51e01eb6a\") " pod="openstack/ceilometer-0" Oct 08 22:39:58 crc kubenswrapper[4739]: I1008 22:39:58.431288 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8e455e-24e5-465b-ae83-87d51e01eb6a-config-data\") pod \"ceilometer-0\" (UID: \"bb8e455e-24e5-465b-ae83-87d51e01eb6a\") " pod="openstack/ceilometer-0" Oct 08 22:39:58 crc 
kubenswrapper[4739]: I1008 22:39:58.431336 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvbnm\" (UniqueName: \"kubernetes.io/projected/bb8e455e-24e5-465b-ae83-87d51e01eb6a-kube-api-access-mvbnm\") pod \"ceilometer-0\" (UID: \"bb8e455e-24e5-465b-ae83-87d51e01eb6a\") " pod="openstack/ceilometer-0" Oct 08 22:39:58 crc kubenswrapper[4739]: I1008 22:39:58.431394 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb8e455e-24e5-465b-ae83-87d51e01eb6a-log-httpd\") pod \"ceilometer-0\" (UID: \"bb8e455e-24e5-465b-ae83-87d51e01eb6a\") " pod="openstack/ceilometer-0" Oct 08 22:39:58 crc kubenswrapper[4739]: I1008 22:39:58.431420 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb8e455e-24e5-465b-ae83-87d51e01eb6a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bb8e455e-24e5-465b-ae83-87d51e01eb6a\") " pod="openstack/ceilometer-0" Oct 08 22:39:58 crc kubenswrapper[4739]: I1008 22:39:58.432610 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb8e455e-24e5-465b-ae83-87d51e01eb6a-log-httpd\") pod \"ceilometer-0\" (UID: \"bb8e455e-24e5-465b-ae83-87d51e01eb6a\") " pod="openstack/ceilometer-0" Oct 08 22:39:58 crc kubenswrapper[4739]: I1008 22:39:58.433354 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb8e455e-24e5-465b-ae83-87d51e01eb6a-run-httpd\") pod \"ceilometer-0\" (UID: \"bb8e455e-24e5-465b-ae83-87d51e01eb6a\") " pod="openstack/ceilometer-0" Oct 08 22:39:58 crc kubenswrapper[4739]: I1008 22:39:58.640872 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8e455e-24e5-465b-ae83-87d51e01eb6a-config-data\") pod \"ceilometer-0\" 
(UID: \"bb8e455e-24e5-465b-ae83-87d51e01eb6a\") " pod="openstack/ceilometer-0" Oct 08 22:39:58 crc kubenswrapper[4739]: I1008 22:39:58.641072 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb8e455e-24e5-465b-ae83-87d51e01eb6a-scripts\") pod \"ceilometer-0\" (UID: \"bb8e455e-24e5-465b-ae83-87d51e01eb6a\") " pod="openstack/ceilometer-0" Oct 08 22:39:58 crc kubenswrapper[4739]: I1008 22:39:58.641432 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb8e455e-24e5-465b-ae83-87d51e01eb6a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bb8e455e-24e5-465b-ae83-87d51e01eb6a\") " pod="openstack/ceilometer-0" Oct 08 22:39:58 crc kubenswrapper[4739]: I1008 22:39:58.641699 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvbnm\" (UniqueName: \"kubernetes.io/projected/bb8e455e-24e5-465b-ae83-87d51e01eb6a-kube-api-access-mvbnm\") pod \"ceilometer-0\" (UID: \"bb8e455e-24e5-465b-ae83-87d51e01eb6a\") " pod="openstack/ceilometer-0" Oct 08 22:39:58 crc kubenswrapper[4739]: I1008 22:39:58.933082 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:40:00 crc kubenswrapper[4739]: I1008 22:40:00.059749 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a","Type":"ContainerStarted","Data":"b0efd87f05c47e62691eca3738c78be32141061505b4488eccede158948ef02e"} Oct 08 22:40:00 crc kubenswrapper[4739]: I1008 22:40:00.063912 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0bc93384-c08a-4c7f-9dc4-318126297a8b","Type":"ContainerStarted","Data":"3f9cf86a7d44ddcbdaf8a2818d489388644ffe5cdbdc111cda6783c21f3a572b"} Oct 08 22:40:01 crc kubenswrapper[4739]: I1008 22:40:00.509891 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c" Oct 08 22:40:01 crc kubenswrapper[4739]: I1008 22:40:00.679466 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ebd8cd89-f73e-48cc-99b9-59f14f0d9d54-util\") pod \"ebd8cd89-f73e-48cc-99b9-59f14f0d9d54\" (UID: \"ebd8cd89-f73e-48cc-99b9-59f14f0d9d54\") " Oct 08 22:40:01 crc kubenswrapper[4739]: I1008 22:40:00.679698 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwq9n\" (UniqueName: \"kubernetes.io/projected/ebd8cd89-f73e-48cc-99b9-59f14f0d9d54-kube-api-access-fwq9n\") pod \"ebd8cd89-f73e-48cc-99b9-59f14f0d9d54\" (UID: \"ebd8cd89-f73e-48cc-99b9-59f14f0d9d54\") " Oct 08 22:40:01 crc kubenswrapper[4739]: I1008 22:40:00.679792 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ebd8cd89-f73e-48cc-99b9-59f14f0d9d54-bundle\") pod \"ebd8cd89-f73e-48cc-99b9-59f14f0d9d54\" (UID: \"ebd8cd89-f73e-48cc-99b9-59f14f0d9d54\") " Oct 08 22:40:01 crc 
kubenswrapper[4739]: I1008 22:40:00.681004 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebd8cd89-f73e-48cc-99b9-59f14f0d9d54-bundle" (OuterVolumeSpecName: "bundle") pod "ebd8cd89-f73e-48cc-99b9-59f14f0d9d54" (UID: "ebd8cd89-f73e-48cc-99b9-59f14f0d9d54"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:40:01 crc kubenswrapper[4739]: I1008 22:40:00.688818 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebd8cd89-f73e-48cc-99b9-59f14f0d9d54-kube-api-access-fwq9n" (OuterVolumeSpecName: "kube-api-access-fwq9n") pod "ebd8cd89-f73e-48cc-99b9-59f14f0d9d54" (UID: "ebd8cd89-f73e-48cc-99b9-59f14f0d9d54"). InnerVolumeSpecName "kube-api-access-fwq9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:40:01 crc kubenswrapper[4739]: I1008 22:40:00.690961 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebd8cd89-f73e-48cc-99b9-59f14f0d9d54-util" (OuterVolumeSpecName: "util") pod "ebd8cd89-f73e-48cc-99b9-59f14f0d9d54" (UID: "ebd8cd89-f73e-48cc-99b9-59f14f0d9d54"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:40:01 crc kubenswrapper[4739]: I1008 22:40:00.785865 4739 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ebd8cd89-f73e-48cc-99b9-59f14f0d9d54-util\") on node \"crc\" DevicePath \"\"" Oct 08 22:40:01 crc kubenswrapper[4739]: I1008 22:40:00.786249 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwq9n\" (UniqueName: \"kubernetes.io/projected/ebd8cd89-f73e-48cc-99b9-59f14f0d9d54-kube-api-access-fwq9n\") on node \"crc\" DevicePath \"\"" Oct 08 22:40:01 crc kubenswrapper[4739]: I1008 22:40:00.786264 4739 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ebd8cd89-f73e-48cc-99b9-59f14f0d9d54-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:40:01 crc kubenswrapper[4739]: I1008 22:40:01.080299 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-649bd47b54-74dfs" event={"ID":"ab6cc895-0aa3-49a5-bec3-38efa4dd348f","Type":"ContainerStarted","Data":"3fffd9e4b6bf75085129b8c36e6af7b14696b3dd601caef4f1344f25f9781cae"} Oct 08 22:40:01 crc kubenswrapper[4739]: I1008 22:40:01.083203 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c" Oct 08 22:40:01 crc kubenswrapper[4739]: I1008 22:40:01.083302 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c" event={"ID":"ebd8cd89-f73e-48cc-99b9-59f14f0d9d54","Type":"ContainerDied","Data":"89627debf2a75dccbf03f9aa3da9a399b14a6f9a6eedc306bc13dbaaf2d6f37e"} Oct 08 22:40:01 crc kubenswrapper[4739]: I1008 22:40:01.083324 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89627debf2a75dccbf03f9aa3da9a399b14a6f9a6eedc306bc13dbaaf2d6f37e" Oct 08 22:40:01 crc kubenswrapper[4739]: I1008 22:40:01.678834 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:40:02 crc kubenswrapper[4739]: I1008 22:40:02.094059 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb8e455e-24e5-465b-ae83-87d51e01eb6a","Type":"ContainerStarted","Data":"1c010394828d817b22e7fa8ddaf1cea24393a8a4dee70fe4f1ed287eec8cc08f"} Oct 08 22:40:03 crc kubenswrapper[4739]: I1008 22:40:03.107716 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb8e455e-24e5-465b-ae83-87d51e01eb6a","Type":"ContainerStarted","Data":"8964aeb2b47a196cb805da8e53d547accfc96bbb717ccdb0b0176a7f34720efe"} Oct 08 22:40:05 crc kubenswrapper[4739]: I1008 22:40:05.138316 4739 generic.go:334] "Generic (PLEG): container finished" podID="45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a" containerID="b0efd87f05c47e62691eca3738c78be32141061505b4488eccede158948ef02e" exitCode=0 Oct 08 22:40:05 crc kubenswrapper[4739]: I1008 22:40:05.138504 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" 
event={"ID":"45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a","Type":"ContainerDied","Data":"b0efd87f05c47e62691eca3738c78be32141061505b4488eccede158948ef02e"} Oct 08 22:40:07 crc kubenswrapper[4739]: I1008 22:40:07.164103 4739 generic.go:334] "Generic (PLEG): container finished" podID="0bc93384-c08a-4c7f-9dc4-318126297a8b" containerID="3f9cf86a7d44ddcbdaf8a2818d489388644ffe5cdbdc111cda6783c21f3a572b" exitCode=0 Oct 08 22:40:07 crc kubenswrapper[4739]: I1008 22:40:07.164359 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0bc93384-c08a-4c7f-9dc4-318126297a8b","Type":"ContainerDied","Data":"3f9cf86a7d44ddcbdaf8a2818d489388644ffe5cdbdc111cda6783c21f3a572b"} Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.549677 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-distributor-56cd74f89f-5dccl"] Oct 08 22:40:08 crc kubenswrapper[4739]: E1008 22:40:08.550518 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebd8cd89-f73e-48cc-99b9-59f14f0d9d54" containerName="pull" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.550532 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebd8cd89-f73e-48cc-99b9-59f14f0d9d54" containerName="pull" Oct 08 22:40:08 crc kubenswrapper[4739]: E1008 22:40:08.550567 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebd8cd89-f73e-48cc-99b9-59f14f0d9d54" containerName="util" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.550573 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebd8cd89-f73e-48cc-99b9-59f14f0d9d54" containerName="util" Oct 08 22:40:08 crc kubenswrapper[4739]: E1008 22:40:08.550582 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebd8cd89-f73e-48cc-99b9-59f14f0d9d54" containerName="extract" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.550588 4739 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ebd8cd89-f73e-48cc-99b9-59f14f0d9d54" containerName="extract" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.550784 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebd8cd89-f73e-48cc-99b9-59f14f0d9d54" containerName="extract" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.551508 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-5dccl" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.562199 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-56cd74f89f-5dccl"] Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.575675 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-dockercfg-mczs6" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.575702 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-grpc" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.575945 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-config" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.576520 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca-bundle" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.576664 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-http" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.650761 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb89w\" (UniqueName: \"kubernetes.io/projected/cd624632-67d1-48e1-8c43-fa58f5d2e5ea-kube-api-access-mb89w\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-5dccl\" (UID: \"cd624632-67d1-48e1-8c43-fa58f5d2e5ea\") " 
pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-5dccl" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.650843 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/cd624632-67d1-48e1-8c43-fa58f5d2e5ea-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-5dccl\" (UID: \"cd624632-67d1-48e1-8c43-fa58f5d2e5ea\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-5dccl" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.650921 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd624632-67d1-48e1-8c43-fa58f5d2e5ea-config\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-5dccl\" (UID: \"cd624632-67d1-48e1-8c43-fa58f5d2e5ea\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-5dccl" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.651008 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/cd624632-67d1-48e1-8c43-fa58f5d2e5ea-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-5dccl\" (UID: \"cd624632-67d1-48e1-8c43-fa58f5d2e5ea\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-5dccl" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.651109 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd624632-67d1-48e1-8c43-fa58f5d2e5ea-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-5dccl\" (UID: \"cd624632-67d1-48e1-8c43-fa58f5d2e5ea\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-5dccl" Oct 08 22:40:08 crc 
kubenswrapper[4739]: I1008 22:40:08.706918 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-querier-68bbd7984c-65fx4"] Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.708653 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-65fx4" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.711122 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-http" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.711787 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-grpc" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.711942 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"logging-loki-s3" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.737131 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-68bbd7984c-65fx4"] Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.753227 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd624632-67d1-48e1-8c43-fa58f5d2e5ea-config\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-5dccl\" (UID: \"cd624632-67d1-48e1-8c43-fa58f5d2e5ea\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-5dccl" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.753331 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/f8016740-3857-4c88-81a3-6ee47b7e2a75-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-68bbd7984c-65fx4\" (UID: \"f8016740-3857-4c88-81a3-6ee47b7e2a75\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-65fx4" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 
22:40:08.753403 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/cd624632-67d1-48e1-8c43-fa58f5d2e5ea-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-5dccl\" (UID: \"cd624632-67d1-48e1-8c43-fa58f5d2e5ea\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-5dccl" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.753436 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8016740-3857-4c88-81a3-6ee47b7e2a75-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-68bbd7984c-65fx4\" (UID: \"f8016740-3857-4c88-81a3-6ee47b7e2a75\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-65fx4" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.753512 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd624632-67d1-48e1-8c43-fa58f5d2e5ea-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-5dccl\" (UID: \"cd624632-67d1-48e1-8c43-fa58f5d2e5ea\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-5dccl" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.753614 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8016740-3857-4c88-81a3-6ee47b7e2a75-config\") pod \"cloudkitty-lokistack-querier-68bbd7984c-65fx4\" (UID: \"f8016740-3857-4c88-81a3-6ee47b7e2a75\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-65fx4" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.753693 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmpdb\" (UniqueName: 
\"kubernetes.io/projected/f8016740-3857-4c88-81a3-6ee47b7e2a75-kube-api-access-kmpdb\") pod \"cloudkitty-lokistack-querier-68bbd7984c-65fx4\" (UID: \"f8016740-3857-4c88-81a3-6ee47b7e2a75\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-65fx4" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.753756 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb89w\" (UniqueName: \"kubernetes.io/projected/cd624632-67d1-48e1-8c43-fa58f5d2e5ea-kube-api-access-mb89w\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-5dccl\" (UID: \"cd624632-67d1-48e1-8c43-fa58f5d2e5ea\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-5dccl" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.753788 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/f8016740-3857-4c88-81a3-6ee47b7e2a75-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-68bbd7984c-65fx4\" (UID: \"f8016740-3857-4c88-81a3-6ee47b7e2a75\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-65fx4" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.753826 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/cd624632-67d1-48e1-8c43-fa58f5d2e5ea-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-5dccl\" (UID: \"cd624632-67d1-48e1-8c43-fa58f5d2e5ea\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-5dccl" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.753881 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/f8016740-3857-4c88-81a3-6ee47b7e2a75-logging-loki-s3\") pod \"cloudkitty-lokistack-querier-68bbd7984c-65fx4\" (UID: 
\"f8016740-3857-4c88-81a3-6ee47b7e2a75\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-65fx4" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.754388 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd624632-67d1-48e1-8c43-fa58f5d2e5ea-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-5dccl\" (UID: \"cd624632-67d1-48e1-8c43-fa58f5d2e5ea\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-5dccl" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.754924 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd624632-67d1-48e1-8c43-fa58f5d2e5ea-config\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-5dccl\" (UID: \"cd624632-67d1-48e1-8c43-fa58f5d2e5ea\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-5dccl" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.768574 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/cd624632-67d1-48e1-8c43-fa58f5d2e5ea-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-5dccl\" (UID: \"cd624632-67d1-48e1-8c43-fa58f5d2e5ea\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-5dccl" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.772237 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb89w\" (UniqueName: \"kubernetes.io/projected/cd624632-67d1-48e1-8c43-fa58f5d2e5ea-kube-api-access-mb89w\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-5dccl\" (UID: \"cd624632-67d1-48e1-8c43-fa58f5d2e5ea\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-5dccl" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.776503 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/cd624632-67d1-48e1-8c43-fa58f5d2e5ea-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-5dccl\" (UID: \"cd624632-67d1-48e1-8c43-fa58f5d2e5ea\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-5dccl" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.846661 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-779849886d-v9qwb"] Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.855379 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8016740-3857-4c88-81a3-6ee47b7e2a75-config\") pod \"cloudkitty-lokistack-querier-68bbd7984c-65fx4\" (UID: \"f8016740-3857-4c88-81a3-6ee47b7e2a75\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-65fx4" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.855454 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmpdb\" (UniqueName: \"kubernetes.io/projected/f8016740-3857-4c88-81a3-6ee47b7e2a75-kube-api-access-kmpdb\") pod \"cloudkitty-lokistack-querier-68bbd7984c-65fx4\" (UID: \"f8016740-3857-4c88-81a3-6ee47b7e2a75\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-65fx4" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.855481 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/f8016740-3857-4c88-81a3-6ee47b7e2a75-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-68bbd7984c-65fx4\" (UID: \"f8016740-3857-4c88-81a3-6ee47b7e2a75\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-65fx4" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.855531 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" 
(UniqueName: \"kubernetes.io/secret/f8016740-3857-4c88-81a3-6ee47b7e2a75-logging-loki-s3\") pod \"cloudkitty-lokistack-querier-68bbd7984c-65fx4\" (UID: \"f8016740-3857-4c88-81a3-6ee47b7e2a75\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-65fx4" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.855570 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/f8016740-3857-4c88-81a3-6ee47b7e2a75-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-68bbd7984c-65fx4\" (UID: \"f8016740-3857-4c88-81a3-6ee47b7e2a75\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-65fx4" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.855611 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8016740-3857-4c88-81a3-6ee47b7e2a75-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-68bbd7984c-65fx4\" (UID: \"f8016740-3857-4c88-81a3-6ee47b7e2a75\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-65fx4" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.856516 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8016740-3857-4c88-81a3-6ee47b7e2a75-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-68bbd7984c-65fx4\" (UID: \"f8016740-3857-4c88-81a3-6ee47b7e2a75\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-65fx4" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.858878 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-v9qwb" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.862741 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8016740-3857-4c88-81a3-6ee47b7e2a75-config\") pod \"cloudkitty-lokistack-querier-68bbd7984c-65fx4\" (UID: \"f8016740-3857-4c88-81a3-6ee47b7e2a75\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-65fx4" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.863273 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-http" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.863596 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-grpc" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.867798 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/f8016740-3857-4c88-81a3-6ee47b7e2a75-logging-loki-s3\") pod \"cloudkitty-lokistack-querier-68bbd7984c-65fx4\" (UID: \"f8016740-3857-4c88-81a3-6ee47b7e2a75\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-65fx4" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.871045 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/f8016740-3857-4c88-81a3-6ee47b7e2a75-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-68bbd7984c-65fx4\" (UID: \"f8016740-3857-4c88-81a3-6ee47b7e2a75\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-65fx4" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.874127 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-779849886d-v9qwb"] Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.875849 4739 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/f8016740-3857-4c88-81a3-6ee47b7e2a75-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-68bbd7984c-65fx4\" (UID: \"f8016740-3857-4c88-81a3-6ee47b7e2a75\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-65fx4" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.894629 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-5dccl" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.896667 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmpdb\" (UniqueName: \"kubernetes.io/projected/f8016740-3857-4c88-81a3-6ee47b7e2a75-kube-api-access-kmpdb\") pod \"cloudkitty-lokistack-querier-68bbd7984c-65fx4\" (UID: \"f8016740-3857-4c88-81a3-6ee47b7e2a75\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-65fx4" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.958363 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/137f65de-3030-4c4b-a087-c547dd183105-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-779849886d-v9qwb\" (UID: \"137f65de-3030-4c4b-a087-c547dd183105\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-v9qwb" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.958464 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/137f65de-3030-4c4b-a087-c547dd183105-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-779849886d-v9qwb\" (UID: \"137f65de-3030-4c4b-a087-c547dd183105\") " 
pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-v9qwb" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.958501 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8dd6\" (UniqueName: \"kubernetes.io/projected/137f65de-3030-4c4b-a087-c547dd183105-kube-api-access-d8dd6\") pod \"cloudkitty-lokistack-query-frontend-779849886d-v9qwb\" (UID: \"137f65de-3030-4c4b-a087-c547dd183105\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-v9qwb" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.958539 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/137f65de-3030-4c4b-a087-c547dd183105-config\") pod \"cloudkitty-lokistack-query-frontend-779849886d-v9qwb\" (UID: \"137f65de-3030-4c4b-a087-c547dd183105\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-v9qwb" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.958596 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/137f65de-3030-4c4b-a087-c547dd183105-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-779849886d-v9qwb\" (UID: \"137f65de-3030-4c4b-a087-c547dd183105\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-v9qwb" Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.996731 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz"] Oct 08 22:40:08 crc kubenswrapper[4739]: I1008 22:40:08.999443 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.004737 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-http" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.004965 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.005086 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-client-http" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.005219 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway-ca-bundle" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.005377 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.005516 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.008345 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-dockercfg-2l8mb" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.015084 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz"] Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.027005 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-65fx4" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.060942 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/4b1ae118-cd96-4e60-997d-9594acff7531-rbac\") pod \"cloudkitty-lokistack-gateway-76cc998948-czjfz\" (UID: \"4b1ae118-cd96-4e60-997d-9594acff7531\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.061030 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8dd6\" (UniqueName: \"kubernetes.io/projected/137f65de-3030-4c4b-a087-c547dd183105-kube-api-access-d8dd6\") pod \"cloudkitty-lokistack-query-frontend-779849886d-v9qwb\" (UID: \"137f65de-3030-4c4b-a087-c547dd183105\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-v9qwb" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.061069 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b1ae118-cd96-4e60-997d-9594acff7531-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-czjfz\" (UID: \"4b1ae118-cd96-4e60-997d-9594acff7531\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.061311 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b1ae118-cd96-4e60-997d-9594acff7531-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-czjfz\" (UID: \"4b1ae118-cd96-4e60-997d-9594acff7531\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.061433 4739 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/137f65de-3030-4c4b-a087-c547dd183105-config\") pod \"cloudkitty-lokistack-query-frontend-779849886d-v9qwb\" (UID: \"137f65de-3030-4c4b-a087-c547dd183105\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-v9qwb" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.061513 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/4b1ae118-cd96-4e60-997d-9594acff7531-tls-secret\") pod \"cloudkitty-lokistack-gateway-76cc998948-czjfz\" (UID: \"4b1ae118-cd96-4e60-997d-9594acff7531\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.061572 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/4b1ae118-cd96-4e60-997d-9594acff7531-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-76cc998948-czjfz\" (UID: \"4b1ae118-cd96-4e60-997d-9594acff7531\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.061711 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/137f65de-3030-4c4b-a087-c547dd183105-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-779849886d-v9qwb\" (UID: \"137f65de-3030-4c4b-a087-c547dd183105\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-v9qwb" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.061777 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/4b1ae118-cd96-4e60-997d-9594acff7531-tenants\") pod 
\"cloudkitty-lokistack-gateway-76cc998948-czjfz\" (UID: \"4b1ae118-cd96-4e60-997d-9594acff7531\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.061920 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/4b1ae118-cd96-4e60-997d-9594acff7531-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-76cc998948-czjfz\" (UID: \"4b1ae118-cd96-4e60-997d-9594acff7531\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.062005 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b1ae118-cd96-4e60-997d-9594acff7531-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-czjfz\" (UID: \"4b1ae118-cd96-4e60-997d-9594acff7531\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.062062 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/137f65de-3030-4c4b-a087-c547dd183105-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-779849886d-v9qwb\" (UID: \"137f65de-3030-4c4b-a087-c547dd183105\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-v9qwb" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.062210 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwgw4\" (UniqueName: \"kubernetes.io/projected/4b1ae118-cd96-4e60-997d-9594acff7531-kube-api-access-wwgw4\") pod \"cloudkitty-lokistack-gateway-76cc998948-czjfz\" (UID: \"4b1ae118-cd96-4e60-997d-9594acff7531\") " 
pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.062333 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/137f65de-3030-4c4b-a087-c547dd183105-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-779849886d-v9qwb\" (UID: \"137f65de-3030-4c4b-a087-c547dd183105\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-v9qwb" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.062451 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/137f65de-3030-4c4b-a087-c547dd183105-config\") pod \"cloudkitty-lokistack-query-frontend-779849886d-v9qwb\" (UID: \"137f65de-3030-4c4b-a087-c547dd183105\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-v9qwb" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.062966 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/137f65de-3030-4c4b-a087-c547dd183105-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-779849886d-v9qwb\" (UID: \"137f65de-3030-4c4b-a087-c547dd183105\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-v9qwb" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.071696 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl"] Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.074678 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/137f65de-3030-4c4b-a087-c547dd183105-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-779849886d-v9qwb\" (UID: \"137f65de-3030-4c4b-a087-c547dd183105\") " 
pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-v9qwb" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.080283 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.087038 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/137f65de-3030-4c4b-a087-c547dd183105-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-779849886d-v9qwb\" (UID: \"137f65de-3030-4c4b-a087-c547dd183105\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-v9qwb" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.103883 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8dd6\" (UniqueName: \"kubernetes.io/projected/137f65de-3030-4c4b-a087-c547dd183105-kube-api-access-d8dd6\") pod \"cloudkitty-lokistack-query-frontend-779849886d-v9qwb\" (UID: \"137f65de-3030-4c4b-a087-c547dd183105\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-v9qwb" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.112589 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl"] Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.174299 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/0663f463-0160-4cc2-bad3-389baee708da-tenants\") pod \"cloudkitty-lokistack-gateway-76cc998948-nf2wl\" (UID: \"0663f463-0160-4cc2-bad3-389baee708da\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.174681 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: 
\"kubernetes.io/configmap/0663f463-0160-4cc2-bad3-389baee708da-rbac\") pod \"cloudkitty-lokistack-gateway-76cc998948-nf2wl\" (UID: \"0663f463-0160-4cc2-bad3-389baee708da\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.174740 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/4b1ae118-cd96-4e60-997d-9594acff7531-tenants\") pod \"cloudkitty-lokistack-gateway-76cc998948-czjfz\" (UID: \"4b1ae118-cd96-4e60-997d-9594acff7531\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.174819 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/4b1ae118-cd96-4e60-997d-9594acff7531-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-76cc998948-czjfz\" (UID: \"4b1ae118-cd96-4e60-997d-9594acff7531\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.174878 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b1ae118-cd96-4e60-997d-9594acff7531-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-czjfz\" (UID: \"4b1ae118-cd96-4e60-997d-9594acff7531\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.174915 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/0663f463-0160-4cc2-bad3-389baee708da-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-76cc998948-nf2wl\" (UID: \"0663f463-0160-4cc2-bad3-389baee708da\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" Oct 08 22:40:09 crc 
kubenswrapper[4739]: I1008 22:40:09.174960 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0663f463-0160-4cc2-bad3-389baee708da-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-nf2wl\" (UID: \"0663f463-0160-4cc2-bad3-389baee708da\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.174989 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h44xj\" (UniqueName: \"kubernetes.io/projected/0663f463-0160-4cc2-bad3-389baee708da-kube-api-access-h44xj\") pod \"cloudkitty-lokistack-gateway-76cc998948-nf2wl\" (UID: \"0663f463-0160-4cc2-bad3-389baee708da\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.175008 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/0663f463-0160-4cc2-bad3-389baee708da-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-76cc998948-nf2wl\" (UID: \"0663f463-0160-4cc2-bad3-389baee708da\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.175040 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwgw4\" (UniqueName: \"kubernetes.io/projected/4b1ae118-cd96-4e60-997d-9594acff7531-kube-api-access-wwgw4\") pod \"cloudkitty-lokistack-gateway-76cc998948-czjfz\" (UID: \"4b1ae118-cd96-4e60-997d-9594acff7531\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.175077 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0663f463-0160-4cc2-bad3-389baee708da-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-nf2wl\" (UID: \"0663f463-0160-4cc2-bad3-389baee708da\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.175123 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/4b1ae118-cd96-4e60-997d-9594acff7531-rbac\") pod \"cloudkitty-lokistack-gateway-76cc998948-czjfz\" (UID: \"4b1ae118-cd96-4e60-997d-9594acff7531\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.175163 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/0663f463-0160-4cc2-bad3-389baee708da-tls-secret\") pod \"cloudkitty-lokistack-gateway-76cc998948-nf2wl\" (UID: \"0663f463-0160-4cc2-bad3-389baee708da\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.175206 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b1ae118-cd96-4e60-997d-9594acff7531-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-czjfz\" (UID: \"4b1ae118-cd96-4e60-997d-9594acff7531\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.175234 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b1ae118-cd96-4e60-997d-9594acff7531-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-czjfz\" (UID: \"4b1ae118-cd96-4e60-997d-9594acff7531\") " 
pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.175287 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0663f463-0160-4cc2-bad3-389baee708da-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-nf2wl\" (UID: \"0663f463-0160-4cc2-bad3-389baee708da\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.175336 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/4b1ae118-cd96-4e60-997d-9594acff7531-tls-secret\") pod \"cloudkitty-lokistack-gateway-76cc998948-czjfz\" (UID: \"4b1ae118-cd96-4e60-997d-9594acff7531\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.175360 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/4b1ae118-cd96-4e60-997d-9594acff7531-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-76cc998948-czjfz\" (UID: \"4b1ae118-cd96-4e60-997d-9594acff7531\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.177161 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/4b1ae118-cd96-4e60-997d-9594acff7531-rbac\") pod \"cloudkitty-lokistack-gateway-76cc998948-czjfz\" (UID: \"4b1ae118-cd96-4e60-997d-9594acff7531\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" Oct 08 22:40:09 crc kubenswrapper[4739]: E1008 22:40:09.177366 4739 secret.go:188] Couldn't get secret openstack/cloudkitty-lokistack-gateway-http: secret 
"cloudkitty-lokistack-gateway-http" not found Oct 08 22:40:09 crc kubenswrapper[4739]: E1008 22:40:09.177407 4739 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b1ae118-cd96-4e60-997d-9594acff7531-tls-secret podName:4b1ae118-cd96-4e60-997d-9594acff7531 nodeName:}" failed. No retries permitted until 2025-10-08 22:40:09.677395309 +0000 UTC m=+3109.502781049 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/4b1ae118-cd96-4e60-997d-9594acff7531-tls-secret") pod "cloudkitty-lokistack-gateway-76cc998948-czjfz" (UID: "4b1ae118-cd96-4e60-997d-9594acff7531") : secret "cloudkitty-lokistack-gateway-http" not found Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.178538 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/4b1ae118-cd96-4e60-997d-9594acff7531-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-76cc998948-czjfz\" (UID: \"4b1ae118-cd96-4e60-997d-9594acff7531\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.179059 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/4b1ae118-cd96-4e60-997d-9594acff7531-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-76cc998948-czjfz\" (UID: \"4b1ae118-cd96-4e60-997d-9594acff7531\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.179096 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/4b1ae118-cd96-4e60-997d-9594acff7531-tenants\") pod \"cloudkitty-lokistack-gateway-76cc998948-czjfz\" (UID: \"4b1ae118-cd96-4e60-997d-9594acff7531\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" 
Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.179744 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b1ae118-cd96-4e60-997d-9594acff7531-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-czjfz\" (UID: \"4b1ae118-cd96-4e60-997d-9594acff7531\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.180585 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b1ae118-cd96-4e60-997d-9594acff7531-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-czjfz\" (UID: \"4b1ae118-cd96-4e60-997d-9594acff7531\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.181272 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b1ae118-cd96-4e60-997d-9594acff7531-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-czjfz\" (UID: \"4b1ae118-cd96-4e60-997d-9594acff7531\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.211870 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwgw4\" (UniqueName: \"kubernetes.io/projected/4b1ae118-cd96-4e60-997d-9594acff7531-kube-api-access-wwgw4\") pod \"cloudkitty-lokistack-gateway-76cc998948-czjfz\" (UID: \"4b1ae118-cd96-4e60-997d-9594acff7531\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.265115 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-v9qwb" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.277638 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/0663f463-0160-4cc2-bad3-389baee708da-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-76cc998948-nf2wl\" (UID: \"0663f463-0160-4cc2-bad3-389baee708da\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.277708 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0663f463-0160-4cc2-bad3-389baee708da-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-nf2wl\" (UID: \"0663f463-0160-4cc2-bad3-389baee708da\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.277743 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h44xj\" (UniqueName: \"kubernetes.io/projected/0663f463-0160-4cc2-bad3-389baee708da-kube-api-access-h44xj\") pod \"cloudkitty-lokistack-gateway-76cc998948-nf2wl\" (UID: \"0663f463-0160-4cc2-bad3-389baee708da\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.277768 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/0663f463-0160-4cc2-bad3-389baee708da-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-76cc998948-nf2wl\" (UID: \"0663f463-0160-4cc2-bad3-389baee708da\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.277818 4739 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0663f463-0160-4cc2-bad3-389baee708da-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-nf2wl\" (UID: \"0663f463-0160-4cc2-bad3-389baee708da\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.277873 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/0663f463-0160-4cc2-bad3-389baee708da-tls-secret\") pod \"cloudkitty-lokistack-gateway-76cc998948-nf2wl\" (UID: \"0663f463-0160-4cc2-bad3-389baee708da\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.277927 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0663f463-0160-4cc2-bad3-389baee708da-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-nf2wl\" (UID: \"0663f463-0160-4cc2-bad3-389baee708da\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.277983 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/0663f463-0160-4cc2-bad3-389baee708da-tenants\") pod \"cloudkitty-lokistack-gateway-76cc998948-nf2wl\" (UID: \"0663f463-0160-4cc2-bad3-389baee708da\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.278064 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/0663f463-0160-4cc2-bad3-389baee708da-rbac\") pod \"cloudkitty-lokistack-gateway-76cc998948-nf2wl\" (UID: \"0663f463-0160-4cc2-bad3-389baee708da\") " 
pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.372654 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/0663f463-0160-4cc2-bad3-389baee708da-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-76cc998948-nf2wl\" (UID: \"0663f463-0160-4cc2-bad3-389baee708da\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.525303 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0663f463-0160-4cc2-bad3-389baee708da-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-nf2wl\" (UID: \"0663f463-0160-4cc2-bad3-389baee708da\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.525812 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0663f463-0160-4cc2-bad3-389baee708da-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-nf2wl\" (UID: \"0663f463-0160-4cc2-bad3-389baee708da\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.525986 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/0663f463-0160-4cc2-bad3-389baee708da-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-76cc998948-nf2wl\" (UID: \"0663f463-0160-4cc2-bad3-389baee708da\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.525999 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rbac\" (UniqueName: \"kubernetes.io/configmap/0663f463-0160-4cc2-bad3-389baee708da-rbac\") pod \"cloudkitty-lokistack-gateway-76cc998948-nf2wl\" (UID: \"0663f463-0160-4cc2-bad3-389baee708da\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.526362 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0663f463-0160-4cc2-bad3-389baee708da-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-nf2wl\" (UID: \"0663f463-0160-4cc2-bad3-389baee708da\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.526682 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/0663f463-0160-4cc2-bad3-389baee708da-tls-secret\") pod \"cloudkitty-lokistack-gateway-76cc998948-nf2wl\" (UID: \"0663f463-0160-4cc2-bad3-389baee708da\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.530939 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/0663f463-0160-4cc2-bad3-389baee708da-tenants\") pod \"cloudkitty-lokistack-gateway-76cc998948-nf2wl\" (UID: \"0663f463-0160-4cc2-bad3-389baee708da\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.530983 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h44xj\" (UniqueName: \"kubernetes.io/projected/0663f463-0160-4cc2-bad3-389baee708da-kube-api-access-h44xj\") pod \"cloudkitty-lokistack-gateway-76cc998948-nf2wl\" (UID: \"0663f463-0160-4cc2-bad3-389baee708da\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.690514 4739 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.690839 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/4b1ae118-cd96-4e60-997d-9594acff7531-tls-secret\") pod \"cloudkitty-lokistack-gateway-76cc998948-czjfz\" (UID: \"4b1ae118-cd96-4e60-997d-9594acff7531\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.691950 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.697246 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/4b1ae118-cd96-4e60-997d-9594acff7531-tls-secret\") pod \"cloudkitty-lokistack-gateway-76cc998948-czjfz\" (UID: \"4b1ae118-cd96-4e60-997d-9594acff7531\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.697497 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-grpc" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.697755 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-http" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.710066 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.734702 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.792107 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a613c597-5598-414c-a589-80b245a81ca2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a613c597-5598-414c-a589-80b245a81ca2\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"9d41d47c-0875-4283-908d-559995e5069e\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.792189 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/9d41d47c-0875-4283-908d-559995e5069e-logging-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"9d41d47c-0875-4283-908d-559995e5069e\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.792226 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/9d41d47c-0875-4283-908d-559995e5069e-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"9d41d47c-0875-4283-908d-559995e5069e\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.792261 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8763d1ad-00b3-48c6-8306-1406528d961a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8763d1ad-00b3-48c6-8306-1406528d961a\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"9d41d47c-0875-4283-908d-559995e5069e\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.792293 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d41d47c-0875-4283-908d-559995e5069e-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"9d41d47c-0875-4283-908d-559995e5069e\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.792337 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d41d47c-0875-4283-908d-559995e5069e-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"9d41d47c-0875-4283-908d-559995e5069e\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.792392 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/9d41d47c-0875-4283-908d-559995e5069e-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"9d41d47c-0875-4283-908d-559995e5069e\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.792412 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zv99\" (UniqueName: \"kubernetes.io/projected/9d41d47c-0875-4283-908d-559995e5069e-kube-api-access-2zv99\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"9d41d47c-0875-4283-908d-559995e5069e\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.803065 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.804462 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.806569 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-http" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.808057 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-grpc" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.816140 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.884849 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.886184 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.888080 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-grpc" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.888640 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-http" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.894659 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/02de38f3-1c70-4314-8dd7-4b5612c4348f-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"02de38f3-1c70-4314-8dd7-4b5612c4348f\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.894725 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: 
\"kubernetes.io/secret/9d41d47c-0875-4283-908d-559995e5069e-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"9d41d47c-0875-4283-908d-559995e5069e\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.894750 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7fee7565-6c5a-4eea-bc1a-6b0c10ae2263\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7fee7565-6c5a-4eea-bc1a-6b0c10ae2263\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"02de38f3-1c70-4314-8dd7-4b5612c4348f\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.894772 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zv99\" (UniqueName: \"kubernetes.io/projected/9d41d47c-0875-4283-908d-559995e5069e-kube-api-access-2zv99\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"9d41d47c-0875-4283-908d-559995e5069e\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.894811 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a613c597-5598-414c-a589-80b245a81ca2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a613c597-5598-414c-a589-80b245a81ca2\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"9d41d47c-0875-4283-908d-559995e5069e\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.894844 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/9d41d47c-0875-4283-908d-559995e5069e-logging-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"9d41d47c-0875-4283-908d-559995e5069e\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.894864 
4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02de38f3-1c70-4314-8dd7-4b5612c4348f-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"02de38f3-1c70-4314-8dd7-4b5612c4348f\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.894895 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/9d41d47c-0875-4283-908d-559995e5069e-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"9d41d47c-0875-4283-908d-559995e5069e\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.894915 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02de38f3-1c70-4314-8dd7-4b5612c4348f-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"02de38f3-1c70-4314-8dd7-4b5612c4348f\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.894948 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8763d1ad-00b3-48c6-8306-1406528d961a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8763d1ad-00b3-48c6-8306-1406528d961a\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"9d41d47c-0875-4283-908d-559995e5069e\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.894969 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/02de38f3-1c70-4314-8dd7-4b5612c4348f-logging-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: 
\"02de38f3-1c70-4314-8dd7-4b5612c4348f\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.895002 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d41d47c-0875-4283-908d-559995e5069e-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"9d41d47c-0875-4283-908d-559995e5069e\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.895033 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5mgq\" (UniqueName: \"kubernetes.io/projected/02de38f3-1c70-4314-8dd7-4b5612c4348f-kube-api-access-k5mgq\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"02de38f3-1c70-4314-8dd7-4b5612c4348f\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.895065 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d41d47c-0875-4283-908d-559995e5069e-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"9d41d47c-0875-4283-908d-559995e5069e\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.895091 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/02de38f3-1c70-4314-8dd7-4b5612c4348f-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"02de38f3-1c70-4314-8dd7-4b5612c4348f\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.898568 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9d41d47c-0875-4283-908d-559995e5069e-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"9d41d47c-0875-4283-908d-559995e5069e\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.898735 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d41d47c-0875-4283-908d-559995e5069e-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"9d41d47c-0875-4283-908d-559995e5069e\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.902202 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/9d41d47c-0875-4283-908d-559995e5069e-logging-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"9d41d47c-0875-4283-908d-559995e5069e\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.902937 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/9d41d47c-0875-4283-908d-559995e5069e-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"9d41d47c-0875-4283-908d-559995e5069e\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.903454 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/9d41d47c-0875-4283-908d-559995e5069e-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"9d41d47c-0875-4283-908d-559995e5069e\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.920202 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.920973 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zv99\" (UniqueName: \"kubernetes.io/projected/9d41d47c-0875-4283-908d-559995e5069e-kube-api-access-2zv99\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"9d41d47c-0875-4283-908d-559995e5069e\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.928728 4739 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.928780 4739 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a613c597-5598-414c-a589-80b245a81ca2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a613c597-5598-414c-a589-80b245a81ca2\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"9d41d47c-0875-4283-908d-559995e5069e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/dea59b7f51b3b26ff65f361921458bbcb84463a5c3c0ad1e8bf48af8c65804ec/globalmount\"" pod="openstack/cloudkitty-lokistack-ingester-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.931677 4739 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.931725 4739 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8763d1ad-00b3-48c6-8306-1406528d961a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8763d1ad-00b3-48c6-8306-1406528d961a\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"9d41d47c-0875-4283-908d-559995e5069e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bb79052feaf135c72cbe3af74625e5b2e2df38d8a0c7d66599f6e89193a752eb/globalmount\"" pod="openstack/cloudkitty-lokistack-ingester-0" Oct 08 22:40:09 crc kubenswrapper[4739]: I1008 22:40:09.969211 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.001437 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02de38f3-1c70-4314-8dd7-4b5612c4348f-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"02de38f3-1c70-4314-8dd7-4b5612c4348f\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.001652 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02de38f3-1c70-4314-8dd7-4b5612c4348f-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"02de38f3-1c70-4314-8dd7-4b5612c4348f\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.001727 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/02de38f3-1c70-4314-8dd7-4b5612c4348f-logging-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"02de38f3-1c70-4314-8dd7-4b5612c4348f\") " 
pod="openstack/cloudkitty-lokistack-compactor-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.001816 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5mgq\" (UniqueName: \"kubernetes.io/projected/02de38f3-1c70-4314-8dd7-4b5612c4348f-kube-api-access-k5mgq\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"02de38f3-1c70-4314-8dd7-4b5612c4348f\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.001952 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/02de38f3-1c70-4314-8dd7-4b5612c4348f-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"02de38f3-1c70-4314-8dd7-4b5612c4348f\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.002002 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/02de38f3-1c70-4314-8dd7-4b5612c4348f-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"02de38f3-1c70-4314-8dd7-4b5612c4348f\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.002139 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7fee7565-6c5a-4eea-bc1a-6b0c10ae2263\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7fee7565-6c5a-4eea-bc1a-6b0c10ae2263\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"02de38f3-1c70-4314-8dd7-4b5612c4348f\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.004912 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/02de38f3-1c70-4314-8dd7-4b5612c4348f-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"02de38f3-1c70-4314-8dd7-4b5612c4348f\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.009658 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a613c597-5598-414c-a589-80b245a81ca2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a613c597-5598-414c-a589-80b245a81ca2\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"9d41d47c-0875-4283-908d-559995e5069e\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.010474 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02de38f3-1c70-4314-8dd7-4b5612c4348f-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"02de38f3-1c70-4314-8dd7-4b5612c4348f\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.013503 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/02de38f3-1c70-4314-8dd7-4b5612c4348f-logging-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"02de38f3-1c70-4314-8dd7-4b5612c4348f\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.018340 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/02de38f3-1c70-4314-8dd7-4b5612c4348f-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"02de38f3-1c70-4314-8dd7-4b5612c4348f\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.022962 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/02de38f3-1c70-4314-8dd7-4b5612c4348f-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"02de38f3-1c70-4314-8dd7-4b5612c4348f\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.028171 4739 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.028310 4739 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7fee7565-6c5a-4eea-bc1a-6b0c10ae2263\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7fee7565-6c5a-4eea-bc1a-6b0c10ae2263\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"02de38f3-1c70-4314-8dd7-4b5612c4348f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/35e457891f48da447059b215f9dc4d4251ce06a80d5c9bd00da897262622c3e1/globalmount\"" pod="openstack/cloudkitty-lokistack-compactor-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.041127 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5mgq\" (UniqueName: \"kubernetes.io/projected/02de38f3-1c70-4314-8dd7-4b5612c4348f-kube-api-access-k5mgq\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"02de38f3-1c70-4314-8dd7-4b5612c4348f\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.072641 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8763d1ad-00b3-48c6-8306-1406528d961a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8763d1ad-00b3-48c6-8306-1406528d961a\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"9d41d47c-0875-4283-908d-559995e5069e\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.088592 4739 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7fee7565-6c5a-4eea-bc1a-6b0c10ae2263\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7fee7565-6c5a-4eea-bc1a-6b0c10ae2263\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"02de38f3-1c70-4314-8dd7-4b5612c4348f\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.111521 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.111596 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.111646 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkf8p\" (UniqueName: \"kubernetes.io/projected/30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4-kube-api-access-rkf8p\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.111823 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4-logging-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" 
(UID: \"30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.112016 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.112240 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.112323 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4242a073-d629-4599-9df4-e18ca4490c07\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4242a073-d629-4599-9df4-e18ca4490c07\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.168761 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.219395 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4-logging-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.219915 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.219973 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.220008 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4242a073-d629-4599-9df4-e18ca4490c07\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4242a073-d629-4599-9df4-e18ca4490c07\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.220051 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.220074 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.220109 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkf8p\" (UniqueName: \"kubernetes.io/projected/30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4-kube-api-access-rkf8p\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.235759 4739 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.235807 4739 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4242a073-d629-4599-9df4-e18ca4490c07\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4242a073-d629-4599-9df4-e18ca4490c07\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b1993440bc7fe367775917d0f5b8ebc3a1f2a0de84a08e06cae36116553cb34f/globalmount\"" pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.311291 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.570642 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.571066 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.575331 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: 
\"30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.575398 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkf8p\" (UniqueName: \"kubernetes.io/projected/30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4-kube-api-access-rkf8p\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.584510 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.585179 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4-logging-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.607566 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4242a073-d629-4599-9df4-e18ca4490c07\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4242a073-d629-4599-9df4-e18ca4490c07\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 08 22:40:10 crc kubenswrapper[4739]: I1008 22:40:10.701365 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 08 22:40:15 crc kubenswrapper[4739]: I1008 22:40:15.502289 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-779849886d-v9qwb"] Oct 08 22:40:15 crc kubenswrapper[4739]: W1008 22:40:15.529891 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod137f65de_3030_4c4b_a087_c547dd183105.slice/crio-b03812f3b622767e4c549eed2ef5d1131a70a64b09fe1f17e7c147626b0c8811 WatchSource:0}: Error finding container b03812f3b622767e4c549eed2ef5d1131a70a64b09fe1f17e7c147626b0c8811: Status 404 returned error can't find the container with id b03812f3b622767e4c549eed2ef5d1131a70a64b09fe1f17e7c147626b0c8811 Oct 08 22:40:15 crc kubenswrapper[4739]: I1008 22:40:15.941504 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl"] Oct 08 22:40:15 crc kubenswrapper[4739]: I1008 22:40:15.959697 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz"] Oct 08 22:40:15 crc kubenswrapper[4739]: I1008 22:40:15.976411 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Oct 08 22:40:15 crc kubenswrapper[4739]: W1008 22:40:15.986871 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d41d47c_0875_4283_908d_559995e5069e.slice/crio-7f92e3e755775096ca2ade264cb3440be5e265145d20d9eacf6fbcb7b093d2f1 WatchSource:0}: Error finding container 7f92e3e755775096ca2ade264cb3440be5e265145d20d9eacf6fbcb7b093d2f1: Status 404 returned error can't find the container with id 7f92e3e755775096ca2ade264cb3440be5e265145d20d9eacf6fbcb7b093d2f1 Oct 08 22:40:15 crc kubenswrapper[4739]: I1008 22:40:15.992533 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cloudkitty-lokistack-ingester-0"] Oct 08 22:40:16 crc kubenswrapper[4739]: W1008 22:40:16.002485 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8016740_3857_4c88_81a3_6ee47b7e2a75.slice/crio-e79562ee503a2bd72f428300745566e097a738540b2b64c252c8cff56f5a960b WatchSource:0}: Error finding container e79562ee503a2bd72f428300745566e097a738540b2b64c252c8cff56f5a960b: Status 404 returned error can't find the container with id e79562ee503a2bd72f428300745566e097a738540b2b64c252c8cff56f5a960b Oct 08 22:40:16 crc kubenswrapper[4739]: I1008 22:40:16.003262 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-68bbd7984c-65fx4"] Oct 08 22:40:16 crc kubenswrapper[4739]: I1008 22:40:16.027445 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Oct 08 22:40:16 crc kubenswrapper[4739]: I1008 22:40:16.094331 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-56cd74f89f-5dccl"] Oct 08 22:40:16 crc kubenswrapper[4739]: I1008 22:40:16.330258 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"02de38f3-1c70-4314-8dd7-4b5612c4348f","Type":"ContainerStarted","Data":"6d48d7b4571d55eda23995d1c7142ae237a062aaf30780997c4ae061468d80c2"} Oct 08 22:40:16 crc kubenswrapper[4739]: I1008 22:40:16.389547 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-5dccl" event={"ID":"cd624632-67d1-48e1-8c43-fa58f5d2e5ea","Type":"ContainerStarted","Data":"a00d65eac2c0a89ce7814d978c8ddfe3a23d4bda28ebc5a8a1bd3ad553698c7c"} Oct 08 22:40:16 crc kubenswrapper[4739]: I1008 22:40:16.444612 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-649bd47b54-74dfs" 
event={"ID":"ab6cc895-0aa3-49a5-bec3-38efa4dd348f","Type":"ContainerStarted","Data":"8aae638b4f9c3b2a2792787bc71b7c9751c2e7a4f0ef689821df99ee471365ee"} Oct 08 22:40:16 crc kubenswrapper[4739]: I1008 22:40:16.446598 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-649bd47b54-74dfs" Oct 08 22:40:16 crc kubenswrapper[4739]: I1008 22:40:16.465887 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-649bd47b54-74dfs" Oct 08 22:40:16 crc kubenswrapper[4739]: I1008 22:40:16.472320 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb8e455e-24e5-465b-ae83-87d51e01eb6a","Type":"ContainerStarted","Data":"49cf330a86727ee6d73bacee6231512c6b3626eea160231e3de76e316eef10e1"} Oct 08 22:40:16 crc kubenswrapper[4739]: I1008 22:40:16.483004 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4","Type":"ContainerStarted","Data":"4678180e03f317aa9cd07932b81076eed5436d9ce1be56396ae5359008d7ea2f"} Oct 08 22:40:16 crc kubenswrapper[4739]: I1008 22:40:16.484440 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-649bd47b54-74dfs" podStartSLOduration=2.800630708 podStartE2EDuration="23.484419737s" podCreationTimestamp="2025-10-08 22:39:53 +0000 UTC" firstStartedPulling="2025-10-08 22:39:54.337683263 +0000 UTC m=+3094.163069013" lastFinishedPulling="2025-10-08 22:40:15.021472292 +0000 UTC m=+3114.846858042" observedRunningTime="2025-10-08 22:40:16.483550956 +0000 UTC m=+3116.308936706" watchObservedRunningTime="2025-10-08 22:40:16.484419737 +0000 UTC m=+3116.309805487" Oct 08 22:40:16 crc kubenswrapper[4739]: I1008 22:40:16.486037 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/alertmanager-metric-storage-0" event={"ID":"45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a","Type":"ContainerStarted","Data":"bb65f17fd777ef30804600b2972f3fea1c71bc513a8dc3cebcbd1d05636ec3f0"} Oct 08 22:40:16 crc kubenswrapper[4739]: I1008 22:40:16.491999 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"9d41d47c-0875-4283-908d-559995e5069e","Type":"ContainerStarted","Data":"7f92e3e755775096ca2ade264cb3440be5e265145d20d9eacf6fbcb7b093d2f1"} Oct 08 22:40:16 crc kubenswrapper[4739]: I1008 22:40:16.499226 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" event={"ID":"0663f463-0160-4cc2-bad3-389baee708da","Type":"ContainerStarted","Data":"63e776ffbd4602d1ec1ab4ef3ade0c88194640c7e3c0d30a554d40fada544b93"} Oct 08 22:40:16 crc kubenswrapper[4739]: I1008 22:40:16.500560 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-65fx4" event={"ID":"f8016740-3857-4c88-81a3-6ee47b7e2a75","Type":"ContainerStarted","Data":"e79562ee503a2bd72f428300745566e097a738540b2b64c252c8cff56f5a960b"} Oct 08 22:40:16 crc kubenswrapper[4739]: I1008 22:40:16.501688 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-v9qwb" event={"ID":"137f65de-3030-4c4b-a087-c547dd183105","Type":"ContainerStarted","Data":"b03812f3b622767e4c549eed2ef5d1131a70a64b09fe1f17e7c147626b0c8811"} Oct 08 22:40:16 crc kubenswrapper[4739]: I1008 22:40:16.503293 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" event={"ID":"4b1ae118-cd96-4e60-997d-9594acff7531","Type":"ContainerStarted","Data":"82c461eccf012506071d33957fcd299f1cf82ca76a763cf838ffafa488febac6"} Oct 08 22:40:16 crc kubenswrapper[4739]: I1008 22:40:16.733174 4739 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cloudkitty-bcf2-account-create-ssjjf"] Oct 08 22:40:16 crc kubenswrapper[4739]: I1008 22:40:16.736836 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-bcf2-account-create-ssjjf" Oct 08 22:40:16 crc kubenswrapper[4739]: I1008 22:40:16.738895 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-db-secret" Oct 08 22:40:16 crc kubenswrapper[4739]: I1008 22:40:16.759287 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-bcf2-account-create-ssjjf"] Oct 08 22:40:16 crc kubenswrapper[4739]: I1008 22:40:16.875372 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jddsp\" (UniqueName: \"kubernetes.io/projected/02b36e40-d0aa-4580-b3af-ca16a64477d4-kube-api-access-jddsp\") pod \"cloudkitty-bcf2-account-create-ssjjf\" (UID: \"02b36e40-d0aa-4580-b3af-ca16a64477d4\") " pod="openstack/cloudkitty-bcf2-account-create-ssjjf" Oct 08 22:40:16 crc kubenswrapper[4739]: I1008 22:40:16.977853 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jddsp\" (UniqueName: \"kubernetes.io/projected/02b36e40-d0aa-4580-b3af-ca16a64477d4-kube-api-access-jddsp\") pod \"cloudkitty-bcf2-account-create-ssjjf\" (UID: \"02b36e40-d0aa-4580-b3af-ca16a64477d4\") " pod="openstack/cloudkitty-bcf2-account-create-ssjjf" Oct 08 22:40:17 crc kubenswrapper[4739]: I1008 22:40:17.005023 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jddsp\" (UniqueName: \"kubernetes.io/projected/02b36e40-d0aa-4580-b3af-ca16a64477d4-kube-api-access-jddsp\") pod \"cloudkitty-bcf2-account-create-ssjjf\" (UID: \"02b36e40-d0aa-4580-b3af-ca16a64477d4\") " pod="openstack/cloudkitty-bcf2-account-create-ssjjf" Oct 08 22:40:17 crc kubenswrapper[4739]: I1008 22:40:17.055105 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-bcf2-account-create-ssjjf" Oct 08 22:40:17 crc kubenswrapper[4739]: I1008 22:40:17.567985 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb8e455e-24e5-465b-ae83-87d51e01eb6a","Type":"ContainerStarted","Data":"361eadd3c18cf7f9b20671dba908acc98886dd9bda46428a582697ae70b0316a"} Oct 08 22:40:17 crc kubenswrapper[4739]: I1008 22:40:17.616784 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-bcf2-account-create-ssjjf"] Oct 08 22:40:17 crc kubenswrapper[4739]: W1008 22:40:17.632006 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02b36e40_d0aa_4580_b3af_ca16a64477d4.slice/crio-6d18495643e8bb2ca09b4ead11af23777748c45f17757e17ef4a39c57bd23206 WatchSource:0}: Error finding container 6d18495643e8bb2ca09b4ead11af23777748c45f17757e17ef4a39c57bd23206: Status 404 returned error can't find the container with id 6d18495643e8bb2ca09b4ead11af23777748c45f17757e17ef4a39c57bd23206 Oct 08 22:40:18 crc kubenswrapper[4739]: I1008 22:40:18.603974 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-bcf2-account-create-ssjjf" event={"ID":"02b36e40-d0aa-4580-b3af-ca16a64477d4","Type":"ContainerStarted","Data":"6d18495643e8bb2ca09b4ead11af23777748c45f17757e17ef4a39c57bd23206"} Oct 08 22:40:19 crc kubenswrapper[4739]: I1008 22:40:19.621499 4739 generic.go:334] "Generic (PLEG): container finished" podID="02b36e40-d0aa-4580-b3af-ca16a64477d4" containerID="299c25e643c0b91c5f4c3824f5fbfa4a7d6df8eb6235330680f74f9eece4daaa" exitCode=0 Oct 08 22:40:19 crc kubenswrapper[4739]: I1008 22:40:19.621555 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-bcf2-account-create-ssjjf" event={"ID":"02b36e40-d0aa-4580-b3af-ca16a64477d4","Type":"ContainerDied","Data":"299c25e643c0b91c5f4c3824f5fbfa4a7d6df8eb6235330680f74f9eece4daaa"} Oct 08 
22:40:19 crc kubenswrapper[4739]: I1008 22:40:19.654050 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a","Type":"ContainerStarted","Data":"52138f034bdb985ba072fda193b1b88536006aa074fc790f8e957dbdbc1f050b"} Oct 08 22:40:19 crc kubenswrapper[4739]: I1008 22:40:19.655709 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Oct 08 22:40:19 crc kubenswrapper[4739]: I1008 22:40:19.661902 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Oct 08 22:40:19 crc kubenswrapper[4739]: I1008 22:40:19.725787 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=6.65584891 podStartE2EDuration="28.725764797s" podCreationTimestamp="2025-10-08 22:39:51 +0000 UTC" firstStartedPulling="2025-10-08 22:39:52.95135219 +0000 UTC m=+3092.776737940" lastFinishedPulling="2025-10-08 22:40:15.021268067 +0000 UTC m=+3114.846653827" observedRunningTime="2025-10-08 22:40:19.68324165 +0000 UTC m=+3119.508627400" watchObservedRunningTime="2025-10-08 22:40:19.725764797 +0000 UTC m=+3119.551150547" Oct 08 22:40:21 crc kubenswrapper[4739]: I1008 22:40:21.766183 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:40:21 crc kubenswrapper[4739]: I1008 22:40:21.766586 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Oct 08 22:40:21 crc kubenswrapper[4739]: I1008 22:40:21.768297 4739 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" Oct 08 22:40:21 crc kubenswrapper[4739]: I1008 22:40:21.770677 4739 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ddb1d63bba56e73b176d2b1e90888b358b74bba2bf444f71cf0db8fe5e95ed26"} pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 22:40:21 crc kubenswrapper[4739]: I1008 22:40:21.770890 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" containerID="cri-o://ddb1d63bba56e73b176d2b1e90888b358b74bba2bf444f71cf0db8fe5e95ed26" gracePeriod=600 Oct 08 22:40:22 crc kubenswrapper[4739]: I1008 22:40:22.712688 4739 generic.go:334] "Generic (PLEG): container finished" podID="9707b708-016c-4e06-86db-0332e2ca37db" containerID="ddb1d63bba56e73b176d2b1e90888b358b74bba2bf444f71cf0db8fe5e95ed26" exitCode=0 Oct 08 22:40:22 crc kubenswrapper[4739]: I1008 22:40:22.712815 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerDied","Data":"ddb1d63bba56e73b176d2b1e90888b358b74bba2bf444f71cf0db8fe5e95ed26"} Oct 08 22:40:22 crc kubenswrapper[4739]: I1008 22:40:22.713044 4739 scope.go:117] "RemoveContainer" containerID="b8557085ba70a8abc73279fad5e64c6b452e36ad1a22da0dfbf016f1eef65e90" Oct 08 22:40:28 crc kubenswrapper[4739]: I1008 22:40:28.282282 4739 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" 
cgroupName=["kubepods","besteffort","podb2820d1f-6f24-4b4b-bde2-6474a45fdbff"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podb2820d1f-6f24-4b4b-bde2-6474a45fdbff] : Timed out while waiting for systemd to remove kubepods-besteffort-podb2820d1f_6f24_4b4b_bde2_6474a45fdbff.slice" Oct 08 22:40:31 crc kubenswrapper[4739]: E1008 22:40:31.670194 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:40:31 crc kubenswrapper[4739]: I1008 22:40:31.790783 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-bcf2-account-create-ssjjf" Oct 08 22:40:31 crc kubenswrapper[4739]: I1008 22:40:31.857980 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-bcf2-account-create-ssjjf" Oct 08 22:40:31 crc kubenswrapper[4739]: I1008 22:40:31.858304 4739 scope.go:117] "RemoveContainer" containerID="ddb1d63bba56e73b176d2b1e90888b358b74bba2bf444f71cf0db8fe5e95ed26" Oct 08 22:40:31 crc kubenswrapper[4739]: E1008 22:40:31.858550 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:40:31 crc kubenswrapper[4739]: I1008 22:40:31.859985 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jddsp\" (UniqueName: \"kubernetes.io/projected/02b36e40-d0aa-4580-b3af-ca16a64477d4-kube-api-access-jddsp\") pod \"02b36e40-d0aa-4580-b3af-ca16a64477d4\" (UID: \"02b36e40-d0aa-4580-b3af-ca16a64477d4\") " Oct 08 22:40:31 crc kubenswrapper[4739]: I1008 22:40:31.860823 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-bcf2-account-create-ssjjf" event={"ID":"02b36e40-d0aa-4580-b3af-ca16a64477d4","Type":"ContainerDied","Data":"6d18495643e8bb2ca09b4ead11af23777748c45f17757e17ef4a39c57bd23206"} Oct 08 22:40:31 crc kubenswrapper[4739]: I1008 22:40:31.860860 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d18495643e8bb2ca09b4ead11af23777748c45f17757e17ef4a39c57bd23206" Oct 08 22:40:31 crc kubenswrapper[4739]: I1008 22:40:31.875126 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02b36e40-d0aa-4580-b3af-ca16a64477d4-kube-api-access-jddsp" (OuterVolumeSpecName: "kube-api-access-jddsp") pod "02b36e40-d0aa-4580-b3af-ca16a64477d4" (UID: 
"02b36e40-d0aa-4580-b3af-ca16a64477d4"). InnerVolumeSpecName "kube-api-access-jddsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:40:31 crc kubenswrapper[4739]: I1008 22:40:31.962772 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jddsp\" (UniqueName: \"kubernetes.io/projected/02b36e40-d0aa-4580-b3af-ca16a64477d4-kube-api-access-jddsp\") on node \"crc\" DevicePath \"\"" Oct 08 22:40:32 crc kubenswrapper[4739]: E1008 22:40:32.473684 4739 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:710a1a5e486de5724469e55f29e9ff3f6cbef8cd4b2d21dfe254ede2b953c150" Oct 08 22:40:32 crc kubenswrapper[4739]: E1008 22:40:32.474262 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:gateway,Image:registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:710a1a5e486de5724469e55f29e9ff3f6cbef8cd4b2d21dfe254ede2b953c150,Command:[],Args:[--debug.name=lokistack-gateway --web.listen=0.0.0.0:8080 --web.internal.listen=0.0.0.0:8081 --web.healthchecks.url=https://localhost:8080 --log.level=warn --logs.read.endpoint=https://cloudkitty-lokistack-query-frontend-http.openstack.svc.cluster.local:3100 --logs.tail.endpoint=https://cloudkitty-lokistack-query-frontend-http.openstack.svc.cluster.local:3100 --logs.write.endpoint=https://cloudkitty-lokistack-distributor-http.openstack.svc.cluster.local:3100 --logs.write-timeout=4m0s --rbac.config=/etc/lokistack-gateway/rbac.yaml --tenants.config=/etc/lokistack-gateway/tenants.yaml --server.read-timeout=48s --server.write-timeout=6m0s --tls.min-version=VersionTLS12 --tls.server.cert-file=/var/run/tls/http/server/tls.crt --tls.server.key-file=/var/run/tls/http/server/tls.key --tls.healthchecks.server-ca-file=/var/run/ca/server/service-ca.crt 
--tls.healthchecks.server-name=cloudkitty-lokistack-gateway-http.openstack.svc.cluster.local --tls.internal.server.cert-file=/var/run/tls/http/server/tls.crt --tls.internal.server.key-file=/var/run/tls/http/server/tls.key --tls.min-version=VersionTLS12 --tls.cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --logs.tls.ca-file=/var/run/ca/upstream/service-ca.crt --logs.tls.cert-file=/var/run/tls/http/upstream/tls.crt --logs.tls.key-file=/var/run/tls/http/upstream/tls.key --tls.client-auth-type=RequestClientCert],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},ContainerPort{Name:public,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rbac,ReadOnly:true,MountPath:/etc/lokistack-gateway/rbac.yaml,SubPath:rbac.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tenants,ReadOnly:true,MountPath:/etc/lokistack-gateway/tenants.yaml,SubPath:tenants.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:lokistack-gateway,ReadOnly:true,MountPath:/etc/lokistack-gateway/lokistack-gateway.rego,SubPath:lokistack-gateway.rego,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tls-secret,ReadOnly:true,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-gateway-client-http,ReadOnly:true,MountPath:/var/run/tls/http/upstream,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudki
tty-lokistack-ca-bundle,ReadOnly:true,MountPath:/var/run/ca/upstream,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-gateway-ca-bundle,ReadOnly:true,MountPath:/var/run/ca/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-ca-bundle,ReadOnly:false,MountPath:/var/run/tenants-ca/cloudkitty,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wwgw4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/live,Port:{0 8081 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 8081 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:12,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-gateway-76cc998948-czjfz_openstack(4b1ae118-cd96-4e60-997d-9594acff7531): ErrImagePull: rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 08 22:40:32 crc kubenswrapper[4739]: E1008 22:40:32.475494 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"gateway\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" podUID="4b1ae118-cd96-4e60-997d-9594acff7531" Oct 08 22:40:32 crc kubenswrapper[4739]: E1008 22:40:32.529120 4739 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:14f37195a4957e3848690d0ffe5422be55f7599b30dfe1ee0f97eb1118a10a51" Oct 08 22:40:32 crc kubenswrapper[4739]: E1008 22:40:32.529380 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:loki-query-frontend,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:14f37195a4957e3848690d0ffe5422be55f7599b30dfe1ee0f97eb1118a10a51,Command:[],Args:[-target=query-frontend -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml 
-config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-query-frontend-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-query-frontend-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d8dd6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-query-frontend-779849886d-v9qwb_openstack(137f65de-3030-4c4b-a087-c547dd183105): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 08 22:40:32 crc kubenswrapper[4739]: E1008 22:40:32.531462 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-query-frontend\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-v9qwb" podUID="137f65de-3030-4c4b-a087-c547dd183105" Oct 08 22:40:32 crc kubenswrapper[4739]: E1008 22:40:32.538921 4739 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:710a1a5e486de5724469e55f29e9ff3f6cbef8cd4b2d21dfe254ede2b953c150" Oct 08 22:40:32 crc kubenswrapper[4739]: E1008 22:40:32.539131 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:gateway,Image:registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:710a1a5e486de5724469e55f29e9ff3f6cbef8cd4b2d21dfe254ede2b953c150,Command:[],Args:[--debug.name=lokistack-gateway --web.listen=0.0.0.0:8080 --web.internal.listen=0.0.0.0:8081 --web.healthchecks.url=https://localhost:8080 --log.level=warn --logs.read.endpoint=https://cloudkitty-lokistack-query-frontend-http.openstack.svc.cluster.local:3100 --logs.tail.endpoint=https://cloudkitty-lokistack-query-frontend-http.openstack.svc.cluster.local:3100 --logs.write.endpoint=https://cloudkitty-lokistack-distributor-http.openstack.svc.cluster.local:3100 --logs.write-timeout=4m0s --rbac.config=/etc/lokistack-gateway/rbac.yaml --tenants.config=/etc/lokistack-gateway/tenants.yaml --server.read-timeout=48s --server.write-timeout=6m0s --tls.min-version=VersionTLS12 --tls.server.cert-file=/var/run/tls/http/server/tls.crt --tls.server.key-file=/var/run/tls/http/server/tls.key --tls.healthchecks.server-ca-file=/var/run/ca/server/service-ca.crt --tls.healthchecks.server-name=cloudkitty-lokistack-gateway-http.openstack.svc.cluster.local --tls.internal.server.cert-file=/var/run/tls/http/server/tls.crt --tls.internal.server.key-file=/var/run/tls/http/server/tls.key --tls.min-version=VersionTLS12 --tls.cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --logs.tls.ca-file=/var/run/ca/upstream/service-ca.crt --logs.tls.cert-file=/var/run/tls/http/upstream/tls.crt --logs.tls.key-file=/var/run/tls/http/upstream/tls.key 
--tls.client-auth-type=RequestClientCert],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},ContainerPort{Name:public,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rbac,ReadOnly:true,MountPath:/etc/lokistack-gateway/rbac.yaml,SubPath:rbac.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tenants,ReadOnly:true,MountPath:/etc/lokistack-gateway/tenants.yaml,SubPath:tenants.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:lokistack-gateway,ReadOnly:true,MountPath:/etc/lokistack-gateway/lokistack-gateway.rego,SubPath:lokistack-gateway.rego,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tls-secret,ReadOnly:true,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-gateway-client-http,ReadOnly:true,MountPath:/var/run/tls/http/upstream,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:true,MountPath:/var/run/ca/upstream,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-gateway-ca-bundle,ReadOnly:true,MountPath:/var/run/ca/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-ca-bundle,ReadOnly:false,MountPath:/var/run/tenants-ca/cloudkitty,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h44xj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/live,Port:{0 8081 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 8081 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:12,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-gateway-76cc998948-nf2wl_openstack(0663f463-0160-4cc2-bad3-389baee708da): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 08 22:40:32 crc kubenswrapper[4739]: E1008 22:40:32.540294 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"gateway\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" podUID="0663f463-0160-4cc2-bad3-389baee708da" Oct 08 22:40:32 crc kubenswrapper[4739]: E1008 22:40:32.886229 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"gateway\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:710a1a5e486de5724469e55f29e9ff3f6cbef8cd4b2d21dfe254ede2b953c150\\\"\"" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" podUID="4b1ae118-cd96-4e60-997d-9594acff7531" Oct 08 22:40:32 crc kubenswrapper[4739]: E1008 22:40:32.886743 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-query-frontend\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:14f37195a4957e3848690d0ffe5422be55f7599b30dfe1ee0f97eb1118a10a51\\\"\"" pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-v9qwb" podUID="137f65de-3030-4c4b-a087-c547dd183105" Oct 08 22:40:32 crc kubenswrapper[4739]: E1008 22:40:32.895720 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"gateway\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:710a1a5e486de5724469e55f29e9ff3f6cbef8cd4b2d21dfe254ede2b953c150\\\"\"" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" podUID="0663f463-0160-4cc2-bad3-389baee708da" Oct 08 22:40:33 crc kubenswrapper[4739]: E1008 22:40:33.165888 4739 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified" Oct 08 22:40:33 crc kubenswrapper[4739]: E1008 22:40:33.166047 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:proxy-httpd,Image:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mvbnm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(bb8e455e-24e5-465b-ae83-87d51e01eb6a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 22:40:33 crc kubenswrapper[4739]: E1008 22:40:33.167367 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="bb8e455e-24e5-465b-ae83-87d51e01eb6a" Oct 08 22:40:33 crc kubenswrapper[4739]: I1008 22:40:33.605689 4739 scope.go:117] "RemoveContainer" containerID="63a0476c7d88447a92314b52215fdd25104ceb323c7d4dac4afc199b63981b9c" Oct 08 22:40:33 crc kubenswrapper[4739]: I1008 22:40:33.647923 4739 scope.go:117] "RemoveContainer" containerID="728148e59b4f1beba5ec8102d7fce08f597f9b40b0c9bde4b278ce4a5a13f936" Oct 08 22:40:33 crc kubenswrapper[4739]: I1008 22:40:33.684906 4739 scope.go:117] "RemoveContainer" containerID="a5f45cc1ecd4316212b555383d1e27239456617a70321e7cc042b6c34dce6d5a" Oct 08 22:40:33 crc kubenswrapper[4739]: I1008 22:40:33.903131 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"02de38f3-1c70-4314-8dd7-4b5612c4348f","Type":"ContainerStarted","Data":"b1fee09803451ebc9862ee63b3e682a6f523b1d818c8a1b6522e665bb71c2aaf"} Oct 08 22:40:33 crc kubenswrapper[4739]: I1008 22:40:33.903673 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-compactor-0" Oct 08 22:40:33 crc kubenswrapper[4739]: I1008 22:40:33.906139 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-5dccl" event={"ID":"cd624632-67d1-48e1-8c43-fa58f5d2e5ea","Type":"ContainerStarted","Data":"e48b6eb26abc7be3ab522e211d847f9cf279d9a8b64763b5117d18d291031d6c"} Oct 08 22:40:33 crc kubenswrapper[4739]: I1008 22:40:33.907364 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-5dccl" Oct 08 22:40:33 crc kubenswrapper[4739]: I1008 22:40:33.911125 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4","Type":"ContainerStarted","Data":"c655a78bfbfec6bae62066e4851916348b0c3393c9857399e93eca08bfb90c16"} Oct 08 22:40:33 crc kubenswrapper[4739]: I1008 22:40:33.911364 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 08 22:40:33 crc kubenswrapper[4739]: I1008 22:40:33.914430 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-65fx4" event={"ID":"f8016740-3857-4c88-81a3-6ee47b7e2a75","Type":"ContainerStarted","Data":"518d24f1ae84d19abe2caf4d3f6fc96d575151b6c5307033a1fec605e06e04b7"} Oct 08 22:40:33 crc kubenswrapper[4739]: I1008 22:40:33.914727 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-65fx4" Oct 08 22:40:33 crc kubenswrapper[4739]: I1008 
22:40:33.925412 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0bc93384-c08a-4c7f-9dc4-318126297a8b","Type":"ContainerStarted","Data":"f9bfe8ae20cfd26f16d80b7f273fae22cccc7329c52ce5e90180ccdcb0ac281f"} Oct 08 22:40:33 crc kubenswrapper[4739]: I1008 22:40:33.931876 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"9d41d47c-0875-4283-908d-559995e5069e","Type":"ContainerStarted","Data":"e35c1dde7f939b3f59ae5652dd58713198ef51d7ac923b6d0da5072eefe3b1d5"} Oct 08 22:40:33 crc kubenswrapper[4739]: I1008 22:40:33.932161 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-ingester-0" Oct 08 22:40:33 crc kubenswrapper[4739]: E1008 22:40:33.933278 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"proxy-httpd\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified\\\"\"" pod="openstack/ceilometer-0" podUID="bb8e455e-24e5-465b-ae83-87d51e01eb6a" Oct 08 22:40:33 crc kubenswrapper[4739]: I1008 22:40:33.960537 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-compactor-0" podStartSLOduration=8.68902465 podStartE2EDuration="25.960514055s" podCreationTimestamp="2025-10-08 22:40:08 +0000 UTC" firstStartedPulling="2025-10-08 22:40:16.02659778 +0000 UTC m=+3115.851983530" lastFinishedPulling="2025-10-08 22:40:33.298087185 +0000 UTC m=+3133.123472935" observedRunningTime="2025-10-08 22:40:33.925476074 +0000 UTC m=+3133.750861834" watchObservedRunningTime="2025-10-08 22:40:33.960514055 +0000 UTC m=+3133.785899795" Oct 08 22:40:33 crc kubenswrapper[4739]: I1008 22:40:33.962034 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-index-gateway-0" podStartSLOduration=8.719980789 
podStartE2EDuration="25.962022872s" podCreationTimestamp="2025-10-08 22:40:08 +0000 UTC" firstStartedPulling="2025-10-08 22:40:16.025067761 +0000 UTC m=+3115.850453501" lastFinishedPulling="2025-10-08 22:40:33.267109814 +0000 UTC m=+3133.092495584" observedRunningTime="2025-10-08 22:40:33.958611739 +0000 UTC m=+3133.783997489" watchObservedRunningTime="2025-10-08 22:40:33.962022872 +0000 UTC m=+3133.787408622" Oct 08 22:40:33 crc kubenswrapper[4739]: I1008 22:40:33.984070 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-65fx4" podStartSLOduration=8.726930081 podStartE2EDuration="25.984052884s" podCreationTimestamp="2025-10-08 22:40:08 +0000 UTC" firstStartedPulling="2025-10-08 22:40:16.008402212 +0000 UTC m=+3115.833787962" lastFinishedPulling="2025-10-08 22:40:33.265525005 +0000 UTC m=+3133.090910765" observedRunningTime="2025-10-08 22:40:33.98103168 +0000 UTC m=+3133.806417430" watchObservedRunningTime="2025-10-08 22:40:33.984052884 +0000 UTC m=+3133.809438634" Oct 08 22:40:33 crc kubenswrapper[4739]: I1008 22:40:33.999378 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-5dccl" podStartSLOduration=8.865285034 podStartE2EDuration="25.999355521s" podCreationTimestamp="2025-10-08 22:40:08 +0000 UTC" firstStartedPulling="2025-10-08 22:40:16.108628316 +0000 UTC m=+3115.934014066" lastFinishedPulling="2025-10-08 22:40:33.242698783 +0000 UTC m=+3133.068084553" observedRunningTime="2025-10-08 22:40:33.999009522 +0000 UTC m=+3133.824395272" watchObservedRunningTime="2025-10-08 22:40:33.999355521 +0000 UTC m=+3133.824741271" Oct 08 22:40:34 crc kubenswrapper[4739]: I1008 22:40:34.048764 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-ingester-0" podStartSLOduration=8.773341933 podStartE2EDuration="26.048742195s" podCreationTimestamp="2025-10-08 22:40:08 
+0000 UTC" firstStartedPulling="2025-10-08 22:40:15.99370567 +0000 UTC m=+3115.819091420" lastFinishedPulling="2025-10-08 22:40:33.269105922 +0000 UTC m=+3133.094491682" observedRunningTime="2025-10-08 22:40:34.040025301 +0000 UTC m=+3133.865411051" watchObservedRunningTime="2025-10-08 22:40:34.048742195 +0000 UTC m=+3133.874127945" Oct 08 22:40:37 crc kubenswrapper[4739]: I1008 22:40:37.163348 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-db9l2"] Oct 08 22:40:37 crc kubenswrapper[4739]: E1008 22:40:37.167424 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02b36e40-d0aa-4580-b3af-ca16a64477d4" containerName="mariadb-account-create" Oct 08 22:40:37 crc kubenswrapper[4739]: I1008 22:40:37.167467 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="02b36e40-d0aa-4580-b3af-ca16a64477d4" containerName="mariadb-account-create" Oct 08 22:40:37 crc kubenswrapper[4739]: I1008 22:40:37.167844 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="02b36e40-d0aa-4580-b3af-ca16a64477d4" containerName="mariadb-account-create" Oct 08 22:40:37 crc kubenswrapper[4739]: I1008 22:40:37.168848 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-db9l2" Oct 08 22:40:37 crc kubenswrapper[4739]: I1008 22:40:37.171530 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Oct 08 22:40:37 crc kubenswrapper[4739]: I1008 22:40:37.171774 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-46j7t" Oct 08 22:40:37 crc kubenswrapper[4739]: I1008 22:40:37.173158 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Oct 08 22:40:37 crc kubenswrapper[4739]: I1008 22:40:37.173366 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Oct 08 22:40:37 crc kubenswrapper[4739]: I1008 22:40:37.177784 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-db9l2"] Oct 08 22:40:37 crc kubenswrapper[4739]: I1008 22:40:37.286508 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/303c9b70-1368-4372-824e-36bca64d2aff-combined-ca-bundle\") pod \"cloudkitty-db-sync-db9l2\" (UID: \"303c9b70-1368-4372-824e-36bca64d2aff\") " pod="openstack/cloudkitty-db-sync-db9l2" Oct 08 22:40:37 crc kubenswrapper[4739]: I1008 22:40:37.286552 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/303c9b70-1368-4372-824e-36bca64d2aff-config-data\") pod \"cloudkitty-db-sync-db9l2\" (UID: \"303c9b70-1368-4372-824e-36bca64d2aff\") " pod="openstack/cloudkitty-db-sync-db9l2" Oct 08 22:40:37 crc kubenswrapper[4739]: I1008 22:40:37.286631 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8q8v\" (UniqueName: \"kubernetes.io/projected/303c9b70-1368-4372-824e-36bca64d2aff-kube-api-access-n8q8v\") 
pod \"cloudkitty-db-sync-db9l2\" (UID: \"303c9b70-1368-4372-824e-36bca64d2aff\") " pod="openstack/cloudkitty-db-sync-db9l2" Oct 08 22:40:37 crc kubenswrapper[4739]: I1008 22:40:37.286671 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/303c9b70-1368-4372-824e-36bca64d2aff-scripts\") pod \"cloudkitty-db-sync-db9l2\" (UID: \"303c9b70-1368-4372-824e-36bca64d2aff\") " pod="openstack/cloudkitty-db-sync-db9l2" Oct 08 22:40:37 crc kubenswrapper[4739]: I1008 22:40:37.286703 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/303c9b70-1368-4372-824e-36bca64d2aff-certs\") pod \"cloudkitty-db-sync-db9l2\" (UID: \"303c9b70-1368-4372-824e-36bca64d2aff\") " pod="openstack/cloudkitty-db-sync-db9l2" Oct 08 22:40:37 crc kubenswrapper[4739]: I1008 22:40:37.388950 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/303c9b70-1368-4372-824e-36bca64d2aff-combined-ca-bundle\") pod \"cloudkitty-db-sync-db9l2\" (UID: \"303c9b70-1368-4372-824e-36bca64d2aff\") " pod="openstack/cloudkitty-db-sync-db9l2" Oct 08 22:40:37 crc kubenswrapper[4739]: I1008 22:40:37.389002 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/303c9b70-1368-4372-824e-36bca64d2aff-config-data\") pod \"cloudkitty-db-sync-db9l2\" (UID: \"303c9b70-1368-4372-824e-36bca64d2aff\") " pod="openstack/cloudkitty-db-sync-db9l2" Oct 08 22:40:37 crc kubenswrapper[4739]: I1008 22:40:37.389083 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8q8v\" (UniqueName: \"kubernetes.io/projected/303c9b70-1368-4372-824e-36bca64d2aff-kube-api-access-n8q8v\") pod \"cloudkitty-db-sync-db9l2\" (UID: \"303c9b70-1368-4372-824e-36bca64d2aff\") 
" pod="openstack/cloudkitty-db-sync-db9l2" Oct 08 22:40:37 crc kubenswrapper[4739]: I1008 22:40:37.389140 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/303c9b70-1368-4372-824e-36bca64d2aff-scripts\") pod \"cloudkitty-db-sync-db9l2\" (UID: \"303c9b70-1368-4372-824e-36bca64d2aff\") " pod="openstack/cloudkitty-db-sync-db9l2" Oct 08 22:40:37 crc kubenswrapper[4739]: I1008 22:40:37.389204 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/303c9b70-1368-4372-824e-36bca64d2aff-certs\") pod \"cloudkitty-db-sync-db9l2\" (UID: \"303c9b70-1368-4372-824e-36bca64d2aff\") " pod="openstack/cloudkitty-db-sync-db9l2" Oct 08 22:40:37 crc kubenswrapper[4739]: I1008 22:40:37.395228 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/303c9b70-1368-4372-824e-36bca64d2aff-combined-ca-bundle\") pod \"cloudkitty-db-sync-db9l2\" (UID: \"303c9b70-1368-4372-824e-36bca64d2aff\") " pod="openstack/cloudkitty-db-sync-db9l2" Oct 08 22:40:37 crc kubenswrapper[4739]: I1008 22:40:37.395701 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/303c9b70-1368-4372-824e-36bca64d2aff-certs\") pod \"cloudkitty-db-sync-db9l2\" (UID: \"303c9b70-1368-4372-824e-36bca64d2aff\") " pod="openstack/cloudkitty-db-sync-db9l2" Oct 08 22:40:37 crc kubenswrapper[4739]: I1008 22:40:37.396553 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/303c9b70-1368-4372-824e-36bca64d2aff-config-data\") pod \"cloudkitty-db-sync-db9l2\" (UID: \"303c9b70-1368-4372-824e-36bca64d2aff\") " pod="openstack/cloudkitty-db-sync-db9l2" Oct 08 22:40:37 crc kubenswrapper[4739]: I1008 22:40:37.398613 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/303c9b70-1368-4372-824e-36bca64d2aff-scripts\") pod \"cloudkitty-db-sync-db9l2\" (UID: \"303c9b70-1368-4372-824e-36bca64d2aff\") " pod="openstack/cloudkitty-db-sync-db9l2" Oct 08 22:40:37 crc kubenswrapper[4739]: I1008 22:40:37.408557 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8q8v\" (UniqueName: \"kubernetes.io/projected/303c9b70-1368-4372-824e-36bca64d2aff-kube-api-access-n8q8v\") pod \"cloudkitty-db-sync-db9l2\" (UID: \"303c9b70-1368-4372-824e-36bca64d2aff\") " pod="openstack/cloudkitty-db-sync-db9l2" Oct 08 22:40:37 crc kubenswrapper[4739]: I1008 22:40:37.515008 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-db9l2" Oct 08 22:40:37 crc kubenswrapper[4739]: I1008 22:40:37.977810 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0bc93384-c08a-4c7f-9dc4-318126297a8b","Type":"ContainerStarted","Data":"632213d92bad73c56614151948081d5770122d5a8051e3e12e6b82d7350af8c9"} Oct 08 22:40:38 crc kubenswrapper[4739]: W1008 22:40:38.050076 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod303c9b70_1368_4372_824e_36bca64d2aff.slice/crio-60692e8ddf57c6eb35b2726315c869c08fb682598ceefc2df94a83a989b8e163 WatchSource:0}: Error finding container 60692e8ddf57c6eb35b2726315c869c08fb682598ceefc2df94a83a989b8e163: Status 404 returned error can't find the container with id 60692e8ddf57c6eb35b2726315c869c08fb682598ceefc2df94a83a989b8e163 Oct 08 22:40:38 crc kubenswrapper[4739]: I1008 22:40:38.060847 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-db9l2"] Oct 08 22:40:38 crc kubenswrapper[4739]: I1008 22:40:38.989291 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-db9l2" 
event={"ID":"303c9b70-1368-4372-824e-36bca64d2aff","Type":"ContainerStarted","Data":"60692e8ddf57c6eb35b2726315c869c08fb682598ceefc2df94a83a989b8e163"} Oct 08 22:40:41 crc kubenswrapper[4739]: I1008 22:40:41.014974 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0bc93384-c08a-4c7f-9dc4-318126297a8b","Type":"ContainerStarted","Data":"c2953f374ce5d1a6f3cd0e726a340963d5ddf27ba551ab96a07c42cf868c95dc"} Oct 08 22:40:41 crc kubenswrapper[4739]: I1008 22:40:41.056921 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.826546025 podStartE2EDuration="50.056899679s" podCreationTimestamp="2025-10-08 22:39:51 +0000 UTC" firstStartedPulling="2025-10-08 22:39:53.900348618 +0000 UTC m=+3093.725734368" lastFinishedPulling="2025-10-08 22:40:40.130702272 +0000 UTC m=+3139.956088022" observedRunningTime="2025-10-08 22:40:41.043362836 +0000 UTC m=+3140.868748586" watchObservedRunningTime="2025-10-08 22:40:41.056899679 +0000 UTC m=+3140.882285429" Oct 08 22:40:43 crc kubenswrapper[4739]: I1008 22:40:43.176253 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 08 22:40:45 crc kubenswrapper[4739]: I1008 22:40:45.821257 4739 scope.go:117] "RemoveContainer" containerID="ddb1d63bba56e73b176d2b1e90888b358b74bba2bf444f71cf0db8fe5e95ed26" Oct 08 22:40:45 crc kubenswrapper[4739]: E1008 22:40:45.822683 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:40:46 crc kubenswrapper[4739]: I1008 
22:40:46.079054 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" event={"ID":"4b1ae118-cd96-4e60-997d-9594acff7531","Type":"ContainerStarted","Data":"a2395a55319e784896a67fb14162eda5115358dd1f400067cf870c892e69d08e"} Oct 08 22:40:46 crc kubenswrapper[4739]: I1008 22:40:46.080017 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" Oct 08 22:40:46 crc kubenswrapper[4739]: I1008 22:40:46.082118 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" podUID="4b1ae118-cd96-4e60-997d-9594acff7531" containerName="gateway" probeResult="failure" output="Get \"https://10.217.1.3:8081/ready\": dial tcp 10.217.1.3:8081: connect: connection refused" Oct 08 22:40:46 crc kubenswrapper[4739]: I1008 22:40:46.103045 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" podStartSLOduration=9.118208203 podStartE2EDuration="38.103021779s" podCreationTimestamp="2025-10-08 22:40:08 +0000 UTC" firstStartedPulling="2025-10-08 22:40:15.994025398 +0000 UTC m=+3115.819411148" lastFinishedPulling="2025-10-08 22:40:44.978838974 +0000 UTC m=+3144.804224724" observedRunningTime="2025-10-08 22:40:46.097045181 +0000 UTC m=+3145.922430941" watchObservedRunningTime="2025-10-08 22:40:46.103021779 +0000 UTC m=+3145.928407529" Oct 08 22:40:47 crc kubenswrapper[4739]: I1008 22:40:47.122110 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-czjfz" Oct 08 22:40:48 crc kubenswrapper[4739]: I1008 22:40:48.904840 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-5dccl" Oct 08 22:40:49 crc kubenswrapper[4739]: I1008 22:40:49.036003 4739 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-65fx4" Oct 08 22:40:50 crc kubenswrapper[4739]: I1008 22:40:50.177288 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-compactor-0" Oct 08 22:40:50 crc kubenswrapper[4739]: I1008 22:40:50.319124 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="9d41d47c-0875-4283-908d-559995e5069e" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 08 22:40:50 crc kubenswrapper[4739]: I1008 22:40:50.705011 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 08 22:40:53 crc kubenswrapper[4739]: I1008 22:40:53.175999 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 08 22:40:53 crc kubenswrapper[4739]: I1008 22:40:53.189051 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 08 22:40:54 crc kubenswrapper[4739]: I1008 22:40:54.167003 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 08 22:40:57 crc kubenswrapper[4739]: I1008 22:40:57.821361 4739 scope.go:117] "RemoveContainer" containerID="ddb1d63bba56e73b176d2b1e90888b358b74bba2bf444f71cf0db8fe5e95ed26" Oct 08 22:40:57 crc kubenswrapper[4739]: E1008 22:40:57.822138 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" 
podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:40:59 crc kubenswrapper[4739]: E1008 22:40:59.586772 4739 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/jwysogla/cloudkitty-api@sha256:5541d1160f777174a00982fde3c26a9b32ba156f9f140c9628f66d0eef834c86" Oct 08 22:40:59 crc kubenswrapper[4739]: E1008 22:40:59.587627 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.io/jwysogla/cloudkitty-api@sha256:5541d1160f777174a00982fde3c26a9b32ba156f9f140c9628f66d0eef834c86,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,S
ubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n8q8v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-db9l2_openstack(303c9b70-1368-4372-824e-36bca64d2aff): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 22:40:59 crc kubenswrapper[4739]: E1008 22:40:59.588810 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cloudkitty-db-sync-db9l2" podUID="303c9b70-1368-4372-824e-36bca64d2aff" Oct 08 22:41:00 crc kubenswrapper[4739]: I1008 22:41:00.239808 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-v9qwb" event={"ID":"137f65de-3030-4c4b-a087-c547dd183105","Type":"ContainerStarted","Data":"a1649d17ed48d6eb01f61dfe7d4b95d37f9255b59e005d70809b9322b2993654"} Oct 08 22:41:00 crc kubenswrapper[4739]: I1008 22:41:00.240477 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-v9qwb" Oct 08 22:41:00 crc 
kubenswrapper[4739]: I1008 22:41:00.247125 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb8e455e-24e5-465b-ae83-87d51e01eb6a","Type":"ContainerStarted","Data":"04bd79fca58be49000d7ee01f55c190ecabbf83b7d67d7ba96c98ce1128e316e"} Oct 08 22:41:00 crc kubenswrapper[4739]: I1008 22:41:00.247424 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 22:41:00 crc kubenswrapper[4739]: I1008 22:41:00.249627 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" event={"ID":"0663f463-0160-4cc2-bad3-389baee708da","Type":"ContainerStarted","Data":"bd4116e26f0070aaa8c13f0a33bd857289c7a8f43ea3b8e17f74771db3d3b3d5"} Oct 08 22:41:00 crc kubenswrapper[4739]: I1008 22:41:00.250331 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" Oct 08 22:41:00 crc kubenswrapper[4739]: E1008 22:41:00.251447 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/jwysogla/cloudkitty-api@sha256:5541d1160f777174a00982fde3c26a9b32ba156f9f140c9628f66d0eef834c86\\\"\"" pod="openstack/cloudkitty-db-sync-db9l2" podUID="303c9b70-1368-4372-824e-36bca64d2aff" Oct 08 22:41:00 crc kubenswrapper[4739]: I1008 22:41:00.269606 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" Oct 08 22:41:00 crc kubenswrapper[4739]: I1008 22:41:00.271447 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-v9qwb" podStartSLOduration=-9223371984.583347 podStartE2EDuration="52.271428727s" podCreationTimestamp="2025-10-08 22:40:08 +0000 UTC" firstStartedPulling="2025-10-08 22:40:15.533609676 +0000 UTC 
m=+3115.358995426" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:41:00.269314844 +0000 UTC m=+3160.094700604" watchObservedRunningTime="2025-10-08 22:41:00.271428727 +0000 UTC m=+3160.096814487" Oct 08 22:41:00 crc kubenswrapper[4739]: I1008 22:41:00.309514 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-nf2wl" podStartSLOduration=-9223371984.54528 podStartE2EDuration="52.309494923s" podCreationTimestamp="2025-10-08 22:40:08 +0000 UTC" firstStartedPulling="2025-10-08 22:40:15.986820811 +0000 UTC m=+3115.812206571" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:41:00.305356461 +0000 UTC m=+3160.130742221" watchObservedRunningTime="2025-10-08 22:41:00.309494923 +0000 UTC m=+3160.134880673" Oct 08 22:41:00 crc kubenswrapper[4739]: I1008 22:41:00.316871 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="9d41d47c-0875-4283-908d-559995e5069e" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 08 22:41:00 crc kubenswrapper[4739]: I1008 22:41:00.334537 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.354909306 podStartE2EDuration="1m2.334518278s" podCreationTimestamp="2025-10-08 22:39:58 +0000 UTC" firstStartedPulling="2025-10-08 22:40:01.682677865 +0000 UTC m=+3101.508063615" lastFinishedPulling="2025-10-08 22:40:59.662286837 +0000 UTC m=+3159.487672587" observedRunningTime="2025-10-08 22:41:00.3305416 +0000 UTC m=+3160.155927360" watchObservedRunningTime="2025-10-08 22:41:00.334518278 +0000 UTC m=+3160.159904048" Oct 08 22:41:10 crc kubenswrapper[4739]: I1008 22:41:10.316373 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="9d41d47c-0875-4283-908d-559995e5069e" 
containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 08 22:41:11 crc kubenswrapper[4739]: I1008 22:41:11.828956 4739 scope.go:117] "RemoveContainer" containerID="ddb1d63bba56e73b176d2b1e90888b358b74bba2bf444f71cf0db8fe5e95ed26" Oct 08 22:41:11 crc kubenswrapper[4739]: E1008 22:41:11.829655 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:41:16 crc kubenswrapper[4739]: I1008 22:41:16.408006 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-db9l2" event={"ID":"303c9b70-1368-4372-824e-36bca64d2aff","Type":"ContainerStarted","Data":"109a7b447b374b852953165b59b15ec371cf4b79af399612648bf1e1cbca2192"} Oct 08 22:41:16 crc kubenswrapper[4739]: I1008 22:41:16.433763 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-db9l2" podStartSLOduration=2.191452753 podStartE2EDuration="39.433742426s" podCreationTimestamp="2025-10-08 22:40:37 +0000 UTC" firstStartedPulling="2025-10-08 22:40:38.052652919 +0000 UTC m=+3137.878038669" lastFinishedPulling="2025-10-08 22:41:15.294942602 +0000 UTC m=+3175.120328342" observedRunningTime="2025-10-08 22:41:16.425128975 +0000 UTC m=+3176.250514755" watchObservedRunningTime="2025-10-08 22:41:16.433742426 +0000 UTC m=+3176.259128176" Oct 08 22:41:19 crc kubenswrapper[4739]: I1008 22:41:19.270835 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-v9qwb" Oct 08 22:41:19 crc kubenswrapper[4739]: I1008 22:41:19.446833 4739 generic.go:334] 
"Generic (PLEG): container finished" podID="303c9b70-1368-4372-824e-36bca64d2aff" containerID="109a7b447b374b852953165b59b15ec371cf4b79af399612648bf1e1cbca2192" exitCode=0 Oct 08 22:41:19 crc kubenswrapper[4739]: I1008 22:41:19.446873 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-db9l2" event={"ID":"303c9b70-1368-4372-824e-36bca64d2aff","Type":"ContainerDied","Data":"109a7b447b374b852953165b59b15ec371cf4b79af399612648bf1e1cbca2192"} Oct 08 22:41:20 crc kubenswrapper[4739]: I1008 22:41:20.316936 4739 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="9d41d47c-0875-4283-908d-559995e5069e" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 08 22:41:20 crc kubenswrapper[4739]: I1008 22:41:20.905172 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-db9l2" Oct 08 22:41:20 crc kubenswrapper[4739]: I1008 22:41:20.960090 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/303c9b70-1368-4372-824e-36bca64d2aff-certs\") pod \"303c9b70-1368-4372-824e-36bca64d2aff\" (UID: \"303c9b70-1368-4372-824e-36bca64d2aff\") " Oct 08 22:41:20 crc kubenswrapper[4739]: I1008 22:41:20.960161 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8q8v\" (UniqueName: \"kubernetes.io/projected/303c9b70-1368-4372-824e-36bca64d2aff-kube-api-access-n8q8v\") pod \"303c9b70-1368-4372-824e-36bca64d2aff\" (UID: \"303c9b70-1368-4372-824e-36bca64d2aff\") " Oct 08 22:41:20 crc kubenswrapper[4739]: I1008 22:41:20.960250 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/303c9b70-1368-4372-824e-36bca64d2aff-combined-ca-bundle\") pod \"303c9b70-1368-4372-824e-36bca64d2aff\" (UID: 
\"303c9b70-1368-4372-824e-36bca64d2aff\") " Oct 08 22:41:20 crc kubenswrapper[4739]: I1008 22:41:20.960328 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/303c9b70-1368-4372-824e-36bca64d2aff-scripts\") pod \"303c9b70-1368-4372-824e-36bca64d2aff\" (UID: \"303c9b70-1368-4372-824e-36bca64d2aff\") " Oct 08 22:41:20 crc kubenswrapper[4739]: I1008 22:41:20.960470 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/303c9b70-1368-4372-824e-36bca64d2aff-config-data\") pod \"303c9b70-1368-4372-824e-36bca64d2aff\" (UID: \"303c9b70-1368-4372-824e-36bca64d2aff\") " Oct 08 22:41:20 crc kubenswrapper[4739]: I1008 22:41:20.967532 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/303c9b70-1368-4372-824e-36bca64d2aff-certs" (OuterVolumeSpecName: "certs") pod "303c9b70-1368-4372-824e-36bca64d2aff" (UID: "303c9b70-1368-4372-824e-36bca64d2aff"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:41:20 crc kubenswrapper[4739]: I1008 22:41:20.967644 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/303c9b70-1368-4372-824e-36bca64d2aff-kube-api-access-n8q8v" (OuterVolumeSpecName: "kube-api-access-n8q8v") pod "303c9b70-1368-4372-824e-36bca64d2aff" (UID: "303c9b70-1368-4372-824e-36bca64d2aff"). InnerVolumeSpecName "kube-api-access-n8q8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:41:20 crc kubenswrapper[4739]: I1008 22:41:20.968569 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/303c9b70-1368-4372-824e-36bca64d2aff-scripts" (OuterVolumeSpecName: "scripts") pod "303c9b70-1368-4372-824e-36bca64d2aff" (UID: "303c9b70-1368-4372-824e-36bca64d2aff"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.004412 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/303c9b70-1368-4372-824e-36bca64d2aff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "303c9b70-1368-4372-824e-36bca64d2aff" (UID: "303c9b70-1368-4372-824e-36bca64d2aff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.004523 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/303c9b70-1368-4372-824e-36bca64d2aff-config-data" (OuterVolumeSpecName: "config-data") pod "303c9b70-1368-4372-824e-36bca64d2aff" (UID: "303c9b70-1368-4372-824e-36bca64d2aff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.062581 4739 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/303c9b70-1368-4372-824e-36bca64d2aff-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.062628 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8q8v\" (UniqueName: \"kubernetes.io/projected/303c9b70-1368-4372-824e-36bca64d2aff-kube-api-access-n8q8v\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.062651 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/303c9b70-1368-4372-824e-36bca64d2aff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.062668 4739 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/303c9b70-1368-4372-824e-36bca64d2aff-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:21 
crc kubenswrapper[4739]: I1008 22:41:21.062683 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/303c9b70-1368-4372-824e-36bca64d2aff-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.473261 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-db9l2" event={"ID":"303c9b70-1368-4372-824e-36bca64d2aff","Type":"ContainerDied","Data":"60692e8ddf57c6eb35b2726315c869c08fb682598ceefc2df94a83a989b8e163"} Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.473304 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60692e8ddf57c6eb35b2726315c869c08fb682598ceefc2df94a83a989b8e163" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.473371 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-db9l2" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.598294 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-p7krf"] Oct 08 22:41:21 crc kubenswrapper[4739]: E1008 22:41:21.599573 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="303c9b70-1368-4372-824e-36bca64d2aff" containerName="cloudkitty-db-sync" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.599609 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="303c9b70-1368-4372-824e-36bca64d2aff" containerName="cloudkitty-db-sync" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.600095 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="303c9b70-1368-4372-824e-36bca64d2aff" containerName="cloudkitty-db-sync" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.601516 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-p7krf" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.603954 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-46j7t" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.604190 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.604690 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.606076 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.637016 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-p7krf"] Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.682651 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e8c02797-6de9-48e2-9193-cd0ad90b93fa-certs\") pod \"cloudkitty-storageinit-p7krf\" (UID: \"e8c02797-6de9-48e2-9193-cd0ad90b93fa\") " pod="openstack/cloudkitty-storageinit-p7krf" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.682710 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8c02797-6de9-48e2-9193-cd0ad90b93fa-combined-ca-bundle\") pod \"cloudkitty-storageinit-p7krf\" (UID: \"e8c02797-6de9-48e2-9193-cd0ad90b93fa\") " pod="openstack/cloudkitty-storageinit-p7krf" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.683196 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8c02797-6de9-48e2-9193-cd0ad90b93fa-scripts\") pod 
\"cloudkitty-storageinit-p7krf\" (UID: \"e8c02797-6de9-48e2-9193-cd0ad90b93fa\") " pod="openstack/cloudkitty-storageinit-p7krf" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.683362 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8c02797-6de9-48e2-9193-cd0ad90b93fa-config-data\") pod \"cloudkitty-storageinit-p7krf\" (UID: \"e8c02797-6de9-48e2-9193-cd0ad90b93fa\") " pod="openstack/cloudkitty-storageinit-p7krf" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.683919 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpl5f\" (UniqueName: \"kubernetes.io/projected/e8c02797-6de9-48e2-9193-cd0ad90b93fa-kube-api-access-gpl5f\") pod \"cloudkitty-storageinit-p7krf\" (UID: \"e8c02797-6de9-48e2-9193-cd0ad90b93fa\") " pod="openstack/cloudkitty-storageinit-p7krf" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.785976 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpl5f\" (UniqueName: \"kubernetes.io/projected/e8c02797-6de9-48e2-9193-cd0ad90b93fa-kube-api-access-gpl5f\") pod \"cloudkitty-storageinit-p7krf\" (UID: \"e8c02797-6de9-48e2-9193-cd0ad90b93fa\") " pod="openstack/cloudkitty-storageinit-p7krf" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.786071 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e8c02797-6de9-48e2-9193-cd0ad90b93fa-certs\") pod \"cloudkitty-storageinit-p7krf\" (UID: \"e8c02797-6de9-48e2-9193-cd0ad90b93fa\") " pod="openstack/cloudkitty-storageinit-p7krf" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.786103 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8c02797-6de9-48e2-9193-cd0ad90b93fa-combined-ca-bundle\") pod 
\"cloudkitty-storageinit-p7krf\" (UID: \"e8c02797-6de9-48e2-9193-cd0ad90b93fa\") " pod="openstack/cloudkitty-storageinit-p7krf" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.786272 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8c02797-6de9-48e2-9193-cd0ad90b93fa-scripts\") pod \"cloudkitty-storageinit-p7krf\" (UID: \"e8c02797-6de9-48e2-9193-cd0ad90b93fa\") " pod="openstack/cloudkitty-storageinit-p7krf" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.786304 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8c02797-6de9-48e2-9193-cd0ad90b93fa-config-data\") pod \"cloudkitty-storageinit-p7krf\" (UID: \"e8c02797-6de9-48e2-9193-cd0ad90b93fa\") " pod="openstack/cloudkitty-storageinit-p7krf" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.792156 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8c02797-6de9-48e2-9193-cd0ad90b93fa-scripts\") pod \"cloudkitty-storageinit-p7krf\" (UID: \"e8c02797-6de9-48e2-9193-cd0ad90b93fa\") " pod="openstack/cloudkitty-storageinit-p7krf" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.808717 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e8c02797-6de9-48e2-9193-cd0ad90b93fa-certs\") pod \"cloudkitty-storageinit-p7krf\" (UID: \"e8c02797-6de9-48e2-9193-cd0ad90b93fa\") " pod="openstack/cloudkitty-storageinit-p7krf" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.809711 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8c02797-6de9-48e2-9193-cd0ad90b93fa-config-data\") pod \"cloudkitty-storageinit-p7krf\" (UID: \"e8c02797-6de9-48e2-9193-cd0ad90b93fa\") " pod="openstack/cloudkitty-storageinit-p7krf" Oct 08 22:41:21 crc 
kubenswrapper[4739]: I1008 22:41:21.811073 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8c02797-6de9-48e2-9193-cd0ad90b93fa-combined-ca-bundle\") pod \"cloudkitty-storageinit-p7krf\" (UID: \"e8c02797-6de9-48e2-9193-cd0ad90b93fa\") " pod="openstack/cloudkitty-storageinit-p7krf" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.840234 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpl5f\" (UniqueName: \"kubernetes.io/projected/e8c02797-6de9-48e2-9193-cd0ad90b93fa-kube-api-access-gpl5f\") pod \"cloudkitty-storageinit-p7krf\" (UID: \"e8c02797-6de9-48e2-9193-cd0ad90b93fa\") " pod="openstack/cloudkitty-storageinit-p7krf" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.939854 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-46j7t" Oct 08 22:41:21 crc kubenswrapper[4739]: I1008 22:41:21.947258 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-p7krf" Oct 08 22:41:22 crc kubenswrapper[4739]: I1008 22:41:22.390753 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-p7krf"] Oct 08 22:41:22 crc kubenswrapper[4739]: I1008 22:41:22.486860 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-p7krf" event={"ID":"e8c02797-6de9-48e2-9193-cd0ad90b93fa","Type":"ContainerStarted","Data":"1077a6dbdd66e8b287f381f1d53639cd7d6b50d230af5876fccabba429f8370f"} Oct 08 22:41:23 crc kubenswrapper[4739]: I1008 22:41:23.500502 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-p7krf" event={"ID":"e8c02797-6de9-48e2-9193-cd0ad90b93fa","Type":"ContainerStarted","Data":"c0bb71bd47c86038304f8dfe193ffcb8d7d57e7dca9b077a602ed13486b21d4f"} Oct 08 22:41:23 crc kubenswrapper[4739]: I1008 22:41:23.528348 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-p7krf" podStartSLOduration=2.528328872 podStartE2EDuration="2.528328872s" podCreationTimestamp="2025-10-08 22:41:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:41:23.525659786 +0000 UTC m=+3183.351045546" watchObservedRunningTime="2025-10-08 22:41:23.528328872 +0000 UTC m=+3183.353714622" Oct 08 22:41:24 crc kubenswrapper[4739]: I1008 22:41:24.517174 4739 generic.go:334] "Generic (PLEG): container finished" podID="e8c02797-6de9-48e2-9193-cd0ad90b93fa" containerID="c0bb71bd47c86038304f8dfe193ffcb8d7d57e7dca9b077a602ed13486b21d4f" exitCode=0 Oct 08 22:41:24 crc kubenswrapper[4739]: I1008 22:41:24.517352 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-p7krf" 
event={"ID":"e8c02797-6de9-48e2-9193-cd0ad90b93fa","Type":"ContainerDied","Data":"c0bb71bd47c86038304f8dfe193ffcb8d7d57e7dca9b077a602ed13486b21d4f"} Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.048560 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-p7krf" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.122272 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8c02797-6de9-48e2-9193-cd0ad90b93fa-config-data\") pod \"e8c02797-6de9-48e2-9193-cd0ad90b93fa\" (UID: \"e8c02797-6de9-48e2-9193-cd0ad90b93fa\") " Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.122335 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8c02797-6de9-48e2-9193-cd0ad90b93fa-scripts\") pod \"e8c02797-6de9-48e2-9193-cd0ad90b93fa\" (UID: \"e8c02797-6de9-48e2-9193-cd0ad90b93fa\") " Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.122367 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8c02797-6de9-48e2-9193-cd0ad90b93fa-combined-ca-bundle\") pod \"e8c02797-6de9-48e2-9193-cd0ad90b93fa\" (UID: \"e8c02797-6de9-48e2-9193-cd0ad90b93fa\") " Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.122480 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpl5f\" (UniqueName: \"kubernetes.io/projected/e8c02797-6de9-48e2-9193-cd0ad90b93fa-kube-api-access-gpl5f\") pod \"e8c02797-6de9-48e2-9193-cd0ad90b93fa\" (UID: \"e8c02797-6de9-48e2-9193-cd0ad90b93fa\") " Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.122517 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e8c02797-6de9-48e2-9193-cd0ad90b93fa-certs\") pod 
\"e8c02797-6de9-48e2-9193-cd0ad90b93fa\" (UID: \"e8c02797-6de9-48e2-9193-cd0ad90b93fa\") " Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.128790 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8c02797-6de9-48e2-9193-cd0ad90b93fa-certs" (OuterVolumeSpecName: "certs") pod "e8c02797-6de9-48e2-9193-cd0ad90b93fa" (UID: "e8c02797-6de9-48e2-9193-cd0ad90b93fa"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.129486 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8c02797-6de9-48e2-9193-cd0ad90b93fa-kube-api-access-gpl5f" (OuterVolumeSpecName: "kube-api-access-gpl5f") pod "e8c02797-6de9-48e2-9193-cd0ad90b93fa" (UID: "e8c02797-6de9-48e2-9193-cd0ad90b93fa"). InnerVolumeSpecName "kube-api-access-gpl5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.129503 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8c02797-6de9-48e2-9193-cd0ad90b93fa-scripts" (OuterVolumeSpecName: "scripts") pod "e8c02797-6de9-48e2-9193-cd0ad90b93fa" (UID: "e8c02797-6de9-48e2-9193-cd0ad90b93fa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.153738 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8c02797-6de9-48e2-9193-cd0ad90b93fa-config-data" (OuterVolumeSpecName: "config-data") pod "e8c02797-6de9-48e2-9193-cd0ad90b93fa" (UID: "e8c02797-6de9-48e2-9193-cd0ad90b93fa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.153885 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8c02797-6de9-48e2-9193-cd0ad90b93fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8c02797-6de9-48e2-9193-cd0ad90b93fa" (UID: "e8c02797-6de9-48e2-9193-cd0ad90b93fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.225379 4739 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e8c02797-6de9-48e2-9193-cd0ad90b93fa-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.225432 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8c02797-6de9-48e2-9193-cd0ad90b93fa-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.225450 4739 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8c02797-6de9-48e2-9193-cd0ad90b93fa-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.225470 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8c02797-6de9-48e2-9193-cd0ad90b93fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.225491 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpl5f\" (UniqueName: \"kubernetes.io/projected/e8c02797-6de9-48e2-9193-cd0ad90b93fa-kube-api-access-gpl5f\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.546726 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-p7krf" 
event={"ID":"e8c02797-6de9-48e2-9193-cd0ad90b93fa","Type":"ContainerDied","Data":"1077a6dbdd66e8b287f381f1d53639cd7d6b50d230af5876fccabba429f8370f"} Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.546786 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1077a6dbdd66e8b287f381f1d53639cd7d6b50d230af5876fccabba429f8370f" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.546804 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-p7krf" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.726501 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Oct 08 22:41:26 crc kubenswrapper[4739]: E1008 22:41:26.727125 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8c02797-6de9-48e2-9193-cd0ad90b93fa" containerName="cloudkitty-storageinit" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.727175 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8c02797-6de9-48e2-9193-cd0ad90b93fa" containerName="cloudkitty-storageinit" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.727438 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8c02797-6de9-48e2-9193-cd0ad90b93fa" containerName="cloudkitty-storageinit" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.728466 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.733777 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.734496 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.734701 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.735403 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-46j7t" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.735478 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.753273 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.822233 4739 scope.go:117] "RemoveContainer" containerID="ddb1d63bba56e73b176d2b1e90888b358b74bba2bf444f71cf0db8fe5e95ed26" Oct 08 22:41:26 crc kubenswrapper[4739]: E1008 22:41:26.822443 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.837731 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/projected/5de0557c-aa06-41d3-8d90-76d22496c164-certs\") pod \"cloudkitty-proc-0\" (UID: \"5de0557c-aa06-41d3-8d90-76d22496c164\") " pod="openstack/cloudkitty-proc-0" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.837825 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5de0557c-aa06-41d3-8d90-76d22496c164-scripts\") pod \"cloudkitty-proc-0\" (UID: \"5de0557c-aa06-41d3-8d90-76d22496c164\") " pod="openstack/cloudkitty-proc-0" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.837890 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5de0557c-aa06-41d3-8d90-76d22496c164-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"5de0557c-aa06-41d3-8d90-76d22496c164\") " pod="openstack/cloudkitty-proc-0" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.837914 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5de0557c-aa06-41d3-8d90-76d22496c164-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"5de0557c-aa06-41d3-8d90-76d22496c164\") " pod="openstack/cloudkitty-proc-0" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.837943 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5de0557c-aa06-41d3-8d90-76d22496c164-config-data\") pod \"cloudkitty-proc-0\" (UID: \"5de0557c-aa06-41d3-8d90-76d22496c164\") " pod="openstack/cloudkitty-proc-0" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.837977 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cttjq\" (UniqueName: \"kubernetes.io/projected/5de0557c-aa06-41d3-8d90-76d22496c164-kube-api-access-cttjq\") 
pod \"cloudkitty-proc-0\" (UID: \"5de0557c-aa06-41d3-8d90-76d22496c164\") " pod="openstack/cloudkitty-proc-0" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.899103 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.900849 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.905618 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.910504 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.939249 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43526663-2258-4b39-909c-1c52b4e217de-scripts\") pod \"cloudkitty-api-0\" (UID: \"43526663-2258-4b39-909c-1c52b4e217de\") " pod="openstack/cloudkitty-api-0" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.939305 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5de0557c-aa06-41d3-8d90-76d22496c164-scripts\") pod \"cloudkitty-proc-0\" (UID: \"5de0557c-aa06-41d3-8d90-76d22496c164\") " pod="openstack/cloudkitty-proc-0" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.939326 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx5x9\" (UniqueName: \"kubernetes.io/projected/43526663-2258-4b39-909c-1c52b4e217de-kube-api-access-lx5x9\") pod \"cloudkitty-api-0\" (UID: \"43526663-2258-4b39-909c-1c52b4e217de\") " pod="openstack/cloudkitty-api-0" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.939354 4739 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43526663-2258-4b39-909c-1c52b4e217de-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"43526663-2258-4b39-909c-1c52b4e217de\") " pod="openstack/cloudkitty-api-0" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.939391 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5de0557c-aa06-41d3-8d90-76d22496c164-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"5de0557c-aa06-41d3-8d90-76d22496c164\") " pod="openstack/cloudkitty-proc-0" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.939414 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5de0557c-aa06-41d3-8d90-76d22496c164-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"5de0557c-aa06-41d3-8d90-76d22496c164\") " pod="openstack/cloudkitty-proc-0" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.939429 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5de0557c-aa06-41d3-8d90-76d22496c164-config-data\") pod \"cloudkitty-proc-0\" (UID: \"5de0557c-aa06-41d3-8d90-76d22496c164\") " pod="openstack/cloudkitty-proc-0" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.939444 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/43526663-2258-4b39-909c-1c52b4e217de-certs\") pod \"cloudkitty-api-0\" (UID: \"43526663-2258-4b39-909c-1c52b4e217de\") " pod="openstack/cloudkitty-api-0" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.939487 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cttjq\" (UniqueName: 
\"kubernetes.io/projected/5de0557c-aa06-41d3-8d90-76d22496c164-kube-api-access-cttjq\") pod \"cloudkitty-proc-0\" (UID: \"5de0557c-aa06-41d3-8d90-76d22496c164\") " pod="openstack/cloudkitty-proc-0" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.939535 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43526663-2258-4b39-909c-1c52b4e217de-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"43526663-2258-4b39-909c-1c52b4e217de\") " pod="openstack/cloudkitty-api-0" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.939573 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43526663-2258-4b39-909c-1c52b4e217de-config-data\") pod \"cloudkitty-api-0\" (UID: \"43526663-2258-4b39-909c-1c52b4e217de\") " pod="openstack/cloudkitty-api-0" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.939608 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43526663-2258-4b39-909c-1c52b4e217de-logs\") pod \"cloudkitty-api-0\" (UID: \"43526663-2258-4b39-909c-1c52b4e217de\") " pod="openstack/cloudkitty-api-0" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.939636 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/5de0557c-aa06-41d3-8d90-76d22496c164-certs\") pod \"cloudkitty-proc-0\" (UID: \"5de0557c-aa06-41d3-8d90-76d22496c164\") " pod="openstack/cloudkitty-proc-0" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.943050 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5de0557c-aa06-41d3-8d90-76d22496c164-scripts\") pod \"cloudkitty-proc-0\" (UID: \"5de0557c-aa06-41d3-8d90-76d22496c164\") " 
pod="openstack/cloudkitty-proc-0" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.944134 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5de0557c-aa06-41d3-8d90-76d22496c164-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"5de0557c-aa06-41d3-8d90-76d22496c164\") " pod="openstack/cloudkitty-proc-0" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.944567 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/5de0557c-aa06-41d3-8d90-76d22496c164-certs\") pod \"cloudkitty-proc-0\" (UID: \"5de0557c-aa06-41d3-8d90-76d22496c164\") " pod="openstack/cloudkitty-proc-0" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.945736 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5de0557c-aa06-41d3-8d90-76d22496c164-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"5de0557c-aa06-41d3-8d90-76d22496c164\") " pod="openstack/cloudkitty-proc-0" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.953093 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5de0557c-aa06-41d3-8d90-76d22496c164-config-data\") pod \"cloudkitty-proc-0\" (UID: \"5de0557c-aa06-41d3-8d90-76d22496c164\") " pod="openstack/cloudkitty-proc-0" Oct 08 22:41:26 crc kubenswrapper[4739]: I1008 22:41:26.958831 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cttjq\" (UniqueName: \"kubernetes.io/projected/5de0557c-aa06-41d3-8d90-76d22496c164-kube-api-access-cttjq\") pod \"cloudkitty-proc-0\" (UID: \"5de0557c-aa06-41d3-8d90-76d22496c164\") " pod="openstack/cloudkitty-proc-0" Oct 08 22:41:27 crc kubenswrapper[4739]: I1008 22:41:27.041997 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/43526663-2258-4b39-909c-1c52b4e217de-scripts\") pod \"cloudkitty-api-0\" (UID: \"43526663-2258-4b39-909c-1c52b4e217de\") " pod="openstack/cloudkitty-api-0" Oct 08 22:41:27 crc kubenswrapper[4739]: I1008 22:41:27.042061 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx5x9\" (UniqueName: \"kubernetes.io/projected/43526663-2258-4b39-909c-1c52b4e217de-kube-api-access-lx5x9\") pod \"cloudkitty-api-0\" (UID: \"43526663-2258-4b39-909c-1c52b4e217de\") " pod="openstack/cloudkitty-api-0" Oct 08 22:41:27 crc kubenswrapper[4739]: I1008 22:41:27.042110 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43526663-2258-4b39-909c-1c52b4e217de-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"43526663-2258-4b39-909c-1c52b4e217de\") " pod="openstack/cloudkitty-api-0" Oct 08 22:41:27 crc kubenswrapper[4739]: I1008 22:41:27.042192 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/43526663-2258-4b39-909c-1c52b4e217de-certs\") pod \"cloudkitty-api-0\" (UID: \"43526663-2258-4b39-909c-1c52b4e217de\") " pod="openstack/cloudkitty-api-0" Oct 08 22:41:27 crc kubenswrapper[4739]: I1008 22:41:27.042276 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43526663-2258-4b39-909c-1c52b4e217de-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"43526663-2258-4b39-909c-1c52b4e217de\") " pod="openstack/cloudkitty-api-0" Oct 08 22:41:27 crc kubenswrapper[4739]: I1008 22:41:27.042310 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43526663-2258-4b39-909c-1c52b4e217de-config-data\") pod \"cloudkitty-api-0\" (UID: \"43526663-2258-4b39-909c-1c52b4e217de\") " 
pod="openstack/cloudkitty-api-0" Oct 08 22:41:27 crc kubenswrapper[4739]: I1008 22:41:27.042360 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43526663-2258-4b39-909c-1c52b4e217de-logs\") pod \"cloudkitty-api-0\" (UID: \"43526663-2258-4b39-909c-1c52b4e217de\") " pod="openstack/cloudkitty-api-0" Oct 08 22:41:27 crc kubenswrapper[4739]: I1008 22:41:27.043083 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43526663-2258-4b39-909c-1c52b4e217de-logs\") pod \"cloudkitty-api-0\" (UID: \"43526663-2258-4b39-909c-1c52b4e217de\") " pod="openstack/cloudkitty-api-0" Oct 08 22:41:27 crc kubenswrapper[4739]: I1008 22:41:27.046018 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43526663-2258-4b39-909c-1c52b4e217de-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"43526663-2258-4b39-909c-1c52b4e217de\") " pod="openstack/cloudkitty-api-0" Oct 08 22:41:27 crc kubenswrapper[4739]: I1008 22:41:27.046138 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/43526663-2258-4b39-909c-1c52b4e217de-certs\") pod \"cloudkitty-api-0\" (UID: \"43526663-2258-4b39-909c-1c52b4e217de\") " pod="openstack/cloudkitty-api-0" Oct 08 22:41:27 crc kubenswrapper[4739]: I1008 22:41:27.046674 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43526663-2258-4b39-909c-1c52b4e217de-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"43526663-2258-4b39-909c-1c52b4e217de\") " pod="openstack/cloudkitty-api-0" Oct 08 22:41:27 crc kubenswrapper[4739]: I1008 22:41:27.053610 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/43526663-2258-4b39-909c-1c52b4e217de-scripts\") pod \"cloudkitty-api-0\" (UID: \"43526663-2258-4b39-909c-1c52b4e217de\") " pod="openstack/cloudkitty-api-0" Oct 08 22:41:27 crc kubenswrapper[4739]: I1008 22:41:27.056183 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43526663-2258-4b39-909c-1c52b4e217de-config-data\") pod \"cloudkitty-api-0\" (UID: \"43526663-2258-4b39-909c-1c52b4e217de\") " pod="openstack/cloudkitty-api-0" Oct 08 22:41:27 crc kubenswrapper[4739]: I1008 22:41:27.056693 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Oct 08 22:41:27 crc kubenswrapper[4739]: I1008 22:41:27.062653 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx5x9\" (UniqueName: \"kubernetes.io/projected/43526663-2258-4b39-909c-1c52b4e217de-kube-api-access-lx5x9\") pod \"cloudkitty-api-0\" (UID: \"43526663-2258-4b39-909c-1c52b4e217de\") " pod="openstack/cloudkitty-api-0" Oct 08 22:41:27 crc kubenswrapper[4739]: I1008 22:41:27.217663 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Oct 08 22:41:27 crc kubenswrapper[4739]: I1008 22:41:27.603680 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Oct 08 22:41:27 crc kubenswrapper[4739]: I1008 22:41:27.775753 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Oct 08 22:41:28 crc kubenswrapper[4739]: I1008 22:41:28.570674 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"5de0557c-aa06-41d3-8d90-76d22496c164","Type":"ContainerStarted","Data":"77a25707872f7fd42d389fa169d47b2e99e4ca1141a99720a1cd5b0732ce6047"} Oct 08 22:41:28 crc kubenswrapper[4739]: I1008 22:41:28.573097 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"43526663-2258-4b39-909c-1c52b4e217de","Type":"ContainerStarted","Data":"af43aee4416e340f11339b39d89be8f20f7104e2928e3e805ee462542580a823"} Oct 08 22:41:28 crc kubenswrapper[4739]: I1008 22:41:28.573142 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"43526663-2258-4b39-909c-1c52b4e217de","Type":"ContainerStarted","Data":"dcfb99e975d50f00af3731c5adb3fbd2461ad481fee4e7d136cd3173b139a9d7"} Oct 08 22:41:28 crc kubenswrapper[4739]: I1008 22:41:28.573172 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"43526663-2258-4b39-909c-1c52b4e217de","Type":"ContainerStarted","Data":"bd433977cfcb2c33ab1e70f27a0aa2e30c3e19d8b261ba396de906e7b55f1921"} Oct 08 22:41:28 crc kubenswrapper[4739]: I1008 22:41:28.573321 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Oct 08 22:41:28 crc kubenswrapper[4739]: I1008 22:41:28.596725 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=2.596704869 podStartE2EDuration="2.596704869s" 
podCreationTimestamp="2025-10-08 22:41:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:41:28.592116227 +0000 UTC m=+3188.417502017" watchObservedRunningTime="2025-10-08 22:41:28.596704869 +0000 UTC m=+3188.422090629" Oct 08 22:41:28 crc kubenswrapper[4739]: I1008 22:41:28.947854 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 08 22:41:30 crc kubenswrapper[4739]: I1008 22:41:30.317616 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-ingester-0" Oct 08 22:41:31 crc kubenswrapper[4739]: I1008 22:41:31.601398 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"5de0557c-aa06-41d3-8d90-76d22496c164","Type":"ContainerStarted","Data":"fa4d3b0712aa95fe492ff7f09289456fa0087666ded3d35a760f345e7e168c31"} Oct 08 22:41:31 crc kubenswrapper[4739]: I1008 22:41:31.620539 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.174371734 podStartE2EDuration="5.620519469s" podCreationTimestamp="2025-10-08 22:41:26 +0000 UTC" firstStartedPulling="2025-10-08 22:41:27.605028343 +0000 UTC m=+3187.430414103" lastFinishedPulling="2025-10-08 22:41:31.051176088 +0000 UTC m=+3190.876561838" observedRunningTime="2025-10-08 22:41:31.61487585 +0000 UTC m=+3191.440261620" watchObservedRunningTime="2025-10-08 22:41:31.620519469 +0000 UTC m=+3191.445905219" Oct 08 22:41:39 crc kubenswrapper[4739]: I1008 22:41:39.711089 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:41:39 crc kubenswrapper[4739]: I1008 22:41:39.712114 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bb8e455e-24e5-465b-ae83-87d51e01eb6a" containerName="ceilometer-notification-agent" 
containerID="cri-o://49cf330a86727ee6d73bacee6231512c6b3626eea160231e3de76e316eef10e1" gracePeriod=30 Oct 08 22:41:39 crc kubenswrapper[4739]: I1008 22:41:39.712134 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bb8e455e-24e5-465b-ae83-87d51e01eb6a" containerName="proxy-httpd" containerID="cri-o://04bd79fca58be49000d7ee01f55c190ecabbf83b7d67d7ba96c98ce1128e316e" gracePeriod=30 Oct 08 22:41:39 crc kubenswrapper[4739]: I1008 22:41:39.712133 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bb8e455e-24e5-465b-ae83-87d51e01eb6a" containerName="sg-core" containerID="cri-o://361eadd3c18cf7f9b20671dba908acc98886dd9bda46428a582697ae70b0316a" gracePeriod=30 Oct 08 22:41:39 crc kubenswrapper[4739]: I1008 22:41:39.712397 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bb8e455e-24e5-465b-ae83-87d51e01eb6a" containerName="ceilometer-central-agent" containerID="cri-o://8964aeb2b47a196cb805da8e53d547accfc96bbb717ccdb0b0176a7f34720efe" gracePeriod=30 Oct 08 22:41:39 crc kubenswrapper[4739]: I1008 22:41:39.825191 4739 scope.go:117] "RemoveContainer" containerID="ddb1d63bba56e73b176d2b1e90888b358b74bba2bf444f71cf0db8fe5e95ed26" Oct 08 22:41:39 crc kubenswrapper[4739]: E1008 22:41:39.825527 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.556991 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.684448 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvbnm\" (UniqueName: \"kubernetes.io/projected/bb8e455e-24e5-465b-ae83-87d51e01eb6a-kube-api-access-mvbnm\") pod \"bb8e455e-24e5-465b-ae83-87d51e01eb6a\" (UID: \"bb8e455e-24e5-465b-ae83-87d51e01eb6a\") " Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.684511 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb8e455e-24e5-465b-ae83-87d51e01eb6a-sg-core-conf-yaml\") pod \"bb8e455e-24e5-465b-ae83-87d51e01eb6a\" (UID: \"bb8e455e-24e5-465b-ae83-87d51e01eb6a\") " Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.684619 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8e455e-24e5-465b-ae83-87d51e01eb6a-config-data\") pod \"bb8e455e-24e5-465b-ae83-87d51e01eb6a\" (UID: \"bb8e455e-24e5-465b-ae83-87d51e01eb6a\") " Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.684696 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb8e455e-24e5-465b-ae83-87d51e01eb6a-scripts\") pod \"bb8e455e-24e5-465b-ae83-87d51e01eb6a\" (UID: \"bb8e455e-24e5-465b-ae83-87d51e01eb6a\") " Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.684729 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb8e455e-24e5-465b-ae83-87d51e01eb6a-run-httpd\") pod \"bb8e455e-24e5-465b-ae83-87d51e01eb6a\" (UID: \"bb8e455e-24e5-465b-ae83-87d51e01eb6a\") " Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.684770 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/bb8e455e-24e5-465b-ae83-87d51e01eb6a-log-httpd\") pod \"bb8e455e-24e5-465b-ae83-87d51e01eb6a\" (UID: \"bb8e455e-24e5-465b-ae83-87d51e01eb6a\") " Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.685511 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb8e455e-24e5-465b-ae83-87d51e01eb6a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bb8e455e-24e5-465b-ae83-87d51e01eb6a" (UID: "bb8e455e-24e5-465b-ae83-87d51e01eb6a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.685601 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb8e455e-24e5-465b-ae83-87d51e01eb6a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bb8e455e-24e5-465b-ae83-87d51e01eb6a" (UID: "bb8e455e-24e5-465b-ae83-87d51e01eb6a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.691406 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8e455e-24e5-465b-ae83-87d51e01eb6a-scripts" (OuterVolumeSpecName: "scripts") pod "bb8e455e-24e5-465b-ae83-87d51e01eb6a" (UID: "bb8e455e-24e5-465b-ae83-87d51e01eb6a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.691535 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb8e455e-24e5-465b-ae83-87d51e01eb6a-kube-api-access-mvbnm" (OuterVolumeSpecName: "kube-api-access-mvbnm") pod "bb8e455e-24e5-465b-ae83-87d51e01eb6a" (UID: "bb8e455e-24e5-465b-ae83-87d51e01eb6a"). InnerVolumeSpecName "kube-api-access-mvbnm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.717476 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8e455e-24e5-465b-ae83-87d51e01eb6a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bb8e455e-24e5-465b-ae83-87d51e01eb6a" (UID: "bb8e455e-24e5-465b-ae83-87d51e01eb6a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.720721 4739 generic.go:334] "Generic (PLEG): container finished" podID="bb8e455e-24e5-465b-ae83-87d51e01eb6a" containerID="04bd79fca58be49000d7ee01f55c190ecabbf83b7d67d7ba96c98ce1128e316e" exitCode=0 Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.721092 4739 generic.go:334] "Generic (PLEG): container finished" podID="bb8e455e-24e5-465b-ae83-87d51e01eb6a" containerID="361eadd3c18cf7f9b20671dba908acc98886dd9bda46428a582697ae70b0316a" exitCode=2 Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.721102 4739 generic.go:334] "Generic (PLEG): container finished" podID="bb8e455e-24e5-465b-ae83-87d51e01eb6a" containerID="49cf330a86727ee6d73bacee6231512c6b3626eea160231e3de76e316eef10e1" exitCode=0 Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.721111 4739 generic.go:334] "Generic (PLEG): container finished" podID="bb8e455e-24e5-465b-ae83-87d51e01eb6a" containerID="8964aeb2b47a196cb805da8e53d547accfc96bbb717ccdb0b0176a7f34720efe" exitCode=0 Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.720797 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb8e455e-24e5-465b-ae83-87d51e01eb6a","Type":"ContainerDied","Data":"04bd79fca58be49000d7ee01f55c190ecabbf83b7d67d7ba96c98ce1128e316e"} Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.720778 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.721156 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb8e455e-24e5-465b-ae83-87d51e01eb6a","Type":"ContainerDied","Data":"361eadd3c18cf7f9b20671dba908acc98886dd9bda46428a582697ae70b0316a"} Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.721189 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb8e455e-24e5-465b-ae83-87d51e01eb6a","Type":"ContainerDied","Data":"49cf330a86727ee6d73bacee6231512c6b3626eea160231e3de76e316eef10e1"} Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.721199 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb8e455e-24e5-465b-ae83-87d51e01eb6a","Type":"ContainerDied","Data":"8964aeb2b47a196cb805da8e53d547accfc96bbb717ccdb0b0176a7f34720efe"} Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.721211 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb8e455e-24e5-465b-ae83-87d51e01eb6a","Type":"ContainerDied","Data":"1c010394828d817b22e7fa8ddaf1cea24393a8a4dee70fe4f1ed287eec8cc08f"} Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.721231 4739 scope.go:117] "RemoveContainer" containerID="04bd79fca58be49000d7ee01f55c190ecabbf83b7d67d7ba96c98ce1128e316e" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.752376 4739 scope.go:117] "RemoveContainer" containerID="361eadd3c18cf7f9b20671dba908acc98886dd9bda46428a582697ae70b0316a" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.787655 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvbnm\" (UniqueName: \"kubernetes.io/projected/bb8e455e-24e5-465b-ae83-87d51e01eb6a-kube-api-access-mvbnm\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.787691 4739 reconciler_common.go:293] "Volume detached for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb8e455e-24e5-465b-ae83-87d51e01eb6a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.787703 4739 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb8e455e-24e5-465b-ae83-87d51e01eb6a-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.787712 4739 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb8e455e-24e5-465b-ae83-87d51e01eb6a-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.787724 4739 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb8e455e-24e5-465b-ae83-87d51e01eb6a-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.788578 4739 scope.go:117] "RemoveContainer" containerID="49cf330a86727ee6d73bacee6231512c6b3626eea160231e3de76e316eef10e1" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.806263 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8e455e-24e5-465b-ae83-87d51e01eb6a-config-data" (OuterVolumeSpecName: "config-data") pod "bb8e455e-24e5-465b-ae83-87d51e01eb6a" (UID: "bb8e455e-24e5-465b-ae83-87d51e01eb6a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.808225 4739 scope.go:117] "RemoveContainer" containerID="8964aeb2b47a196cb805da8e53d547accfc96bbb717ccdb0b0176a7f34720efe" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.833435 4739 scope.go:117] "RemoveContainer" containerID="04bd79fca58be49000d7ee01f55c190ecabbf83b7d67d7ba96c98ce1128e316e" Oct 08 22:41:40 crc kubenswrapper[4739]: E1008 22:41:40.838194 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04bd79fca58be49000d7ee01f55c190ecabbf83b7d67d7ba96c98ce1128e316e\": container with ID starting with 04bd79fca58be49000d7ee01f55c190ecabbf83b7d67d7ba96c98ce1128e316e not found: ID does not exist" containerID="04bd79fca58be49000d7ee01f55c190ecabbf83b7d67d7ba96c98ce1128e316e" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.838239 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04bd79fca58be49000d7ee01f55c190ecabbf83b7d67d7ba96c98ce1128e316e"} err="failed to get container status \"04bd79fca58be49000d7ee01f55c190ecabbf83b7d67d7ba96c98ce1128e316e\": rpc error: code = NotFound desc = could not find container \"04bd79fca58be49000d7ee01f55c190ecabbf83b7d67d7ba96c98ce1128e316e\": container with ID starting with 04bd79fca58be49000d7ee01f55c190ecabbf83b7d67d7ba96c98ce1128e316e not found: ID does not exist" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.838270 4739 scope.go:117] "RemoveContainer" containerID="361eadd3c18cf7f9b20671dba908acc98886dd9bda46428a582697ae70b0316a" Oct 08 22:41:40 crc kubenswrapper[4739]: E1008 22:41:40.838681 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"361eadd3c18cf7f9b20671dba908acc98886dd9bda46428a582697ae70b0316a\": container with ID starting with 
361eadd3c18cf7f9b20671dba908acc98886dd9bda46428a582697ae70b0316a not found: ID does not exist" containerID="361eadd3c18cf7f9b20671dba908acc98886dd9bda46428a582697ae70b0316a" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.838729 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"361eadd3c18cf7f9b20671dba908acc98886dd9bda46428a582697ae70b0316a"} err="failed to get container status \"361eadd3c18cf7f9b20671dba908acc98886dd9bda46428a582697ae70b0316a\": rpc error: code = NotFound desc = could not find container \"361eadd3c18cf7f9b20671dba908acc98886dd9bda46428a582697ae70b0316a\": container with ID starting with 361eadd3c18cf7f9b20671dba908acc98886dd9bda46428a582697ae70b0316a not found: ID does not exist" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.838758 4739 scope.go:117] "RemoveContainer" containerID="49cf330a86727ee6d73bacee6231512c6b3626eea160231e3de76e316eef10e1" Oct 08 22:41:40 crc kubenswrapper[4739]: E1008 22:41:40.841778 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49cf330a86727ee6d73bacee6231512c6b3626eea160231e3de76e316eef10e1\": container with ID starting with 49cf330a86727ee6d73bacee6231512c6b3626eea160231e3de76e316eef10e1 not found: ID does not exist" containerID="49cf330a86727ee6d73bacee6231512c6b3626eea160231e3de76e316eef10e1" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.841803 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49cf330a86727ee6d73bacee6231512c6b3626eea160231e3de76e316eef10e1"} err="failed to get container status \"49cf330a86727ee6d73bacee6231512c6b3626eea160231e3de76e316eef10e1\": rpc error: code = NotFound desc = could not find container \"49cf330a86727ee6d73bacee6231512c6b3626eea160231e3de76e316eef10e1\": container with ID starting with 49cf330a86727ee6d73bacee6231512c6b3626eea160231e3de76e316eef10e1 not found: ID does not 
exist" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.841823 4739 scope.go:117] "RemoveContainer" containerID="8964aeb2b47a196cb805da8e53d547accfc96bbb717ccdb0b0176a7f34720efe" Oct 08 22:41:40 crc kubenswrapper[4739]: E1008 22:41:40.842175 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8964aeb2b47a196cb805da8e53d547accfc96bbb717ccdb0b0176a7f34720efe\": container with ID starting with 8964aeb2b47a196cb805da8e53d547accfc96bbb717ccdb0b0176a7f34720efe not found: ID does not exist" containerID="8964aeb2b47a196cb805da8e53d547accfc96bbb717ccdb0b0176a7f34720efe" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.842216 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8964aeb2b47a196cb805da8e53d547accfc96bbb717ccdb0b0176a7f34720efe"} err="failed to get container status \"8964aeb2b47a196cb805da8e53d547accfc96bbb717ccdb0b0176a7f34720efe\": rpc error: code = NotFound desc = could not find container \"8964aeb2b47a196cb805da8e53d547accfc96bbb717ccdb0b0176a7f34720efe\": container with ID starting with 8964aeb2b47a196cb805da8e53d547accfc96bbb717ccdb0b0176a7f34720efe not found: ID does not exist" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.842245 4739 scope.go:117] "RemoveContainer" containerID="04bd79fca58be49000d7ee01f55c190ecabbf83b7d67d7ba96c98ce1128e316e" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.842701 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04bd79fca58be49000d7ee01f55c190ecabbf83b7d67d7ba96c98ce1128e316e"} err="failed to get container status \"04bd79fca58be49000d7ee01f55c190ecabbf83b7d67d7ba96c98ce1128e316e\": rpc error: code = NotFound desc = could not find container \"04bd79fca58be49000d7ee01f55c190ecabbf83b7d67d7ba96c98ce1128e316e\": container with ID starting with 04bd79fca58be49000d7ee01f55c190ecabbf83b7d67d7ba96c98ce1128e316e not found: ID 
does not exist" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.842727 4739 scope.go:117] "RemoveContainer" containerID="361eadd3c18cf7f9b20671dba908acc98886dd9bda46428a582697ae70b0316a" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.843003 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"361eadd3c18cf7f9b20671dba908acc98886dd9bda46428a582697ae70b0316a"} err="failed to get container status \"361eadd3c18cf7f9b20671dba908acc98886dd9bda46428a582697ae70b0316a\": rpc error: code = NotFound desc = could not find container \"361eadd3c18cf7f9b20671dba908acc98886dd9bda46428a582697ae70b0316a\": container with ID starting with 361eadd3c18cf7f9b20671dba908acc98886dd9bda46428a582697ae70b0316a not found: ID does not exist" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.843033 4739 scope.go:117] "RemoveContainer" containerID="49cf330a86727ee6d73bacee6231512c6b3626eea160231e3de76e316eef10e1" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.843293 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49cf330a86727ee6d73bacee6231512c6b3626eea160231e3de76e316eef10e1"} err="failed to get container status \"49cf330a86727ee6d73bacee6231512c6b3626eea160231e3de76e316eef10e1\": rpc error: code = NotFound desc = could not find container \"49cf330a86727ee6d73bacee6231512c6b3626eea160231e3de76e316eef10e1\": container with ID starting with 49cf330a86727ee6d73bacee6231512c6b3626eea160231e3de76e316eef10e1 not found: ID does not exist" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.843313 4739 scope.go:117] "RemoveContainer" containerID="8964aeb2b47a196cb805da8e53d547accfc96bbb717ccdb0b0176a7f34720efe" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.843498 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8964aeb2b47a196cb805da8e53d547accfc96bbb717ccdb0b0176a7f34720efe"} err="failed to get container 
status \"8964aeb2b47a196cb805da8e53d547accfc96bbb717ccdb0b0176a7f34720efe\": rpc error: code = NotFound desc = could not find container \"8964aeb2b47a196cb805da8e53d547accfc96bbb717ccdb0b0176a7f34720efe\": container with ID starting with 8964aeb2b47a196cb805da8e53d547accfc96bbb717ccdb0b0176a7f34720efe not found: ID does not exist" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.843526 4739 scope.go:117] "RemoveContainer" containerID="04bd79fca58be49000d7ee01f55c190ecabbf83b7d67d7ba96c98ce1128e316e" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.843756 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04bd79fca58be49000d7ee01f55c190ecabbf83b7d67d7ba96c98ce1128e316e"} err="failed to get container status \"04bd79fca58be49000d7ee01f55c190ecabbf83b7d67d7ba96c98ce1128e316e\": rpc error: code = NotFound desc = could not find container \"04bd79fca58be49000d7ee01f55c190ecabbf83b7d67d7ba96c98ce1128e316e\": container with ID starting with 04bd79fca58be49000d7ee01f55c190ecabbf83b7d67d7ba96c98ce1128e316e not found: ID does not exist" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.843783 4739 scope.go:117] "RemoveContainer" containerID="361eadd3c18cf7f9b20671dba908acc98886dd9bda46428a582697ae70b0316a" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.844063 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"361eadd3c18cf7f9b20671dba908acc98886dd9bda46428a582697ae70b0316a"} err="failed to get container status \"361eadd3c18cf7f9b20671dba908acc98886dd9bda46428a582697ae70b0316a\": rpc error: code = NotFound desc = could not find container \"361eadd3c18cf7f9b20671dba908acc98886dd9bda46428a582697ae70b0316a\": container with ID starting with 361eadd3c18cf7f9b20671dba908acc98886dd9bda46428a582697ae70b0316a not found: ID does not exist" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.844094 4739 scope.go:117] "RemoveContainer" 
containerID="49cf330a86727ee6d73bacee6231512c6b3626eea160231e3de76e316eef10e1" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.844446 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49cf330a86727ee6d73bacee6231512c6b3626eea160231e3de76e316eef10e1"} err="failed to get container status \"49cf330a86727ee6d73bacee6231512c6b3626eea160231e3de76e316eef10e1\": rpc error: code = NotFound desc = could not find container \"49cf330a86727ee6d73bacee6231512c6b3626eea160231e3de76e316eef10e1\": container with ID starting with 49cf330a86727ee6d73bacee6231512c6b3626eea160231e3de76e316eef10e1 not found: ID does not exist" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.844470 4739 scope.go:117] "RemoveContainer" containerID="8964aeb2b47a196cb805da8e53d547accfc96bbb717ccdb0b0176a7f34720efe" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.844679 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8964aeb2b47a196cb805da8e53d547accfc96bbb717ccdb0b0176a7f34720efe"} err="failed to get container status \"8964aeb2b47a196cb805da8e53d547accfc96bbb717ccdb0b0176a7f34720efe\": rpc error: code = NotFound desc = could not find container \"8964aeb2b47a196cb805da8e53d547accfc96bbb717ccdb0b0176a7f34720efe\": container with ID starting with 8964aeb2b47a196cb805da8e53d547accfc96bbb717ccdb0b0176a7f34720efe not found: ID does not exist" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.844707 4739 scope.go:117] "RemoveContainer" containerID="04bd79fca58be49000d7ee01f55c190ecabbf83b7d67d7ba96c98ce1128e316e" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.844943 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04bd79fca58be49000d7ee01f55c190ecabbf83b7d67d7ba96c98ce1128e316e"} err="failed to get container status \"04bd79fca58be49000d7ee01f55c190ecabbf83b7d67d7ba96c98ce1128e316e\": rpc error: code = NotFound desc = could 
not find container \"04bd79fca58be49000d7ee01f55c190ecabbf83b7d67d7ba96c98ce1128e316e\": container with ID starting with 04bd79fca58be49000d7ee01f55c190ecabbf83b7d67d7ba96c98ce1128e316e not found: ID does not exist" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.844967 4739 scope.go:117] "RemoveContainer" containerID="361eadd3c18cf7f9b20671dba908acc98886dd9bda46428a582697ae70b0316a" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.845219 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"361eadd3c18cf7f9b20671dba908acc98886dd9bda46428a582697ae70b0316a"} err="failed to get container status \"361eadd3c18cf7f9b20671dba908acc98886dd9bda46428a582697ae70b0316a\": rpc error: code = NotFound desc = could not find container \"361eadd3c18cf7f9b20671dba908acc98886dd9bda46428a582697ae70b0316a\": container with ID starting with 361eadd3c18cf7f9b20671dba908acc98886dd9bda46428a582697ae70b0316a not found: ID does not exist" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.845242 4739 scope.go:117] "RemoveContainer" containerID="49cf330a86727ee6d73bacee6231512c6b3626eea160231e3de76e316eef10e1" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.845463 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49cf330a86727ee6d73bacee6231512c6b3626eea160231e3de76e316eef10e1"} err="failed to get container status \"49cf330a86727ee6d73bacee6231512c6b3626eea160231e3de76e316eef10e1\": rpc error: code = NotFound desc = could not find container \"49cf330a86727ee6d73bacee6231512c6b3626eea160231e3de76e316eef10e1\": container with ID starting with 49cf330a86727ee6d73bacee6231512c6b3626eea160231e3de76e316eef10e1 not found: ID does not exist" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.845494 4739 scope.go:117] "RemoveContainer" containerID="8964aeb2b47a196cb805da8e53d547accfc96bbb717ccdb0b0176a7f34720efe" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 
22:41:40.845703 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8964aeb2b47a196cb805da8e53d547accfc96bbb717ccdb0b0176a7f34720efe"} err="failed to get container status \"8964aeb2b47a196cb805da8e53d547accfc96bbb717ccdb0b0176a7f34720efe\": rpc error: code = NotFound desc = could not find container \"8964aeb2b47a196cb805da8e53d547accfc96bbb717ccdb0b0176a7f34720efe\": container with ID starting with 8964aeb2b47a196cb805da8e53d547accfc96bbb717ccdb0b0176a7f34720efe not found: ID does not exist" Oct 08 22:41:40 crc kubenswrapper[4739]: I1008 22:41:40.889510 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8e455e-24e5-465b-ae83-87d51e01eb6a-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.077386 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.093901 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.103186 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:41:41 crc kubenswrapper[4739]: E1008 22:41:41.103740 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8e455e-24e5-465b-ae83-87d51e01eb6a" containerName="sg-core" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.103765 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8e455e-24e5-465b-ae83-87d51e01eb6a" containerName="sg-core" Oct 08 22:41:41 crc kubenswrapper[4739]: E1008 22:41:41.103793 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8e455e-24e5-465b-ae83-87d51e01eb6a" containerName="ceilometer-notification-agent" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.103800 4739 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bb8e455e-24e5-465b-ae83-87d51e01eb6a" containerName="ceilometer-notification-agent" Oct 08 22:41:41 crc kubenswrapper[4739]: E1008 22:41:41.103831 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8e455e-24e5-465b-ae83-87d51e01eb6a" containerName="proxy-httpd" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.103869 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8e455e-24e5-465b-ae83-87d51e01eb6a" containerName="proxy-httpd" Oct 08 22:41:41 crc kubenswrapper[4739]: E1008 22:41:41.103884 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8e455e-24e5-465b-ae83-87d51e01eb6a" containerName="ceilometer-central-agent" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.103890 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8e455e-24e5-465b-ae83-87d51e01eb6a" containerName="ceilometer-central-agent" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.104103 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb8e455e-24e5-465b-ae83-87d51e01eb6a" containerName="ceilometer-notification-agent" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.104130 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb8e455e-24e5-465b-ae83-87d51e01eb6a" containerName="ceilometer-central-agent" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.104150 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb8e455e-24e5-465b-ae83-87d51e01eb6a" containerName="proxy-httpd" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.104223 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb8e455e-24e5-465b-ae83-87d51e01eb6a" containerName="sg-core" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.106245 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.111045 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-jcpb9" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.111309 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.111968 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.115495 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.195390 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7a33944-ef03-44c5-91a6-45cf63c795f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e7a33944-ef03-44c5-91a6-45cf63c795f8\") " pod="openstack/ceilometer-0" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.195443 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb99z\" (UniqueName: \"kubernetes.io/projected/e7a33944-ef03-44c5-91a6-45cf63c795f8-kube-api-access-wb99z\") pod \"ceilometer-0\" (UID: \"e7a33944-ef03-44c5-91a6-45cf63c795f8\") " pod="openstack/ceilometer-0" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.195594 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7a33944-ef03-44c5-91a6-45cf63c795f8-run-httpd\") pod \"ceilometer-0\" (UID: \"e7a33944-ef03-44c5-91a6-45cf63c795f8\") " pod="openstack/ceilometer-0" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.195809 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7a33944-ef03-44c5-91a6-45cf63c795f8-config-data\") pod \"ceilometer-0\" (UID: \"e7a33944-ef03-44c5-91a6-45cf63c795f8\") " pod="openstack/ceilometer-0" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.195989 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7a33944-ef03-44c5-91a6-45cf63c795f8-log-httpd\") pod \"ceilometer-0\" (UID: \"e7a33944-ef03-44c5-91a6-45cf63c795f8\") " pod="openstack/ceilometer-0" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.196224 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7a33944-ef03-44c5-91a6-45cf63c795f8-scripts\") pod \"ceilometer-0\" (UID: \"e7a33944-ef03-44c5-91a6-45cf63c795f8\") " pod="openstack/ceilometer-0" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.298150 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7a33944-ef03-44c5-91a6-45cf63c795f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e7a33944-ef03-44c5-91a6-45cf63c795f8\") " pod="openstack/ceilometer-0" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.298232 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb99z\" (UniqueName: \"kubernetes.io/projected/e7a33944-ef03-44c5-91a6-45cf63c795f8-kube-api-access-wb99z\") pod \"ceilometer-0\" (UID: \"e7a33944-ef03-44c5-91a6-45cf63c795f8\") " pod="openstack/ceilometer-0" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.298278 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7a33944-ef03-44c5-91a6-45cf63c795f8-run-httpd\") pod \"ceilometer-0\" (UID: \"e7a33944-ef03-44c5-91a6-45cf63c795f8\") " 
pod="openstack/ceilometer-0" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.298349 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7a33944-ef03-44c5-91a6-45cf63c795f8-config-data\") pod \"ceilometer-0\" (UID: \"e7a33944-ef03-44c5-91a6-45cf63c795f8\") " pod="openstack/ceilometer-0" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.298415 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7a33944-ef03-44c5-91a6-45cf63c795f8-log-httpd\") pod \"ceilometer-0\" (UID: \"e7a33944-ef03-44c5-91a6-45cf63c795f8\") " pod="openstack/ceilometer-0" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.298494 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7a33944-ef03-44c5-91a6-45cf63c795f8-scripts\") pod \"ceilometer-0\" (UID: \"e7a33944-ef03-44c5-91a6-45cf63c795f8\") " pod="openstack/ceilometer-0" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.298847 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7a33944-ef03-44c5-91a6-45cf63c795f8-run-httpd\") pod \"ceilometer-0\" (UID: \"e7a33944-ef03-44c5-91a6-45cf63c795f8\") " pod="openstack/ceilometer-0" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.298921 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7a33944-ef03-44c5-91a6-45cf63c795f8-log-httpd\") pod \"ceilometer-0\" (UID: \"e7a33944-ef03-44c5-91a6-45cf63c795f8\") " pod="openstack/ceilometer-0" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.302698 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7a33944-ef03-44c5-91a6-45cf63c795f8-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"e7a33944-ef03-44c5-91a6-45cf63c795f8\") " pod="openstack/ceilometer-0" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.302728 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7a33944-ef03-44c5-91a6-45cf63c795f8-scripts\") pod \"ceilometer-0\" (UID: \"e7a33944-ef03-44c5-91a6-45cf63c795f8\") " pod="openstack/ceilometer-0" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.311233 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7a33944-ef03-44c5-91a6-45cf63c795f8-config-data\") pod \"ceilometer-0\" (UID: \"e7a33944-ef03-44c5-91a6-45cf63c795f8\") " pod="openstack/ceilometer-0" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.314573 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb99z\" (UniqueName: \"kubernetes.io/projected/e7a33944-ef03-44c5-91a6-45cf63c795f8-kube-api-access-wb99z\") pod \"ceilometer-0\" (UID: \"e7a33944-ef03-44c5-91a6-45cf63c795f8\") " pod="openstack/ceilometer-0" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.425925 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.833345 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb8e455e-24e5-465b-ae83-87d51e01eb6a" path="/var/lib/kubelet/pods/bb8e455e-24e5-465b-ae83-87d51e01eb6a/volumes" Oct 08 22:41:41 crc kubenswrapper[4739]: I1008 22:41:41.904622 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:41:41 crc kubenswrapper[4739]: W1008 22:41:41.905831 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7a33944_ef03_44c5_91a6_45cf63c795f8.slice/crio-7f26fb8b0fed58d1fcb5c408e2450aea88d9110f1f20e67c4f16ea691549b726 WatchSource:0}: Error finding container 7f26fb8b0fed58d1fcb5c408e2450aea88d9110f1f20e67c4f16ea691549b726: Status 404 returned error can't find the container with id 7f26fb8b0fed58d1fcb5c408e2450aea88d9110f1f20e67c4f16ea691549b726 Oct 08 22:41:42 crc kubenswrapper[4739]: I1008 22:41:42.744385 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e7a33944-ef03-44c5-91a6-45cf63c795f8","Type":"ContainerStarted","Data":"7f26fb8b0fed58d1fcb5c408e2450aea88d9110f1f20e67c4f16ea691549b726"} Oct 08 22:41:44 crc kubenswrapper[4739]: I1008 22:41:44.778030 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e7a33944-ef03-44c5-91a6-45cf63c795f8","Type":"ContainerStarted","Data":"b378e9917a6b8c80d9c43514e8e302a7fc02fac92d037fbc82ed9b1a484d2178"} Oct 08 22:41:44 crc kubenswrapper[4739]: I1008 22:41:44.778644 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e7a33944-ef03-44c5-91a6-45cf63c795f8","Type":"ContainerStarted","Data":"93dc872bf989bbbb17358a999c6bc7f70ca298590e79335d2d8c5e414cea81a8"} Oct 08 22:41:45 crc kubenswrapper[4739]: I1008 22:41:45.790167 4739 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"e7a33944-ef03-44c5-91a6-45cf63c795f8","Type":"ContainerStarted","Data":"2760cc6eaf321b1c9d7e07cb7bc0d6d99d00ddc084d61ecb4cd9f0518c3838c9"} Oct 08 22:41:46 crc kubenswrapper[4739]: I1008 22:41:46.805539 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e7a33944-ef03-44c5-91a6-45cf63c795f8","Type":"ContainerStarted","Data":"cc566ab948a9471f0f9060e58d227f436302c3662f72affc71d92702f3cf9b7a"} Oct 08 22:41:46 crc kubenswrapper[4739]: I1008 22:41:46.805971 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 22:41:46 crc kubenswrapper[4739]: I1008 22:41:46.848020 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.836944125 podStartE2EDuration="5.84799885s" podCreationTimestamp="2025-10-08 22:41:41 +0000 UTC" firstStartedPulling="2025-10-08 22:41:41.908471513 +0000 UTC m=+3201.733857263" lastFinishedPulling="2025-10-08 22:41:45.919526238 +0000 UTC m=+3205.744911988" observedRunningTime="2025-10-08 22:41:46.827599439 +0000 UTC m=+3206.652985189" watchObservedRunningTime="2025-10-08 22:41:46.84799885 +0000 UTC m=+3206.673384620" Oct 08 22:41:51 crc kubenswrapper[4739]: I1008 22:41:51.830164 4739 scope.go:117] "RemoveContainer" containerID="ddb1d63bba56e73b176d2b1e90888b358b74bba2bf444f71cf0db8fe5e95ed26" Oct 08 22:41:51 crc kubenswrapper[4739]: E1008 22:41:51.831214 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:42:04 crc kubenswrapper[4739]: I1008 22:42:04.074536 4739 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0" Oct 08 22:42:05 crc kubenswrapper[4739]: I1008 22:42:05.823631 4739 scope.go:117] "RemoveContainer" containerID="ddb1d63bba56e73b176d2b1e90888b358b74bba2bf444f71cf0db8fe5e95ed26" Oct 08 22:42:05 crc kubenswrapper[4739]: E1008 22:42:05.824103 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:42:11 crc kubenswrapper[4739]: I1008 22:42:11.430657 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 08 22:42:19 crc kubenswrapper[4739]: I1008 22:42:19.822208 4739 scope.go:117] "RemoveContainer" containerID="ddb1d63bba56e73b176d2b1e90888b358b74bba2bf444f71cf0db8fe5e95ed26" Oct 08 22:42:19 crc kubenswrapper[4739]: E1008 22:42:19.823122 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:42:30 crc kubenswrapper[4739]: I1008 22:42:30.821712 4739 scope.go:117] "RemoveContainer" containerID="ddb1d63bba56e73b176d2b1e90888b358b74bba2bf444f71cf0db8fe5e95ed26" Oct 08 22:42:30 crc kubenswrapper[4739]: E1008 22:42:30.822653 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:42:43 crc kubenswrapper[4739]: I1008 22:42:43.822385 4739 scope.go:117] "RemoveContainer" containerID="ddb1d63bba56e73b176d2b1e90888b358b74bba2bf444f71cf0db8fe5e95ed26" Oct 08 22:42:43 crc kubenswrapper[4739]: E1008 22:42:43.823700 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:42:54 crc kubenswrapper[4739]: I1008 22:42:54.822051 4739 scope.go:117] "RemoveContainer" containerID="ddb1d63bba56e73b176d2b1e90888b358b74bba2bf444f71cf0db8fe5e95ed26" Oct 08 22:42:54 crc kubenswrapper[4739]: E1008 22:42:54.822975 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:42:55 crc kubenswrapper[4739]: I1008 22:42:55.999471 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rk9jz"] Oct 08 22:42:56 crc kubenswrapper[4739]: I1008 22:42:56.002582 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rk9jz" Oct 08 22:42:56 crc kubenswrapper[4739]: I1008 22:42:56.035910 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rk9jz"] Oct 08 22:42:56 crc kubenswrapper[4739]: I1008 22:42:56.048998 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6g22\" (UniqueName: \"kubernetes.io/projected/36001358-3fff-491b-8a12-3713fdb53147-kube-api-access-z6g22\") pod \"redhat-operators-rk9jz\" (UID: \"36001358-3fff-491b-8a12-3713fdb53147\") " pod="openshift-marketplace/redhat-operators-rk9jz" Oct 08 22:42:56 crc kubenswrapper[4739]: I1008 22:42:56.049209 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36001358-3fff-491b-8a12-3713fdb53147-utilities\") pod \"redhat-operators-rk9jz\" (UID: \"36001358-3fff-491b-8a12-3713fdb53147\") " pod="openshift-marketplace/redhat-operators-rk9jz" Oct 08 22:42:56 crc kubenswrapper[4739]: I1008 22:42:56.049494 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36001358-3fff-491b-8a12-3713fdb53147-catalog-content\") pod \"redhat-operators-rk9jz\" (UID: \"36001358-3fff-491b-8a12-3713fdb53147\") " pod="openshift-marketplace/redhat-operators-rk9jz" Oct 08 22:42:56 crc kubenswrapper[4739]: I1008 22:42:56.151882 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36001358-3fff-491b-8a12-3713fdb53147-catalog-content\") pod \"redhat-operators-rk9jz\" (UID: \"36001358-3fff-491b-8a12-3713fdb53147\") " pod="openshift-marketplace/redhat-operators-rk9jz" Oct 08 22:42:56 crc kubenswrapper[4739]: I1008 22:42:56.152278 4739 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-z6g22\" (UniqueName: \"kubernetes.io/projected/36001358-3fff-491b-8a12-3713fdb53147-kube-api-access-z6g22\") pod \"redhat-operators-rk9jz\" (UID: \"36001358-3fff-491b-8a12-3713fdb53147\") " pod="openshift-marketplace/redhat-operators-rk9jz" Oct 08 22:42:56 crc kubenswrapper[4739]: I1008 22:42:56.152378 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36001358-3fff-491b-8a12-3713fdb53147-catalog-content\") pod \"redhat-operators-rk9jz\" (UID: \"36001358-3fff-491b-8a12-3713fdb53147\") " pod="openshift-marketplace/redhat-operators-rk9jz" Oct 08 22:42:56 crc kubenswrapper[4739]: I1008 22:42:56.152461 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36001358-3fff-491b-8a12-3713fdb53147-utilities\") pod \"redhat-operators-rk9jz\" (UID: \"36001358-3fff-491b-8a12-3713fdb53147\") " pod="openshift-marketplace/redhat-operators-rk9jz" Oct 08 22:42:56 crc kubenswrapper[4739]: I1008 22:42:56.152891 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36001358-3fff-491b-8a12-3713fdb53147-utilities\") pod \"redhat-operators-rk9jz\" (UID: \"36001358-3fff-491b-8a12-3713fdb53147\") " pod="openshift-marketplace/redhat-operators-rk9jz" Oct 08 22:42:56 crc kubenswrapper[4739]: I1008 22:42:56.171338 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6g22\" (UniqueName: \"kubernetes.io/projected/36001358-3fff-491b-8a12-3713fdb53147-kube-api-access-z6g22\") pod \"redhat-operators-rk9jz\" (UID: \"36001358-3fff-491b-8a12-3713fdb53147\") " pod="openshift-marketplace/redhat-operators-rk9jz" Oct 08 22:42:56 crc kubenswrapper[4739]: I1008 22:42:56.358466 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rk9jz" Oct 08 22:42:56 crc kubenswrapper[4739]: I1008 22:42:56.860079 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rk9jz"] Oct 08 22:42:57 crc kubenswrapper[4739]: I1008 22:42:57.593405 4739 generic.go:334] "Generic (PLEG): container finished" podID="36001358-3fff-491b-8a12-3713fdb53147" containerID="def4fc1ed67dbd4dabd871a2537c9a45d64231778f5eccb1b7dbc184b16fd5f9" exitCode=0 Oct 08 22:42:57 crc kubenswrapper[4739]: I1008 22:42:57.593492 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rk9jz" event={"ID":"36001358-3fff-491b-8a12-3713fdb53147","Type":"ContainerDied","Data":"def4fc1ed67dbd4dabd871a2537c9a45d64231778f5eccb1b7dbc184b16fd5f9"} Oct 08 22:42:57 crc kubenswrapper[4739]: I1008 22:42:57.593765 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rk9jz" event={"ID":"36001358-3fff-491b-8a12-3713fdb53147","Type":"ContainerStarted","Data":"aae21ea9e92d81a45b943864734f02eac6c24e67633decacf7ceec1452211728"} Oct 08 22:42:57 crc kubenswrapper[4739]: I1008 22:42:57.596361 4739 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 22:42:59 crc kubenswrapper[4739]: I1008 22:42:59.615330 4739 generic.go:334] "Generic (PLEG): container finished" podID="36001358-3fff-491b-8a12-3713fdb53147" containerID="1ce4fb1c3ef8a5e09bb10b03a30af9924d45d53a47b00e9ac34b3ee3ba1ab0d7" exitCode=0 Oct 08 22:42:59 crc kubenswrapper[4739]: I1008 22:42:59.615437 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rk9jz" event={"ID":"36001358-3fff-491b-8a12-3713fdb53147","Type":"ContainerDied","Data":"1ce4fb1c3ef8a5e09bb10b03a30af9924d45d53a47b00e9ac34b3ee3ba1ab0d7"} Oct 08 22:43:05 crc kubenswrapper[4739]: I1008 22:43:05.710012 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-rk9jz" event={"ID":"36001358-3fff-491b-8a12-3713fdb53147","Type":"ContainerStarted","Data":"67e1888757ad8796ec6a52879de67ca6d22f18cc2260b44027cfe34139315eec"} Oct 08 22:43:05 crc kubenswrapper[4739]: I1008 22:43:05.730487 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rk9jz" podStartSLOduration=3.693164009 podStartE2EDuration="10.730456287s" podCreationTimestamp="2025-10-08 22:42:55 +0000 UTC" firstStartedPulling="2025-10-08 22:42:57.595786636 +0000 UTC m=+3277.421172406" lastFinishedPulling="2025-10-08 22:43:04.633078914 +0000 UTC m=+3284.458464684" observedRunningTime="2025-10-08 22:43:05.725008433 +0000 UTC m=+3285.550394183" watchObservedRunningTime="2025-10-08 22:43:05.730456287 +0000 UTC m=+3285.555842037" Oct 08 22:43:06 crc kubenswrapper[4739]: I1008 22:43:06.358953 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rk9jz" Oct 08 22:43:06 crc kubenswrapper[4739]: I1008 22:43:06.359019 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rk9jz" Oct 08 22:43:07 crc kubenswrapper[4739]: I1008 22:43:07.408296 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rk9jz" podUID="36001358-3fff-491b-8a12-3713fdb53147" containerName="registry-server" probeResult="failure" output=< Oct 08 22:43:07 crc kubenswrapper[4739]: timeout: failed to connect service ":50051" within 1s Oct 08 22:43:07 crc kubenswrapper[4739]: > Oct 08 22:43:07 crc kubenswrapper[4739]: I1008 22:43:07.788818 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 08 22:43:07 crc kubenswrapper[4739]: I1008 22:43:07.791640 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 08 22:43:07 crc kubenswrapper[4739]: I1008 22:43:07.793893 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-db7xc" Oct 08 22:43:07 crc kubenswrapper[4739]: I1008 22:43:07.798728 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 08 22:43:07 crc kubenswrapper[4739]: I1008 22:43:07.799016 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 08 22:43:07 crc kubenswrapper[4739]: I1008 22:43:07.799323 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 08 22:43:07 crc kubenswrapper[4739]: I1008 22:43:07.815673 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 08 22:43:07 crc kubenswrapper[4739]: I1008 22:43:07.969618 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/def311ef-12ca-4d1a-972f-f2d72707a804-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " pod="openstack/tempest-tests-tempest" Oct 08 22:43:07 crc kubenswrapper[4739]: I1008 22:43:07.969671 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " pod="openstack/tempest-tests-tempest" Oct 08 22:43:07 crc kubenswrapper[4739]: I1008 22:43:07.969709 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/def311ef-12ca-4d1a-972f-f2d72707a804-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " pod="openstack/tempest-tests-tempest" Oct 08 22:43:07 crc kubenswrapper[4739]: I1008 22:43:07.969757 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/def311ef-12ca-4d1a-972f-f2d72707a804-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " pod="openstack/tempest-tests-tempest" Oct 08 22:43:07 crc kubenswrapper[4739]: I1008 22:43:07.969785 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/def311ef-12ca-4d1a-972f-f2d72707a804-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " pod="openstack/tempest-tests-tempest" Oct 08 22:43:07 crc kubenswrapper[4739]: I1008 22:43:07.969832 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/def311ef-12ca-4d1a-972f-f2d72707a804-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " pod="openstack/tempest-tests-tempest" Oct 08 22:43:07 crc kubenswrapper[4739]: I1008 22:43:07.969873 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/def311ef-12ca-4d1a-972f-f2d72707a804-config-data\") pod \"tempest-tests-tempest\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " pod="openstack/tempest-tests-tempest" Oct 08 22:43:07 crc kubenswrapper[4739]: I1008 22:43:07.969903 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvzqz\" (UniqueName: 
\"kubernetes.io/projected/def311ef-12ca-4d1a-972f-f2d72707a804-kube-api-access-vvzqz\") pod \"tempest-tests-tempest\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " pod="openstack/tempest-tests-tempest" Oct 08 22:43:07 crc kubenswrapper[4739]: I1008 22:43:07.969924 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/def311ef-12ca-4d1a-972f-f2d72707a804-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " pod="openstack/tempest-tests-tempest" Oct 08 22:43:08 crc kubenswrapper[4739]: I1008 22:43:08.071189 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/def311ef-12ca-4d1a-972f-f2d72707a804-config-data\") pod \"tempest-tests-tempest\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " pod="openstack/tempest-tests-tempest" Oct 08 22:43:08 crc kubenswrapper[4739]: I1008 22:43:08.071290 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvzqz\" (UniqueName: \"kubernetes.io/projected/def311ef-12ca-4d1a-972f-f2d72707a804-kube-api-access-vvzqz\") pod \"tempest-tests-tempest\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " pod="openstack/tempest-tests-tempest" Oct 08 22:43:08 crc kubenswrapper[4739]: I1008 22:43:08.071317 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/def311ef-12ca-4d1a-972f-f2d72707a804-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " pod="openstack/tempest-tests-tempest" Oct 08 22:43:08 crc kubenswrapper[4739]: I1008 22:43:08.071399 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/def311ef-12ca-4d1a-972f-f2d72707a804-test-operator-ephemeral-workdir\") pod 
\"tempest-tests-tempest\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " pod="openstack/tempest-tests-tempest" Oct 08 22:43:08 crc kubenswrapper[4739]: I1008 22:43:08.071422 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " pod="openstack/tempest-tests-tempest" Oct 08 22:43:08 crc kubenswrapper[4739]: I1008 22:43:08.071450 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/def311ef-12ca-4d1a-972f-f2d72707a804-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " pod="openstack/tempest-tests-tempest" Oct 08 22:43:08 crc kubenswrapper[4739]: I1008 22:43:08.071476 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/def311ef-12ca-4d1a-972f-f2d72707a804-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " pod="openstack/tempest-tests-tempest" Oct 08 22:43:08 crc kubenswrapper[4739]: I1008 22:43:08.071503 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/def311ef-12ca-4d1a-972f-f2d72707a804-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " pod="openstack/tempest-tests-tempest" Oct 08 22:43:08 crc kubenswrapper[4739]: I1008 22:43:08.071549 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/def311ef-12ca-4d1a-972f-f2d72707a804-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " 
pod="openstack/tempest-tests-tempest" Oct 08 22:43:08 crc kubenswrapper[4739]: I1008 22:43:08.072459 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/def311ef-12ca-4d1a-972f-f2d72707a804-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " pod="openstack/tempest-tests-tempest" Oct 08 22:43:08 crc kubenswrapper[4739]: I1008 22:43:08.072750 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/def311ef-12ca-4d1a-972f-f2d72707a804-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " pod="openstack/tempest-tests-tempest" Oct 08 22:43:08 crc kubenswrapper[4739]: I1008 22:43:08.073082 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/def311ef-12ca-4d1a-972f-f2d72707a804-config-data\") pod \"tempest-tests-tempest\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " pod="openstack/tempest-tests-tempest" Oct 08 22:43:08 crc kubenswrapper[4739]: I1008 22:43:08.073114 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/def311ef-12ca-4d1a-972f-f2d72707a804-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " pod="openstack/tempest-tests-tempest" Oct 08 22:43:08 crc kubenswrapper[4739]: I1008 22:43:08.074498 4739 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/tempest-tests-tempest" Oct 08 22:43:08 crc 
kubenswrapper[4739]: I1008 22:43:08.080024 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/def311ef-12ca-4d1a-972f-f2d72707a804-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " pod="openstack/tempest-tests-tempest" Oct 08 22:43:08 crc kubenswrapper[4739]: I1008 22:43:08.080206 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/def311ef-12ca-4d1a-972f-f2d72707a804-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " pod="openstack/tempest-tests-tempest" Oct 08 22:43:08 crc kubenswrapper[4739]: I1008 22:43:08.080353 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/def311ef-12ca-4d1a-972f-f2d72707a804-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " pod="openstack/tempest-tests-tempest" Oct 08 22:43:08 crc kubenswrapper[4739]: I1008 22:43:08.094408 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvzqz\" (UniqueName: \"kubernetes.io/projected/def311ef-12ca-4d1a-972f-f2d72707a804-kube-api-access-vvzqz\") pod \"tempest-tests-tempest\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " pod="openstack/tempest-tests-tempest" Oct 08 22:43:08 crc kubenswrapper[4739]: I1008 22:43:08.103537 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " pod="openstack/tempest-tests-tempest" Oct 08 22:43:08 crc kubenswrapper[4739]: I1008 22:43:08.182109 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 08 22:43:08 crc kubenswrapper[4739]: I1008 22:43:08.670407 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 08 22:43:08 crc kubenswrapper[4739]: I1008 22:43:08.762992 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"def311ef-12ca-4d1a-972f-f2d72707a804","Type":"ContainerStarted","Data":"5d6b460ba82ff1269957f7c235a2e1ad7cc0aa344f87ae99f9d0592ee612f039"} Oct 08 22:43:08 crc kubenswrapper[4739]: I1008 22:43:08.821524 4739 scope.go:117] "RemoveContainer" containerID="ddb1d63bba56e73b176d2b1e90888b358b74bba2bf444f71cf0db8fe5e95ed26" Oct 08 22:43:08 crc kubenswrapper[4739]: E1008 22:43:08.821889 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:43:16 crc kubenswrapper[4739]: I1008 22:43:16.414733 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rk9jz" Oct 08 22:43:16 crc kubenswrapper[4739]: I1008 22:43:16.463504 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rk9jz" Oct 08 22:43:16 crc kubenswrapper[4739]: I1008 22:43:16.647343 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rk9jz"] Oct 08 22:43:17 crc kubenswrapper[4739]: I1008 22:43:17.860030 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rk9jz" podUID="36001358-3fff-491b-8a12-3713fdb53147" 
containerName="registry-server" containerID="cri-o://67e1888757ad8796ec6a52879de67ca6d22f18cc2260b44027cfe34139315eec" gracePeriod=2 Oct 08 22:43:20 crc kubenswrapper[4739]: I1008 22:43:20.895368 4739 generic.go:334] "Generic (PLEG): container finished" podID="36001358-3fff-491b-8a12-3713fdb53147" containerID="67e1888757ad8796ec6a52879de67ca6d22f18cc2260b44027cfe34139315eec" exitCode=0 Oct 08 22:43:20 crc kubenswrapper[4739]: I1008 22:43:20.895441 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rk9jz" event={"ID":"36001358-3fff-491b-8a12-3713fdb53147","Type":"ContainerDied","Data":"67e1888757ad8796ec6a52879de67ca6d22f18cc2260b44027cfe34139315eec"} Oct 08 22:43:21 crc kubenswrapper[4739]: I1008 22:43:21.854879 4739 scope.go:117] "RemoveContainer" containerID="ddb1d63bba56e73b176d2b1e90888b358b74bba2bf444f71cf0db8fe5e95ed26" Oct 08 22:43:21 crc kubenswrapper[4739]: E1008 22:43:21.855713 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:43:26 crc kubenswrapper[4739]: E1008 22:43:26.359712 4739 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 67e1888757ad8796ec6a52879de67ca6d22f18cc2260b44027cfe34139315eec is running failed: container process not found" containerID="67e1888757ad8796ec6a52879de67ca6d22f18cc2260b44027cfe34139315eec" cmd=["grpc_health_probe","-addr=:50051"] Oct 08 22:43:26 crc kubenswrapper[4739]: E1008 22:43:26.361396 4739 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container 
is not created or running: checking if PID of 67e1888757ad8796ec6a52879de67ca6d22f18cc2260b44027cfe34139315eec is running failed: container process not found" containerID="67e1888757ad8796ec6a52879de67ca6d22f18cc2260b44027cfe34139315eec" cmd=["grpc_health_probe","-addr=:50051"] Oct 08 22:43:26 crc kubenswrapper[4739]: E1008 22:43:26.362011 4739 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 67e1888757ad8796ec6a52879de67ca6d22f18cc2260b44027cfe34139315eec is running failed: container process not found" containerID="67e1888757ad8796ec6a52879de67ca6d22f18cc2260b44027cfe34139315eec" cmd=["grpc_health_probe","-addr=:50051"] Oct 08 22:43:26 crc kubenswrapper[4739]: E1008 22:43:26.362082 4739 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 67e1888757ad8796ec6a52879de67ca6d22f18cc2260b44027cfe34139315eec is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-rk9jz" podUID="36001358-3fff-491b-8a12-3713fdb53147" containerName="registry-server" Oct 08 22:43:27 crc kubenswrapper[4739]: I1008 22:43:27.124398 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rk9jz" Oct 08 22:43:27 crc kubenswrapper[4739]: I1008 22:43:27.294797 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36001358-3fff-491b-8a12-3713fdb53147-utilities\") pod \"36001358-3fff-491b-8a12-3713fdb53147\" (UID: \"36001358-3fff-491b-8a12-3713fdb53147\") " Oct 08 22:43:27 crc kubenswrapper[4739]: I1008 22:43:27.294932 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36001358-3fff-491b-8a12-3713fdb53147-catalog-content\") pod \"36001358-3fff-491b-8a12-3713fdb53147\" (UID: \"36001358-3fff-491b-8a12-3713fdb53147\") " Oct 08 22:43:27 crc kubenswrapper[4739]: I1008 22:43:27.295231 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6g22\" (UniqueName: \"kubernetes.io/projected/36001358-3fff-491b-8a12-3713fdb53147-kube-api-access-z6g22\") pod \"36001358-3fff-491b-8a12-3713fdb53147\" (UID: \"36001358-3fff-491b-8a12-3713fdb53147\") " Oct 08 22:43:27 crc kubenswrapper[4739]: I1008 22:43:27.295671 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36001358-3fff-491b-8a12-3713fdb53147-utilities" (OuterVolumeSpecName: "utilities") pod "36001358-3fff-491b-8a12-3713fdb53147" (UID: "36001358-3fff-491b-8a12-3713fdb53147"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:43:27 crc kubenswrapper[4739]: I1008 22:43:27.296436 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36001358-3fff-491b-8a12-3713fdb53147-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:27 crc kubenswrapper[4739]: I1008 22:43:27.303359 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36001358-3fff-491b-8a12-3713fdb53147-kube-api-access-z6g22" (OuterVolumeSpecName: "kube-api-access-z6g22") pod "36001358-3fff-491b-8a12-3713fdb53147" (UID: "36001358-3fff-491b-8a12-3713fdb53147"). InnerVolumeSpecName "kube-api-access-z6g22". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:43:27 crc kubenswrapper[4739]: I1008 22:43:27.375257 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36001358-3fff-491b-8a12-3713fdb53147-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36001358-3fff-491b-8a12-3713fdb53147" (UID: "36001358-3fff-491b-8a12-3713fdb53147"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:43:27 crc kubenswrapper[4739]: I1008 22:43:27.398639 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36001358-3fff-491b-8a12-3713fdb53147-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:27 crc kubenswrapper[4739]: I1008 22:43:27.398666 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6g22\" (UniqueName: \"kubernetes.io/projected/36001358-3fff-491b-8a12-3713fdb53147-kube-api-access-z6g22\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:27 crc kubenswrapper[4739]: I1008 22:43:27.971320 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rk9jz" event={"ID":"36001358-3fff-491b-8a12-3713fdb53147","Type":"ContainerDied","Data":"aae21ea9e92d81a45b943864734f02eac6c24e67633decacf7ceec1452211728"} Oct 08 22:43:27 crc kubenswrapper[4739]: I1008 22:43:27.971836 4739 scope.go:117] "RemoveContainer" containerID="67e1888757ad8796ec6a52879de67ca6d22f18cc2260b44027cfe34139315eec" Oct 08 22:43:27 crc kubenswrapper[4739]: I1008 22:43:27.971385 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rk9jz" Oct 08 22:43:28 crc kubenswrapper[4739]: I1008 22:43:28.008776 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rk9jz"] Oct 08 22:43:28 crc kubenswrapper[4739]: I1008 22:43:28.021000 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rk9jz"] Oct 08 22:43:29 crc kubenswrapper[4739]: I1008 22:43:29.837279 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36001358-3fff-491b-8a12-3713fdb53147" path="/var/lib/kubelet/pods/36001358-3fff-491b-8a12-3713fdb53147/volumes" Oct 08 22:43:32 crc kubenswrapper[4739]: I1008 22:43:32.823283 4739 scope.go:117] "RemoveContainer" containerID="ddb1d63bba56e73b176d2b1e90888b358b74bba2bf444f71cf0db8fe5e95ed26" Oct 08 22:43:32 crc kubenswrapper[4739]: E1008 22:43:32.824416 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:43:40 crc kubenswrapper[4739]: I1008 22:43:40.142967 4739 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.050157623s: [/var/lib/containers/storage/overlay/8411d3aa3343f9aeacf3c075819f4e0e996a59f3b16945942fdf7d801d4d17ea/diff /var/log/pods/openstack_barbican-api-56dcfd46c8-rpb55_9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59/barbican-api/0.log]; will not log again for this container unless duration exceeds 2s Oct 08 22:43:44 crc kubenswrapper[4739]: I1008 22:43:44.821952 4739 scope.go:117] "RemoveContainer" containerID="ddb1d63bba56e73b176d2b1e90888b358b74bba2bf444f71cf0db8fe5e95ed26" Oct 08 22:43:44 crc kubenswrapper[4739]: 
E1008 22:43:44.823739 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:43:47 crc kubenswrapper[4739]: I1008 22:43:47.874009 4739 scope.go:117] "RemoveContainer" containerID="1ce4fb1c3ef8a5e09bb10b03a30af9924d45d53a47b00e9ac34b3ee3ba1ab0d7" Oct 08 22:43:47 crc kubenswrapper[4739]: I1008 22:43:47.912800 4739 scope.go:117] "RemoveContainer" containerID="def4fc1ed67dbd4dabd871a2537c9a45d64231778f5eccb1b7dbc184b16fd5f9" Oct 08 22:43:49 crc kubenswrapper[4739]: E1008 22:43:49.437265 4739 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Oct 08 22:43:49 crc kubenswrapper[4739]: E1008 22:43:49.437815 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vvzqz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(def311ef-12ca-4d1a-972f-f2d72707a804): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 22:43:49 crc kubenswrapper[4739]: E1008 22:43:49.439460 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="def311ef-12ca-4d1a-972f-f2d72707a804" Oct 08 22:43:50 crc kubenswrapper[4739]: E1008 22:43:50.214906 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="def311ef-12ca-4d1a-972f-f2d72707a804" Oct 08 22:43:55 crc 
kubenswrapper[4739]: I1008 22:43:55.822195 4739 scope.go:117] "RemoveContainer" containerID="ddb1d63bba56e73b176d2b1e90888b358b74bba2bf444f71cf0db8fe5e95ed26" Oct 08 22:43:55 crc kubenswrapper[4739]: E1008 22:43:55.823026 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:44:05 crc kubenswrapper[4739]: I1008 22:44:05.165700 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 08 22:44:06 crc kubenswrapper[4739]: I1008 22:44:06.821509 4739 scope.go:117] "RemoveContainer" containerID="ddb1d63bba56e73b176d2b1e90888b358b74bba2bf444f71cf0db8fe5e95ed26" Oct 08 22:44:06 crc kubenswrapper[4739]: E1008 22:44:06.822362 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:44:07 crc kubenswrapper[4739]: I1008 22:44:07.388758 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"def311ef-12ca-4d1a-972f-f2d72707a804","Type":"ContainerStarted","Data":"d02c22dbf8e77641d1602e15c7c48cff99006ded3832bda406ce974792343ee8"} Oct 08 22:44:07 crc kubenswrapper[4739]: I1008 22:44:07.410865 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" 
podStartSLOduration=4.926611409 podStartE2EDuration="1m1.410846331s" podCreationTimestamp="2025-10-08 22:43:06 +0000 UTC" firstStartedPulling="2025-10-08 22:43:08.678698805 +0000 UTC m=+3288.504084555" lastFinishedPulling="2025-10-08 22:44:05.162933707 +0000 UTC m=+3344.988319477" observedRunningTime="2025-10-08 22:44:07.404474705 +0000 UTC m=+3347.229860455" watchObservedRunningTime="2025-10-08 22:44:07.410846331 +0000 UTC m=+3347.236232081" Oct 08 22:44:19 crc kubenswrapper[4739]: I1008 22:44:19.822899 4739 scope.go:117] "RemoveContainer" containerID="ddb1d63bba56e73b176d2b1e90888b358b74bba2bf444f71cf0db8fe5e95ed26" Oct 08 22:44:19 crc kubenswrapper[4739]: E1008 22:44:19.823776 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:44:30 crc kubenswrapper[4739]: I1008 22:44:30.821860 4739 scope.go:117] "RemoveContainer" containerID="ddb1d63bba56e73b176d2b1e90888b358b74bba2bf444f71cf0db8fe5e95ed26" Oct 08 22:44:30 crc kubenswrapper[4739]: E1008 22:44:30.822670 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:44:32 crc kubenswrapper[4739]: I1008 22:44:32.249280 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s6fr8"] Oct 08 22:44:32 crc 
kubenswrapper[4739]: E1008 22:44:32.250816 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36001358-3fff-491b-8a12-3713fdb53147" containerName="registry-server" Oct 08 22:44:32 crc kubenswrapper[4739]: I1008 22:44:32.251196 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="36001358-3fff-491b-8a12-3713fdb53147" containerName="registry-server" Oct 08 22:44:32 crc kubenswrapper[4739]: E1008 22:44:32.251302 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36001358-3fff-491b-8a12-3713fdb53147" containerName="extract-content" Oct 08 22:44:32 crc kubenswrapper[4739]: I1008 22:44:32.251381 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="36001358-3fff-491b-8a12-3713fdb53147" containerName="extract-content" Oct 08 22:44:32 crc kubenswrapper[4739]: E1008 22:44:32.251478 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36001358-3fff-491b-8a12-3713fdb53147" containerName="extract-utilities" Oct 08 22:44:32 crc kubenswrapper[4739]: I1008 22:44:32.251545 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="36001358-3fff-491b-8a12-3713fdb53147" containerName="extract-utilities" Oct 08 22:44:32 crc kubenswrapper[4739]: I1008 22:44:32.251866 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="36001358-3fff-491b-8a12-3713fdb53147" containerName="registry-server" Oct 08 22:44:32 crc kubenswrapper[4739]: I1008 22:44:32.253896 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s6fr8" Oct 08 22:44:32 crc kubenswrapper[4739]: I1008 22:44:32.277877 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s6fr8"] Oct 08 22:44:32 crc kubenswrapper[4739]: I1008 22:44:32.361535 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8d5m\" (UniqueName: \"kubernetes.io/projected/3bad70ae-a444-4486-8dc8-9a76d4bf9442-kube-api-access-n8d5m\") pod \"certified-operators-s6fr8\" (UID: \"3bad70ae-a444-4486-8dc8-9a76d4bf9442\") " pod="openshift-marketplace/certified-operators-s6fr8" Oct 08 22:44:32 crc kubenswrapper[4739]: I1008 22:44:32.361609 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bad70ae-a444-4486-8dc8-9a76d4bf9442-utilities\") pod \"certified-operators-s6fr8\" (UID: \"3bad70ae-a444-4486-8dc8-9a76d4bf9442\") " pod="openshift-marketplace/certified-operators-s6fr8" Oct 08 22:44:32 crc kubenswrapper[4739]: I1008 22:44:32.361713 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bad70ae-a444-4486-8dc8-9a76d4bf9442-catalog-content\") pod \"certified-operators-s6fr8\" (UID: \"3bad70ae-a444-4486-8dc8-9a76d4bf9442\") " pod="openshift-marketplace/certified-operators-s6fr8" Oct 08 22:44:32 crc kubenswrapper[4739]: I1008 22:44:32.463682 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8d5m\" (UniqueName: \"kubernetes.io/projected/3bad70ae-a444-4486-8dc8-9a76d4bf9442-kube-api-access-n8d5m\") pod \"certified-operators-s6fr8\" (UID: \"3bad70ae-a444-4486-8dc8-9a76d4bf9442\") " pod="openshift-marketplace/certified-operators-s6fr8" Oct 08 22:44:32 crc kubenswrapper[4739]: I1008 22:44:32.463952 4739 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bad70ae-a444-4486-8dc8-9a76d4bf9442-utilities\") pod \"certified-operators-s6fr8\" (UID: \"3bad70ae-a444-4486-8dc8-9a76d4bf9442\") " pod="openshift-marketplace/certified-operators-s6fr8" Oct 08 22:44:32 crc kubenswrapper[4739]: I1008 22:44:32.464108 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bad70ae-a444-4486-8dc8-9a76d4bf9442-catalog-content\") pod \"certified-operators-s6fr8\" (UID: \"3bad70ae-a444-4486-8dc8-9a76d4bf9442\") " pod="openshift-marketplace/certified-operators-s6fr8" Oct 08 22:44:32 crc kubenswrapper[4739]: I1008 22:44:32.464687 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bad70ae-a444-4486-8dc8-9a76d4bf9442-catalog-content\") pod \"certified-operators-s6fr8\" (UID: \"3bad70ae-a444-4486-8dc8-9a76d4bf9442\") " pod="openshift-marketplace/certified-operators-s6fr8" Oct 08 22:44:32 crc kubenswrapper[4739]: I1008 22:44:32.464951 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bad70ae-a444-4486-8dc8-9a76d4bf9442-utilities\") pod \"certified-operators-s6fr8\" (UID: \"3bad70ae-a444-4486-8dc8-9a76d4bf9442\") " pod="openshift-marketplace/certified-operators-s6fr8" Oct 08 22:44:32 crc kubenswrapper[4739]: I1008 22:44:32.488421 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8d5m\" (UniqueName: \"kubernetes.io/projected/3bad70ae-a444-4486-8dc8-9a76d4bf9442-kube-api-access-n8d5m\") pod \"certified-operators-s6fr8\" (UID: \"3bad70ae-a444-4486-8dc8-9a76d4bf9442\") " pod="openshift-marketplace/certified-operators-s6fr8" Oct 08 22:44:32 crc kubenswrapper[4739]: I1008 22:44:32.577323 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s6fr8" Oct 08 22:44:33 crc kubenswrapper[4739]: I1008 22:44:33.429551 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s6fr8"] Oct 08 22:44:33 crc kubenswrapper[4739]: W1008 22:44:33.434322 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bad70ae_a444_4486_8dc8_9a76d4bf9442.slice/crio-452323825f1c6b4f816f1bbb5679aa3f90182d37a90fa02410cb5d38112cef7a WatchSource:0}: Error finding container 452323825f1c6b4f816f1bbb5679aa3f90182d37a90fa02410cb5d38112cef7a: Status 404 returned error can't find the container with id 452323825f1c6b4f816f1bbb5679aa3f90182d37a90fa02410cb5d38112cef7a Oct 08 22:44:33 crc kubenswrapper[4739]: I1008 22:44:33.674814 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s6fr8" event={"ID":"3bad70ae-a444-4486-8dc8-9a76d4bf9442","Type":"ContainerStarted","Data":"ff2dfb5793811d389be0c62ea8731f9d59ea2fa8e693ed33be5e76859e6693c5"} Oct 08 22:44:33 crc kubenswrapper[4739]: I1008 22:44:33.675063 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s6fr8" event={"ID":"3bad70ae-a444-4486-8dc8-9a76d4bf9442","Type":"ContainerStarted","Data":"452323825f1c6b4f816f1bbb5679aa3f90182d37a90fa02410cb5d38112cef7a"} Oct 08 22:44:34 crc kubenswrapper[4739]: I1008 22:44:34.686832 4739 generic.go:334] "Generic (PLEG): container finished" podID="3bad70ae-a444-4486-8dc8-9a76d4bf9442" containerID="ff2dfb5793811d389be0c62ea8731f9d59ea2fa8e693ed33be5e76859e6693c5" exitCode=0 Oct 08 22:44:34 crc kubenswrapper[4739]: I1008 22:44:34.686886 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s6fr8" 
event={"ID":"3bad70ae-a444-4486-8dc8-9a76d4bf9442","Type":"ContainerDied","Data":"ff2dfb5793811d389be0c62ea8731f9d59ea2fa8e693ed33be5e76859e6693c5"} Oct 08 22:44:34 crc kubenswrapper[4739]: I1008 22:44:34.686915 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s6fr8" event={"ID":"3bad70ae-a444-4486-8dc8-9a76d4bf9442","Type":"ContainerStarted","Data":"934a8bbb0e683462d8cec4101c8f3233e1ac3c51425524ae1a0d4ee107783d68"} Oct 08 22:44:35 crc kubenswrapper[4739]: I1008 22:44:35.698166 4739 generic.go:334] "Generic (PLEG): container finished" podID="3bad70ae-a444-4486-8dc8-9a76d4bf9442" containerID="934a8bbb0e683462d8cec4101c8f3233e1ac3c51425524ae1a0d4ee107783d68" exitCode=0 Oct 08 22:44:35 crc kubenswrapper[4739]: I1008 22:44:35.698235 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s6fr8" event={"ID":"3bad70ae-a444-4486-8dc8-9a76d4bf9442","Type":"ContainerDied","Data":"934a8bbb0e683462d8cec4101c8f3233e1ac3c51425524ae1a0d4ee107783d68"} Oct 08 22:44:37 crc kubenswrapper[4739]: I1008 22:44:37.729774 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s6fr8" event={"ID":"3bad70ae-a444-4486-8dc8-9a76d4bf9442","Type":"ContainerStarted","Data":"fb5f1c38c43b66e1a4fe81927d9d23db21bd1054b34dac8c2b91e0c77456c314"} Oct 08 22:44:37 crc kubenswrapper[4739]: I1008 22:44:37.756404 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s6fr8" podStartSLOduration=2.66060713 podStartE2EDuration="5.756387094s" podCreationTimestamp="2025-10-08 22:44:32 +0000 UTC" firstStartedPulling="2025-10-08 22:44:33.676962792 +0000 UTC m=+3373.502348542" lastFinishedPulling="2025-10-08 22:44:36.772742736 +0000 UTC m=+3376.598128506" observedRunningTime="2025-10-08 22:44:37.751171876 +0000 UTC m=+3377.576557626" watchObservedRunningTime="2025-10-08 22:44:37.756387094 +0000 UTC 
m=+3377.581772844" Oct 08 22:44:42 crc kubenswrapper[4739]: I1008 22:44:42.577956 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s6fr8" Oct 08 22:44:42 crc kubenswrapper[4739]: I1008 22:44:42.578485 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s6fr8" Oct 08 22:44:42 crc kubenswrapper[4739]: I1008 22:44:42.638188 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s6fr8" Oct 08 22:44:42 crc kubenswrapper[4739]: I1008 22:44:42.819342 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s6fr8" Oct 08 22:44:42 crc kubenswrapper[4739]: I1008 22:44:42.822183 4739 scope.go:117] "RemoveContainer" containerID="ddb1d63bba56e73b176d2b1e90888b358b74bba2bf444f71cf0db8fe5e95ed26" Oct 08 22:44:42 crc kubenswrapper[4739]: E1008 22:44:42.822505 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:44:42 crc kubenswrapper[4739]: I1008 22:44:42.876265 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s6fr8"] Oct 08 22:44:44 crc kubenswrapper[4739]: I1008 22:44:44.792854 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s6fr8" podUID="3bad70ae-a444-4486-8dc8-9a76d4bf9442" containerName="registry-server" containerID="cri-o://fb5f1c38c43b66e1a4fe81927d9d23db21bd1054b34dac8c2b91e0c77456c314" gracePeriod=2 Oct 
08 22:44:45 crc kubenswrapper[4739]: I1008 22:44:45.436017 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s6fr8" Oct 08 22:44:45 crc kubenswrapper[4739]: I1008 22:44:45.544567 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bad70ae-a444-4486-8dc8-9a76d4bf9442-catalog-content\") pod \"3bad70ae-a444-4486-8dc8-9a76d4bf9442\" (UID: \"3bad70ae-a444-4486-8dc8-9a76d4bf9442\") " Oct 08 22:44:45 crc kubenswrapper[4739]: I1008 22:44:45.544753 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bad70ae-a444-4486-8dc8-9a76d4bf9442-utilities\") pod \"3bad70ae-a444-4486-8dc8-9a76d4bf9442\" (UID: \"3bad70ae-a444-4486-8dc8-9a76d4bf9442\") " Oct 08 22:44:45 crc kubenswrapper[4739]: I1008 22:44:45.544850 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8d5m\" (UniqueName: \"kubernetes.io/projected/3bad70ae-a444-4486-8dc8-9a76d4bf9442-kube-api-access-n8d5m\") pod \"3bad70ae-a444-4486-8dc8-9a76d4bf9442\" (UID: \"3bad70ae-a444-4486-8dc8-9a76d4bf9442\") " Oct 08 22:44:45 crc kubenswrapper[4739]: I1008 22:44:45.545742 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bad70ae-a444-4486-8dc8-9a76d4bf9442-utilities" (OuterVolumeSpecName: "utilities") pod "3bad70ae-a444-4486-8dc8-9a76d4bf9442" (UID: "3bad70ae-a444-4486-8dc8-9a76d4bf9442"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:44:45 crc kubenswrapper[4739]: I1008 22:44:45.554828 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bad70ae-a444-4486-8dc8-9a76d4bf9442-kube-api-access-n8d5m" (OuterVolumeSpecName: "kube-api-access-n8d5m") pod "3bad70ae-a444-4486-8dc8-9a76d4bf9442" (UID: "3bad70ae-a444-4486-8dc8-9a76d4bf9442"). InnerVolumeSpecName "kube-api-access-n8d5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:44:45 crc kubenswrapper[4739]: I1008 22:44:45.599054 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bad70ae-a444-4486-8dc8-9a76d4bf9442-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3bad70ae-a444-4486-8dc8-9a76d4bf9442" (UID: "3bad70ae-a444-4486-8dc8-9a76d4bf9442"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:44:45 crc kubenswrapper[4739]: I1008 22:44:45.647136 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8d5m\" (UniqueName: \"kubernetes.io/projected/3bad70ae-a444-4486-8dc8-9a76d4bf9442-kube-api-access-n8d5m\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:45 crc kubenswrapper[4739]: I1008 22:44:45.647234 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bad70ae-a444-4486-8dc8-9a76d4bf9442-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:45 crc kubenswrapper[4739]: I1008 22:44:45.647243 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bad70ae-a444-4486-8dc8-9a76d4bf9442-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:45 crc kubenswrapper[4739]: I1008 22:44:45.805459 4739 generic.go:334] "Generic (PLEG): container finished" podID="3bad70ae-a444-4486-8dc8-9a76d4bf9442" 
containerID="fb5f1c38c43b66e1a4fe81927d9d23db21bd1054b34dac8c2b91e0c77456c314" exitCode=0 Oct 08 22:44:45 crc kubenswrapper[4739]: I1008 22:44:45.805525 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s6fr8" Oct 08 22:44:45 crc kubenswrapper[4739]: I1008 22:44:45.805522 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s6fr8" event={"ID":"3bad70ae-a444-4486-8dc8-9a76d4bf9442","Type":"ContainerDied","Data":"fb5f1c38c43b66e1a4fe81927d9d23db21bd1054b34dac8c2b91e0c77456c314"} Oct 08 22:44:45 crc kubenswrapper[4739]: I1008 22:44:45.805592 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s6fr8" event={"ID":"3bad70ae-a444-4486-8dc8-9a76d4bf9442","Type":"ContainerDied","Data":"452323825f1c6b4f816f1bbb5679aa3f90182d37a90fa02410cb5d38112cef7a"} Oct 08 22:44:45 crc kubenswrapper[4739]: I1008 22:44:45.805612 4739 scope.go:117] "RemoveContainer" containerID="fb5f1c38c43b66e1a4fe81927d9d23db21bd1054b34dac8c2b91e0c77456c314" Oct 08 22:44:45 crc kubenswrapper[4739]: I1008 22:44:45.828092 4739 scope.go:117] "RemoveContainer" containerID="934a8bbb0e683462d8cec4101c8f3233e1ac3c51425524ae1a0d4ee107783d68" Oct 08 22:44:45 crc kubenswrapper[4739]: I1008 22:44:45.863045 4739 scope.go:117] "RemoveContainer" containerID="ff2dfb5793811d389be0c62ea8731f9d59ea2fa8e693ed33be5e76859e6693c5" Oct 08 22:44:45 crc kubenswrapper[4739]: I1008 22:44:45.866646 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s6fr8"] Oct 08 22:44:45 crc kubenswrapper[4739]: I1008 22:44:45.879702 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s6fr8"] Oct 08 22:44:45 crc kubenswrapper[4739]: I1008 22:44:45.907560 4739 scope.go:117] "RemoveContainer" containerID="fb5f1c38c43b66e1a4fe81927d9d23db21bd1054b34dac8c2b91e0c77456c314" Oct 08 
22:44:45 crc kubenswrapper[4739]: E1008 22:44:45.908091 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb5f1c38c43b66e1a4fe81927d9d23db21bd1054b34dac8c2b91e0c77456c314\": container with ID starting with fb5f1c38c43b66e1a4fe81927d9d23db21bd1054b34dac8c2b91e0c77456c314 not found: ID does not exist" containerID="fb5f1c38c43b66e1a4fe81927d9d23db21bd1054b34dac8c2b91e0c77456c314" Oct 08 22:44:45 crc kubenswrapper[4739]: I1008 22:44:45.908180 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb5f1c38c43b66e1a4fe81927d9d23db21bd1054b34dac8c2b91e0c77456c314"} err="failed to get container status \"fb5f1c38c43b66e1a4fe81927d9d23db21bd1054b34dac8c2b91e0c77456c314\": rpc error: code = NotFound desc = could not find container \"fb5f1c38c43b66e1a4fe81927d9d23db21bd1054b34dac8c2b91e0c77456c314\": container with ID starting with fb5f1c38c43b66e1a4fe81927d9d23db21bd1054b34dac8c2b91e0c77456c314 not found: ID does not exist" Oct 08 22:44:45 crc kubenswrapper[4739]: I1008 22:44:45.908213 4739 scope.go:117] "RemoveContainer" containerID="934a8bbb0e683462d8cec4101c8f3233e1ac3c51425524ae1a0d4ee107783d68" Oct 08 22:44:45 crc kubenswrapper[4739]: E1008 22:44:45.908726 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"934a8bbb0e683462d8cec4101c8f3233e1ac3c51425524ae1a0d4ee107783d68\": container with ID starting with 934a8bbb0e683462d8cec4101c8f3233e1ac3c51425524ae1a0d4ee107783d68 not found: ID does not exist" containerID="934a8bbb0e683462d8cec4101c8f3233e1ac3c51425524ae1a0d4ee107783d68" Oct 08 22:44:45 crc kubenswrapper[4739]: I1008 22:44:45.908807 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"934a8bbb0e683462d8cec4101c8f3233e1ac3c51425524ae1a0d4ee107783d68"} err="failed to get container status 
\"934a8bbb0e683462d8cec4101c8f3233e1ac3c51425524ae1a0d4ee107783d68\": rpc error: code = NotFound desc = could not find container \"934a8bbb0e683462d8cec4101c8f3233e1ac3c51425524ae1a0d4ee107783d68\": container with ID starting with 934a8bbb0e683462d8cec4101c8f3233e1ac3c51425524ae1a0d4ee107783d68 not found: ID does not exist" Oct 08 22:44:45 crc kubenswrapper[4739]: I1008 22:44:45.908852 4739 scope.go:117] "RemoveContainer" containerID="ff2dfb5793811d389be0c62ea8731f9d59ea2fa8e693ed33be5e76859e6693c5" Oct 08 22:44:45 crc kubenswrapper[4739]: E1008 22:44:45.909362 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff2dfb5793811d389be0c62ea8731f9d59ea2fa8e693ed33be5e76859e6693c5\": container with ID starting with ff2dfb5793811d389be0c62ea8731f9d59ea2fa8e693ed33be5e76859e6693c5 not found: ID does not exist" containerID="ff2dfb5793811d389be0c62ea8731f9d59ea2fa8e693ed33be5e76859e6693c5" Oct 08 22:44:45 crc kubenswrapper[4739]: I1008 22:44:45.909495 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff2dfb5793811d389be0c62ea8731f9d59ea2fa8e693ed33be5e76859e6693c5"} err="failed to get container status \"ff2dfb5793811d389be0c62ea8731f9d59ea2fa8e693ed33be5e76859e6693c5\": rpc error: code = NotFound desc = could not find container \"ff2dfb5793811d389be0c62ea8731f9d59ea2fa8e693ed33be5e76859e6693c5\": container with ID starting with ff2dfb5793811d389be0c62ea8731f9d59ea2fa8e693ed33be5e76859e6693c5 not found: ID does not exist" Oct 08 22:44:47 crc kubenswrapper[4739]: I1008 22:44:47.844955 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bad70ae-a444-4486-8dc8-9a76d4bf9442" path="/var/lib/kubelet/pods/3bad70ae-a444-4486-8dc8-9a76d4bf9442/volumes" Oct 08 22:44:53 crc kubenswrapper[4739]: I1008 22:44:53.821598 4739 scope.go:117] "RemoveContainer" containerID="ddb1d63bba56e73b176d2b1e90888b358b74bba2bf444f71cf0db8fe5e95ed26" Oct 08 
22:44:53 crc kubenswrapper[4739]: E1008 22:44:53.822515 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:45:00 crc kubenswrapper[4739]: I1008 22:45:00.169952 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332725-sbrn5"] Oct 08 22:45:00 crc kubenswrapper[4739]: E1008 22:45:00.171086 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bad70ae-a444-4486-8dc8-9a76d4bf9442" containerName="registry-server" Oct 08 22:45:00 crc kubenswrapper[4739]: I1008 22:45:00.171106 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bad70ae-a444-4486-8dc8-9a76d4bf9442" containerName="registry-server" Oct 08 22:45:00 crc kubenswrapper[4739]: E1008 22:45:00.171131 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bad70ae-a444-4486-8dc8-9a76d4bf9442" containerName="extract-content" Oct 08 22:45:00 crc kubenswrapper[4739]: I1008 22:45:00.171139 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bad70ae-a444-4486-8dc8-9a76d4bf9442" containerName="extract-content" Oct 08 22:45:00 crc kubenswrapper[4739]: E1008 22:45:00.171202 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bad70ae-a444-4486-8dc8-9a76d4bf9442" containerName="extract-utilities" Oct 08 22:45:00 crc kubenswrapper[4739]: I1008 22:45:00.171214 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bad70ae-a444-4486-8dc8-9a76d4bf9442" containerName="extract-utilities" Oct 08 22:45:00 crc kubenswrapper[4739]: I1008 22:45:00.171478 4739 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3bad70ae-a444-4486-8dc8-9a76d4bf9442" containerName="registry-server" Oct 08 22:45:00 crc kubenswrapper[4739]: I1008 22:45:00.172444 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-sbrn5" Oct 08 22:45:00 crc kubenswrapper[4739]: I1008 22:45:00.178543 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 08 22:45:00 crc kubenswrapper[4739]: I1008 22:45:00.178555 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 08 22:45:00 crc kubenswrapper[4739]: I1008 22:45:00.183584 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332725-sbrn5"] Oct 08 22:45:00 crc kubenswrapper[4739]: I1008 22:45:00.245336 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3691e8f7-ec14-464b-92d9-104ed6ab2e28-config-volume\") pod \"collect-profiles-29332725-sbrn5\" (UID: \"3691e8f7-ec14-464b-92d9-104ed6ab2e28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-sbrn5" Oct 08 22:45:00 crc kubenswrapper[4739]: I1008 22:45:00.245405 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dktrz\" (UniqueName: \"kubernetes.io/projected/3691e8f7-ec14-464b-92d9-104ed6ab2e28-kube-api-access-dktrz\") pod \"collect-profiles-29332725-sbrn5\" (UID: \"3691e8f7-ec14-464b-92d9-104ed6ab2e28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-sbrn5" Oct 08 22:45:00 crc kubenswrapper[4739]: I1008 22:45:00.245500 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/3691e8f7-ec14-464b-92d9-104ed6ab2e28-secret-volume\") pod \"collect-profiles-29332725-sbrn5\" (UID: \"3691e8f7-ec14-464b-92d9-104ed6ab2e28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-sbrn5" Oct 08 22:45:00 crc kubenswrapper[4739]: I1008 22:45:00.348207 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3691e8f7-ec14-464b-92d9-104ed6ab2e28-config-volume\") pod \"collect-profiles-29332725-sbrn5\" (UID: \"3691e8f7-ec14-464b-92d9-104ed6ab2e28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-sbrn5" Oct 08 22:45:00 crc kubenswrapper[4739]: I1008 22:45:00.348608 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dktrz\" (UniqueName: \"kubernetes.io/projected/3691e8f7-ec14-464b-92d9-104ed6ab2e28-kube-api-access-dktrz\") pod \"collect-profiles-29332725-sbrn5\" (UID: \"3691e8f7-ec14-464b-92d9-104ed6ab2e28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-sbrn5" Oct 08 22:45:00 crc kubenswrapper[4739]: I1008 22:45:00.348696 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3691e8f7-ec14-464b-92d9-104ed6ab2e28-secret-volume\") pod \"collect-profiles-29332725-sbrn5\" (UID: \"3691e8f7-ec14-464b-92d9-104ed6ab2e28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-sbrn5" Oct 08 22:45:00 crc kubenswrapper[4739]: I1008 22:45:00.350806 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3691e8f7-ec14-464b-92d9-104ed6ab2e28-config-volume\") pod \"collect-profiles-29332725-sbrn5\" (UID: \"3691e8f7-ec14-464b-92d9-104ed6ab2e28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-sbrn5" Oct 08 22:45:00 crc kubenswrapper[4739]: I1008 22:45:00.363494 4739 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3691e8f7-ec14-464b-92d9-104ed6ab2e28-secret-volume\") pod \"collect-profiles-29332725-sbrn5\" (UID: \"3691e8f7-ec14-464b-92d9-104ed6ab2e28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-sbrn5" Oct 08 22:45:00 crc kubenswrapper[4739]: I1008 22:45:00.375899 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dktrz\" (UniqueName: \"kubernetes.io/projected/3691e8f7-ec14-464b-92d9-104ed6ab2e28-kube-api-access-dktrz\") pod \"collect-profiles-29332725-sbrn5\" (UID: \"3691e8f7-ec14-464b-92d9-104ed6ab2e28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-sbrn5" Oct 08 22:45:00 crc kubenswrapper[4739]: I1008 22:45:00.508036 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-sbrn5" Oct 08 22:45:01 crc kubenswrapper[4739]: I1008 22:45:01.021441 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332725-sbrn5"] Oct 08 22:45:01 crc kubenswrapper[4739]: I1008 22:45:01.973818 4739 generic.go:334] "Generic (PLEG): container finished" podID="3691e8f7-ec14-464b-92d9-104ed6ab2e28" containerID="c7e6cab3a1c317e380a528a46e0bc8d937318a64ab09eff85be00ff56b4fc736" exitCode=0 Oct 08 22:45:01 crc kubenswrapper[4739]: I1008 22:45:01.973885 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-sbrn5" event={"ID":"3691e8f7-ec14-464b-92d9-104ed6ab2e28","Type":"ContainerDied","Data":"c7e6cab3a1c317e380a528a46e0bc8d937318a64ab09eff85be00ff56b4fc736"} Oct 08 22:45:01 crc kubenswrapper[4739]: I1008 22:45:01.974103 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-sbrn5" 
event={"ID":"3691e8f7-ec14-464b-92d9-104ed6ab2e28","Type":"ContainerStarted","Data":"77c5cb96380c9dedcc12ed757954fed70bd4113e4ae5e5e5b29946803d50a5e7"} Oct 08 22:45:03 crc kubenswrapper[4739]: I1008 22:45:03.451363 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-sbrn5" Oct 08 22:45:03 crc kubenswrapper[4739]: I1008 22:45:03.516426 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3691e8f7-ec14-464b-92d9-104ed6ab2e28-secret-volume\") pod \"3691e8f7-ec14-464b-92d9-104ed6ab2e28\" (UID: \"3691e8f7-ec14-464b-92d9-104ed6ab2e28\") " Oct 08 22:45:03 crc kubenswrapper[4739]: I1008 22:45:03.516572 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3691e8f7-ec14-464b-92d9-104ed6ab2e28-config-volume\") pod \"3691e8f7-ec14-464b-92d9-104ed6ab2e28\" (UID: \"3691e8f7-ec14-464b-92d9-104ed6ab2e28\") " Oct 08 22:45:03 crc kubenswrapper[4739]: I1008 22:45:03.516598 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dktrz\" (UniqueName: \"kubernetes.io/projected/3691e8f7-ec14-464b-92d9-104ed6ab2e28-kube-api-access-dktrz\") pod \"3691e8f7-ec14-464b-92d9-104ed6ab2e28\" (UID: \"3691e8f7-ec14-464b-92d9-104ed6ab2e28\") " Oct 08 22:45:03 crc kubenswrapper[4739]: I1008 22:45:03.517383 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3691e8f7-ec14-464b-92d9-104ed6ab2e28-config-volume" (OuterVolumeSpecName: "config-volume") pod "3691e8f7-ec14-464b-92d9-104ed6ab2e28" (UID: "3691e8f7-ec14-464b-92d9-104ed6ab2e28"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:45:03 crc kubenswrapper[4739]: I1008 22:45:03.522192 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3691e8f7-ec14-464b-92d9-104ed6ab2e28-kube-api-access-dktrz" (OuterVolumeSpecName: "kube-api-access-dktrz") pod "3691e8f7-ec14-464b-92d9-104ed6ab2e28" (UID: "3691e8f7-ec14-464b-92d9-104ed6ab2e28"). InnerVolumeSpecName "kube-api-access-dktrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:45:03 crc kubenswrapper[4739]: I1008 22:45:03.531549 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3691e8f7-ec14-464b-92d9-104ed6ab2e28-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3691e8f7-ec14-464b-92d9-104ed6ab2e28" (UID: "3691e8f7-ec14-464b-92d9-104ed6ab2e28"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:03 crc kubenswrapper[4739]: I1008 22:45:03.619299 4739 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3691e8f7-ec14-464b-92d9-104ed6ab2e28-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:03 crc kubenswrapper[4739]: I1008 22:45:03.619333 4739 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3691e8f7-ec14-464b-92d9-104ed6ab2e28-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:03 crc kubenswrapper[4739]: I1008 22:45:03.619345 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dktrz\" (UniqueName: \"kubernetes.io/projected/3691e8f7-ec14-464b-92d9-104ed6ab2e28-kube-api-access-dktrz\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:03 crc kubenswrapper[4739]: I1008 22:45:03.996621 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-sbrn5" 
event={"ID":"3691e8f7-ec14-464b-92d9-104ed6ab2e28","Type":"ContainerDied","Data":"77c5cb96380c9dedcc12ed757954fed70bd4113e4ae5e5e5b29946803d50a5e7"} Oct 08 22:45:03 crc kubenswrapper[4739]: I1008 22:45:03.996668 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77c5cb96380c9dedcc12ed757954fed70bd4113e4ae5e5e5b29946803d50a5e7" Oct 08 22:45:03 crc kubenswrapper[4739]: I1008 22:45:03.996712 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-sbrn5" Oct 08 22:45:04 crc kubenswrapper[4739]: I1008 22:45:04.547203 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332680-8c8vl"] Oct 08 22:45:04 crc kubenswrapper[4739]: I1008 22:45:04.561209 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332680-8c8vl"] Oct 08 22:45:05 crc kubenswrapper[4739]: I1008 22:45:05.821709 4739 scope.go:117] "RemoveContainer" containerID="ddb1d63bba56e73b176d2b1e90888b358b74bba2bf444f71cf0db8fe5e95ed26" Oct 08 22:45:05 crc kubenswrapper[4739]: E1008 22:45:05.822424 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:45:05 crc kubenswrapper[4739]: I1008 22:45:05.848393 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f2991a8-b959-40eb-80d1-dc78dbe9767e" path="/var/lib/kubelet/pods/0f2991a8-b959-40eb-80d1-dc78dbe9767e/volumes" Oct 08 22:45:18 crc kubenswrapper[4739]: I1008 22:45:18.822087 4739 scope.go:117] "RemoveContainer" 
containerID="ddb1d63bba56e73b176d2b1e90888b358b74bba2bf444f71cf0db8fe5e95ed26" Oct 08 22:45:18 crc kubenswrapper[4739]: E1008 22:45:18.823007 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:45:26 crc kubenswrapper[4739]: I1008 22:45:26.698132 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vtx7l"] Oct 08 22:45:26 crc kubenswrapper[4739]: E1008 22:45:26.699038 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3691e8f7-ec14-464b-92d9-104ed6ab2e28" containerName="collect-profiles" Oct 08 22:45:26 crc kubenswrapper[4739]: I1008 22:45:26.699050 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="3691e8f7-ec14-464b-92d9-104ed6ab2e28" containerName="collect-profiles" Oct 08 22:45:26 crc kubenswrapper[4739]: I1008 22:45:26.699331 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="3691e8f7-ec14-464b-92d9-104ed6ab2e28" containerName="collect-profiles" Oct 08 22:45:26 crc kubenswrapper[4739]: I1008 22:45:26.700779 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vtx7l" Oct 08 22:45:26 crc kubenswrapper[4739]: I1008 22:45:26.717605 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vtx7l"] Oct 08 22:45:26 crc kubenswrapper[4739]: I1008 22:45:26.809847 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0df40e8c-0b70-48b2-84be-dec2ec7543cf-utilities\") pod \"redhat-marketplace-vtx7l\" (UID: \"0df40e8c-0b70-48b2-84be-dec2ec7543cf\") " pod="openshift-marketplace/redhat-marketplace-vtx7l" Oct 08 22:45:26 crc kubenswrapper[4739]: I1008 22:45:26.810270 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzhrv\" (UniqueName: \"kubernetes.io/projected/0df40e8c-0b70-48b2-84be-dec2ec7543cf-kube-api-access-zzhrv\") pod \"redhat-marketplace-vtx7l\" (UID: \"0df40e8c-0b70-48b2-84be-dec2ec7543cf\") " pod="openshift-marketplace/redhat-marketplace-vtx7l" Oct 08 22:45:26 crc kubenswrapper[4739]: I1008 22:45:26.810372 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0df40e8c-0b70-48b2-84be-dec2ec7543cf-catalog-content\") pod \"redhat-marketplace-vtx7l\" (UID: \"0df40e8c-0b70-48b2-84be-dec2ec7543cf\") " pod="openshift-marketplace/redhat-marketplace-vtx7l" Oct 08 22:45:26 crc kubenswrapper[4739]: I1008 22:45:26.912084 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzhrv\" (UniqueName: \"kubernetes.io/projected/0df40e8c-0b70-48b2-84be-dec2ec7543cf-kube-api-access-zzhrv\") pod \"redhat-marketplace-vtx7l\" (UID: \"0df40e8c-0b70-48b2-84be-dec2ec7543cf\") " pod="openshift-marketplace/redhat-marketplace-vtx7l" Oct 08 22:45:26 crc kubenswrapper[4739]: I1008 22:45:26.912628 4739 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0df40e8c-0b70-48b2-84be-dec2ec7543cf-catalog-content\") pod \"redhat-marketplace-vtx7l\" (UID: \"0df40e8c-0b70-48b2-84be-dec2ec7543cf\") " pod="openshift-marketplace/redhat-marketplace-vtx7l" Oct 08 22:45:26 crc kubenswrapper[4739]: I1008 22:45:26.913158 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0df40e8c-0b70-48b2-84be-dec2ec7543cf-utilities\") pod \"redhat-marketplace-vtx7l\" (UID: \"0df40e8c-0b70-48b2-84be-dec2ec7543cf\") " pod="openshift-marketplace/redhat-marketplace-vtx7l" Oct 08 22:45:26 crc kubenswrapper[4739]: I1008 22:45:26.913207 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0df40e8c-0b70-48b2-84be-dec2ec7543cf-catalog-content\") pod \"redhat-marketplace-vtx7l\" (UID: \"0df40e8c-0b70-48b2-84be-dec2ec7543cf\") " pod="openshift-marketplace/redhat-marketplace-vtx7l" Oct 08 22:45:26 crc kubenswrapper[4739]: I1008 22:45:26.913698 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0df40e8c-0b70-48b2-84be-dec2ec7543cf-utilities\") pod \"redhat-marketplace-vtx7l\" (UID: \"0df40e8c-0b70-48b2-84be-dec2ec7543cf\") " pod="openshift-marketplace/redhat-marketplace-vtx7l" Oct 08 22:45:26 crc kubenswrapper[4739]: I1008 22:45:26.931526 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzhrv\" (UniqueName: \"kubernetes.io/projected/0df40e8c-0b70-48b2-84be-dec2ec7543cf-kube-api-access-zzhrv\") pod \"redhat-marketplace-vtx7l\" (UID: \"0df40e8c-0b70-48b2-84be-dec2ec7543cf\") " pod="openshift-marketplace/redhat-marketplace-vtx7l" Oct 08 22:45:27 crc kubenswrapper[4739]: I1008 22:45:27.076792 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vtx7l" Oct 08 22:45:27 crc kubenswrapper[4739]: I1008 22:45:27.556599 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vtx7l"] Oct 08 22:45:28 crc kubenswrapper[4739]: I1008 22:45:28.252243 4739 generic.go:334] "Generic (PLEG): container finished" podID="0df40e8c-0b70-48b2-84be-dec2ec7543cf" containerID="e454af40b1eb1b262360939afcd39fee1542bf8943d926528c45b1b911c44ac5" exitCode=0 Oct 08 22:45:28 crc kubenswrapper[4739]: I1008 22:45:28.252347 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vtx7l" event={"ID":"0df40e8c-0b70-48b2-84be-dec2ec7543cf","Type":"ContainerDied","Data":"e454af40b1eb1b262360939afcd39fee1542bf8943d926528c45b1b911c44ac5"} Oct 08 22:45:28 crc kubenswrapper[4739]: I1008 22:45:28.252641 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vtx7l" event={"ID":"0df40e8c-0b70-48b2-84be-dec2ec7543cf","Type":"ContainerStarted","Data":"f673afb151d46460595d1e0b163f846e70477a4afffe7fcb468c64a43a9eaf00"} Oct 08 22:45:29 crc kubenswrapper[4739]: I1008 22:45:29.262651 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vtx7l" event={"ID":"0df40e8c-0b70-48b2-84be-dec2ec7543cf","Type":"ContainerStarted","Data":"4566b2368e2cd3c44b0c3cd177ea2b730fc1b3e0ee5339a9aa82d633ea3f716f"} Oct 08 22:45:30 crc kubenswrapper[4739]: I1008 22:45:30.273522 4739 generic.go:334] "Generic (PLEG): container finished" podID="0df40e8c-0b70-48b2-84be-dec2ec7543cf" containerID="4566b2368e2cd3c44b0c3cd177ea2b730fc1b3e0ee5339a9aa82d633ea3f716f" exitCode=0 Oct 08 22:45:30 crc kubenswrapper[4739]: I1008 22:45:30.273691 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vtx7l" 
event={"ID":"0df40e8c-0b70-48b2-84be-dec2ec7543cf","Type":"ContainerDied","Data":"4566b2368e2cd3c44b0c3cd177ea2b730fc1b3e0ee5339a9aa82d633ea3f716f"} Oct 08 22:45:31 crc kubenswrapper[4739]: I1008 22:45:31.284066 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vtx7l" event={"ID":"0df40e8c-0b70-48b2-84be-dec2ec7543cf","Type":"ContainerStarted","Data":"89c679e7fd0602a833aa112b2b865c7840201d5ae784808bba4ccdde0e8fca9c"} Oct 08 22:45:31 crc kubenswrapper[4739]: I1008 22:45:31.309427 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vtx7l" podStartSLOduration=2.722778946 podStartE2EDuration="5.309406906s" podCreationTimestamp="2025-10-08 22:45:26 +0000 UTC" firstStartedPulling="2025-10-08 22:45:28.254235368 +0000 UTC m=+3428.079621118" lastFinishedPulling="2025-10-08 22:45:30.840863308 +0000 UTC m=+3430.666249078" observedRunningTime="2025-10-08 22:45:31.303610583 +0000 UTC m=+3431.128996333" watchObservedRunningTime="2025-10-08 22:45:31.309406906 +0000 UTC m=+3431.134792656" Oct 08 22:45:31 crc kubenswrapper[4739]: I1008 22:45:31.834436 4739 scope.go:117] "RemoveContainer" containerID="ddb1d63bba56e73b176d2b1e90888b358b74bba2bf444f71cf0db8fe5e95ed26" Oct 08 22:45:32 crc kubenswrapper[4739]: I1008 22:45:32.306252 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerStarted","Data":"7de8ba5fe610ee36041a9b93b47373e73c6abc507c69b97a48a00864c74b8105"} Oct 08 22:45:34 crc kubenswrapper[4739]: I1008 22:45:34.073191 4739 scope.go:117] "RemoveContainer" containerID="c1a7959ac37280608c279723f6e84c36ad9d97c6d581dc068563647139d4a20a" Oct 08 22:45:37 crc kubenswrapper[4739]: I1008 22:45:37.077086 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vtx7l" Oct 
08 22:45:37 crc kubenswrapper[4739]: I1008 22:45:37.079441 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vtx7l" Oct 08 22:45:37 crc kubenswrapper[4739]: I1008 22:45:37.131904 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vtx7l" Oct 08 22:45:37 crc kubenswrapper[4739]: I1008 22:45:37.429102 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vtx7l" Oct 08 22:45:37 crc kubenswrapper[4739]: I1008 22:45:37.475325 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vtx7l"] Oct 08 22:45:39 crc kubenswrapper[4739]: I1008 22:45:39.386585 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vtx7l" podUID="0df40e8c-0b70-48b2-84be-dec2ec7543cf" containerName="registry-server" containerID="cri-o://89c679e7fd0602a833aa112b2b865c7840201d5ae784808bba4ccdde0e8fca9c" gracePeriod=2 Oct 08 22:45:40 crc kubenswrapper[4739]: I1008 22:45:40.236095 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vtx7l" Oct 08 22:45:40 crc kubenswrapper[4739]: I1008 22:45:40.237541 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0df40e8c-0b70-48b2-84be-dec2ec7543cf-utilities\") pod \"0df40e8c-0b70-48b2-84be-dec2ec7543cf\" (UID: \"0df40e8c-0b70-48b2-84be-dec2ec7543cf\") " Oct 08 22:45:40 crc kubenswrapper[4739]: I1008 22:45:40.237583 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzhrv\" (UniqueName: \"kubernetes.io/projected/0df40e8c-0b70-48b2-84be-dec2ec7543cf-kube-api-access-zzhrv\") pod \"0df40e8c-0b70-48b2-84be-dec2ec7543cf\" (UID: \"0df40e8c-0b70-48b2-84be-dec2ec7543cf\") " Oct 08 22:45:40 crc kubenswrapper[4739]: I1008 22:45:40.237681 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0df40e8c-0b70-48b2-84be-dec2ec7543cf-catalog-content\") pod \"0df40e8c-0b70-48b2-84be-dec2ec7543cf\" (UID: \"0df40e8c-0b70-48b2-84be-dec2ec7543cf\") " Oct 08 22:45:40 crc kubenswrapper[4739]: I1008 22:45:40.239042 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0df40e8c-0b70-48b2-84be-dec2ec7543cf-utilities" (OuterVolumeSpecName: "utilities") pod "0df40e8c-0b70-48b2-84be-dec2ec7543cf" (UID: "0df40e8c-0b70-48b2-84be-dec2ec7543cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:45:40 crc kubenswrapper[4739]: I1008 22:45:40.251367 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0df40e8c-0b70-48b2-84be-dec2ec7543cf-kube-api-access-zzhrv" (OuterVolumeSpecName: "kube-api-access-zzhrv") pod "0df40e8c-0b70-48b2-84be-dec2ec7543cf" (UID: "0df40e8c-0b70-48b2-84be-dec2ec7543cf"). InnerVolumeSpecName "kube-api-access-zzhrv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:45:40 crc kubenswrapper[4739]: I1008 22:45:40.255219 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0df40e8c-0b70-48b2-84be-dec2ec7543cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0df40e8c-0b70-48b2-84be-dec2ec7543cf" (UID: "0df40e8c-0b70-48b2-84be-dec2ec7543cf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:45:40 crc kubenswrapper[4739]: I1008 22:45:40.339797 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0df40e8c-0b70-48b2-84be-dec2ec7543cf-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:40 crc kubenswrapper[4739]: I1008 22:45:40.339842 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0df40e8c-0b70-48b2-84be-dec2ec7543cf-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:40 crc kubenswrapper[4739]: I1008 22:45:40.339860 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzhrv\" (UniqueName: \"kubernetes.io/projected/0df40e8c-0b70-48b2-84be-dec2ec7543cf-kube-api-access-zzhrv\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:40 crc kubenswrapper[4739]: I1008 22:45:40.399165 4739 generic.go:334] "Generic (PLEG): container finished" podID="0df40e8c-0b70-48b2-84be-dec2ec7543cf" containerID="89c679e7fd0602a833aa112b2b865c7840201d5ae784808bba4ccdde0e8fca9c" exitCode=0 Oct 08 22:45:40 crc kubenswrapper[4739]: I1008 22:45:40.399203 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vtx7l" event={"ID":"0df40e8c-0b70-48b2-84be-dec2ec7543cf","Type":"ContainerDied","Data":"89c679e7fd0602a833aa112b2b865c7840201d5ae784808bba4ccdde0e8fca9c"} Oct 08 22:45:40 crc kubenswrapper[4739]: I1008 22:45:40.399231 4739 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-vtx7l" event={"ID":"0df40e8c-0b70-48b2-84be-dec2ec7543cf","Type":"ContainerDied","Data":"f673afb151d46460595d1e0b163f846e70477a4afffe7fcb468c64a43a9eaf00"} Oct 08 22:45:40 crc kubenswrapper[4739]: I1008 22:45:40.399251 4739 scope.go:117] "RemoveContainer" containerID="89c679e7fd0602a833aa112b2b865c7840201d5ae784808bba4ccdde0e8fca9c" Oct 08 22:45:40 crc kubenswrapper[4739]: I1008 22:45:40.399393 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vtx7l" Oct 08 22:45:40 crc kubenswrapper[4739]: I1008 22:45:40.454007 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vtx7l"] Oct 08 22:45:40 crc kubenswrapper[4739]: I1008 22:45:40.458278 4739 scope.go:117] "RemoveContainer" containerID="4566b2368e2cd3c44b0c3cd177ea2b730fc1b3e0ee5339a9aa82d633ea3f716f" Oct 08 22:45:40 crc kubenswrapper[4739]: I1008 22:45:40.484059 4739 scope.go:117] "RemoveContainer" containerID="e454af40b1eb1b262360939afcd39fee1542bf8943d926528c45b1b911c44ac5" Oct 08 22:45:40 crc kubenswrapper[4739]: I1008 22:45:40.515558 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vtx7l"] Oct 08 22:45:40 crc kubenswrapper[4739]: I1008 22:45:40.552228 4739 scope.go:117] "RemoveContainer" containerID="89c679e7fd0602a833aa112b2b865c7840201d5ae784808bba4ccdde0e8fca9c" Oct 08 22:45:40 crc kubenswrapper[4739]: E1008 22:45:40.552665 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89c679e7fd0602a833aa112b2b865c7840201d5ae784808bba4ccdde0e8fca9c\": container with ID starting with 89c679e7fd0602a833aa112b2b865c7840201d5ae784808bba4ccdde0e8fca9c not found: ID does not exist" containerID="89c679e7fd0602a833aa112b2b865c7840201d5ae784808bba4ccdde0e8fca9c" Oct 08 22:45:40 crc kubenswrapper[4739]: I1008 22:45:40.552713 4739 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89c679e7fd0602a833aa112b2b865c7840201d5ae784808bba4ccdde0e8fca9c"} err="failed to get container status \"89c679e7fd0602a833aa112b2b865c7840201d5ae784808bba4ccdde0e8fca9c\": rpc error: code = NotFound desc = could not find container \"89c679e7fd0602a833aa112b2b865c7840201d5ae784808bba4ccdde0e8fca9c\": container with ID starting with 89c679e7fd0602a833aa112b2b865c7840201d5ae784808bba4ccdde0e8fca9c not found: ID does not exist" Oct 08 22:45:40 crc kubenswrapper[4739]: I1008 22:45:40.552745 4739 scope.go:117] "RemoveContainer" containerID="4566b2368e2cd3c44b0c3cd177ea2b730fc1b3e0ee5339a9aa82d633ea3f716f" Oct 08 22:45:40 crc kubenswrapper[4739]: E1008 22:45:40.553049 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4566b2368e2cd3c44b0c3cd177ea2b730fc1b3e0ee5339a9aa82d633ea3f716f\": container with ID starting with 4566b2368e2cd3c44b0c3cd177ea2b730fc1b3e0ee5339a9aa82d633ea3f716f not found: ID does not exist" containerID="4566b2368e2cd3c44b0c3cd177ea2b730fc1b3e0ee5339a9aa82d633ea3f716f" Oct 08 22:45:40 crc kubenswrapper[4739]: I1008 22:45:40.553082 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4566b2368e2cd3c44b0c3cd177ea2b730fc1b3e0ee5339a9aa82d633ea3f716f"} err="failed to get container status \"4566b2368e2cd3c44b0c3cd177ea2b730fc1b3e0ee5339a9aa82d633ea3f716f\": rpc error: code = NotFound desc = could not find container \"4566b2368e2cd3c44b0c3cd177ea2b730fc1b3e0ee5339a9aa82d633ea3f716f\": container with ID starting with 4566b2368e2cd3c44b0c3cd177ea2b730fc1b3e0ee5339a9aa82d633ea3f716f not found: ID does not exist" Oct 08 22:45:40 crc kubenswrapper[4739]: I1008 22:45:40.553103 4739 scope.go:117] "RemoveContainer" containerID="e454af40b1eb1b262360939afcd39fee1542bf8943d926528c45b1b911c44ac5" Oct 08 22:45:40 crc kubenswrapper[4739]: E1008 
22:45:40.556613 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e454af40b1eb1b262360939afcd39fee1542bf8943d926528c45b1b911c44ac5\": container with ID starting with e454af40b1eb1b262360939afcd39fee1542bf8943d926528c45b1b911c44ac5 not found: ID does not exist" containerID="e454af40b1eb1b262360939afcd39fee1542bf8943d926528c45b1b911c44ac5" Oct 08 22:45:40 crc kubenswrapper[4739]: I1008 22:45:40.556685 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e454af40b1eb1b262360939afcd39fee1542bf8943d926528c45b1b911c44ac5"} err="failed to get container status \"e454af40b1eb1b262360939afcd39fee1542bf8943d926528c45b1b911c44ac5\": rpc error: code = NotFound desc = could not find container \"e454af40b1eb1b262360939afcd39fee1542bf8943d926528c45b1b911c44ac5\": container with ID starting with e454af40b1eb1b262360939afcd39fee1542bf8943d926528c45b1b911c44ac5 not found: ID does not exist" Oct 08 22:45:41 crc kubenswrapper[4739]: I1008 22:45:41.832383 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0df40e8c-0b70-48b2-84be-dec2ec7543cf" path="/var/lib/kubelet/pods/0df40e8c-0b70-48b2-84be-dec2ec7543cf/volumes" Oct 08 22:47:51 crc kubenswrapper[4739]: I1008 22:47:51.766799 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:47:51 crc kubenswrapper[4739]: I1008 22:47:51.767344 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 08 22:48:21 crc kubenswrapper[4739]: I1008 22:48:21.766080 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:48:21 crc kubenswrapper[4739]: I1008 22:48:21.766720 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:48:45 crc kubenswrapper[4739]: I1008 22:48:45.407288 4739 generic.go:334] "Generic (PLEG): container finished" podID="def311ef-12ca-4d1a-972f-f2d72707a804" containerID="d02c22dbf8e77641d1602e15c7c48cff99006ded3832bda406ce974792343ee8" exitCode=0 Oct 08 22:48:45 crc kubenswrapper[4739]: I1008 22:48:45.407435 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"def311ef-12ca-4d1a-972f-f2d72707a804","Type":"ContainerDied","Data":"d02c22dbf8e77641d1602e15c7c48cff99006ded3832bda406ce974792343ee8"} Oct 08 22:48:46 crc kubenswrapper[4739]: I1008 22:48:46.963550 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 08 22:48:47 crc kubenswrapper[4739]: I1008 22:48:47.096058 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/def311ef-12ca-4d1a-972f-f2d72707a804-test-operator-ephemeral-workdir\") pod \"def311ef-12ca-4d1a-972f-f2d72707a804\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " Oct 08 22:48:47 crc kubenswrapper[4739]: I1008 22:48:47.096108 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/def311ef-12ca-4d1a-972f-f2d72707a804-ssh-key\") pod \"def311ef-12ca-4d1a-972f-f2d72707a804\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " Oct 08 22:48:47 crc kubenswrapper[4739]: I1008 22:48:47.096204 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/def311ef-12ca-4d1a-972f-f2d72707a804-config-data\") pod \"def311ef-12ca-4d1a-972f-f2d72707a804\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " Oct 08 22:48:47 crc kubenswrapper[4739]: I1008 22:48:47.096279 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/def311ef-12ca-4d1a-972f-f2d72707a804-ca-certs\") pod \"def311ef-12ca-4d1a-972f-f2d72707a804\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " Oct 08 22:48:47 crc kubenswrapper[4739]: I1008 22:48:47.096859 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvzqz\" (UniqueName: \"kubernetes.io/projected/def311ef-12ca-4d1a-972f-f2d72707a804-kube-api-access-vvzqz\") pod \"def311ef-12ca-4d1a-972f-f2d72707a804\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " Oct 08 22:48:47 crc kubenswrapper[4739]: I1008 22:48:47.096926 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/def311ef-12ca-4d1a-972f-f2d72707a804-openstack-config-secret\") pod \"def311ef-12ca-4d1a-972f-f2d72707a804\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " Oct 08 22:48:47 crc kubenswrapper[4739]: I1008 22:48:47.096958 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"def311ef-12ca-4d1a-972f-f2d72707a804\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " Oct 08 22:48:47 crc kubenswrapper[4739]: I1008 22:48:47.097075 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/def311ef-12ca-4d1a-972f-f2d72707a804-openstack-config\") pod \"def311ef-12ca-4d1a-972f-f2d72707a804\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " Oct 08 22:48:47 crc kubenswrapper[4739]: I1008 22:48:47.097104 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/def311ef-12ca-4d1a-972f-f2d72707a804-test-operator-ephemeral-temporary\") pod \"def311ef-12ca-4d1a-972f-f2d72707a804\" (UID: \"def311ef-12ca-4d1a-972f-f2d72707a804\") " Oct 08 22:48:47 crc kubenswrapper[4739]: I1008 22:48:47.099637 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/def311ef-12ca-4d1a-972f-f2d72707a804-config-data" (OuterVolumeSpecName: "config-data") pod "def311ef-12ca-4d1a-972f-f2d72707a804" (UID: "def311ef-12ca-4d1a-972f-f2d72707a804"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:48:47 crc kubenswrapper[4739]: I1008 22:48:47.099293 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/def311ef-12ca-4d1a-972f-f2d72707a804-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "def311ef-12ca-4d1a-972f-f2d72707a804" (UID: "def311ef-12ca-4d1a-972f-f2d72707a804"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:48:47 crc kubenswrapper[4739]: I1008 22:48:47.102260 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "def311ef-12ca-4d1a-972f-f2d72707a804" (UID: "def311ef-12ca-4d1a-972f-f2d72707a804"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 22:48:47 crc kubenswrapper[4739]: I1008 22:48:47.115413 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/def311ef-12ca-4d1a-972f-f2d72707a804-kube-api-access-vvzqz" (OuterVolumeSpecName: "kube-api-access-vvzqz") pod "def311ef-12ca-4d1a-972f-f2d72707a804" (UID: "def311ef-12ca-4d1a-972f-f2d72707a804"). InnerVolumeSpecName "kube-api-access-vvzqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:48:47 crc kubenswrapper[4739]: I1008 22:48:47.124600 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/def311ef-12ca-4d1a-972f-f2d72707a804-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "def311ef-12ca-4d1a-972f-f2d72707a804" (UID: "def311ef-12ca-4d1a-972f-f2d72707a804"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:48:47 crc kubenswrapper[4739]: I1008 22:48:47.158605 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/def311ef-12ca-4d1a-972f-f2d72707a804-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "def311ef-12ca-4d1a-972f-f2d72707a804" (UID: "def311ef-12ca-4d1a-972f-f2d72707a804"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:48:47 crc kubenswrapper[4739]: I1008 22:48:47.164897 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/def311ef-12ca-4d1a-972f-f2d72707a804-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "def311ef-12ca-4d1a-972f-f2d72707a804" (UID: "def311ef-12ca-4d1a-972f-f2d72707a804"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:48:47 crc kubenswrapper[4739]: I1008 22:48:47.173475 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/def311ef-12ca-4d1a-972f-f2d72707a804-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "def311ef-12ca-4d1a-972f-f2d72707a804" (UID: "def311ef-12ca-4d1a-972f-f2d72707a804"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:48:47 crc kubenswrapper[4739]: I1008 22:48:47.199990 4739 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/def311ef-12ca-4d1a-972f-f2d72707a804-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:48:47 crc kubenswrapper[4739]: I1008 22:48:47.200046 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvzqz\" (UniqueName: \"kubernetes.io/projected/def311ef-12ca-4d1a-972f-f2d72707a804-kube-api-access-vvzqz\") on node \"crc\" DevicePath \"\"" Oct 08 22:48:47 crc kubenswrapper[4739]: I1008 22:48:47.200061 4739 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/def311ef-12ca-4d1a-972f-f2d72707a804-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 08 22:48:47 crc kubenswrapper[4739]: I1008 22:48:47.200103 4739 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 08 22:48:47 crc kubenswrapper[4739]: I1008 22:48:47.200119 4739 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/def311ef-12ca-4d1a-972f-f2d72707a804-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:48:47 crc kubenswrapper[4739]: I1008 22:48:47.200132 4739 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/def311ef-12ca-4d1a-972f-f2d72707a804-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 08 22:48:47 crc kubenswrapper[4739]: I1008 22:48:47.200160 4739 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/def311ef-12ca-4d1a-972f-f2d72707a804-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 08 22:48:47 crc kubenswrapper[4739]: 
I1008 22:48:47.200172 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/def311ef-12ca-4d1a-972f-f2d72707a804-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:48:47 crc kubenswrapper[4739]: I1008 22:48:47.224534 4739 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 08 22:48:47 crc kubenswrapper[4739]: I1008 22:48:47.302250 4739 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 08 22:48:47 crc kubenswrapper[4739]: I1008 22:48:47.447368 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"def311ef-12ca-4d1a-972f-f2d72707a804","Type":"ContainerDied","Data":"5d6b460ba82ff1269957f7c235a2e1ad7cc0aa344f87ae99f9d0592ee612f039"} Oct 08 22:48:47 crc kubenswrapper[4739]: I1008 22:48:47.447412 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d6b460ba82ff1269957f7c235a2e1ad7cc0aa344f87ae99f9d0592ee612f039" Oct 08 22:48:47 crc kubenswrapper[4739]: I1008 22:48:47.447492 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 08 22:48:47 crc kubenswrapper[4739]: I1008 22:48:47.588024 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/def311ef-12ca-4d1a-972f-f2d72707a804-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "def311ef-12ca-4d1a-972f-f2d72707a804" (UID: "def311ef-12ca-4d1a-972f-f2d72707a804"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:48:47 crc kubenswrapper[4739]: I1008 22:48:47.611538 4739 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/def311ef-12ca-4d1a-972f-f2d72707a804-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 08 22:48:51 crc kubenswrapper[4739]: I1008 22:48:51.766553 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:48:51 crc kubenswrapper[4739]: I1008 22:48:51.767406 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:48:51 crc kubenswrapper[4739]: I1008 22:48:51.767485 4739 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" Oct 08 22:48:51 crc kubenswrapper[4739]: I1008 22:48:51.768704 4739 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7de8ba5fe610ee36041a9b93b47373e73c6abc507c69b97a48a00864c74b8105"} pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 22:48:51 crc kubenswrapper[4739]: I1008 22:48:51.768814 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" 
containerName="machine-config-daemon" containerID="cri-o://7de8ba5fe610ee36041a9b93b47373e73c6abc507c69b97a48a00864c74b8105" gracePeriod=600 Oct 08 22:48:52 crc kubenswrapper[4739]: I1008 22:48:52.500696 4739 generic.go:334] "Generic (PLEG): container finished" podID="9707b708-016c-4e06-86db-0332e2ca37db" containerID="7de8ba5fe610ee36041a9b93b47373e73c6abc507c69b97a48a00864c74b8105" exitCode=0 Oct 08 22:48:52 crc kubenswrapper[4739]: I1008 22:48:52.500752 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerDied","Data":"7de8ba5fe610ee36041a9b93b47373e73c6abc507c69b97a48a00864c74b8105"} Oct 08 22:48:52 crc kubenswrapper[4739]: I1008 22:48:52.501305 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerStarted","Data":"798afc98cb7bcde18fde2f74c09601ba5107495b2a2a7e64ab302b80311e0904"} Oct 08 22:48:52 crc kubenswrapper[4739]: I1008 22:48:52.501328 4739 scope.go:117] "RemoveContainer" containerID="ddb1d63bba56e73b176d2b1e90888b358b74bba2bf444f71cf0db8fe5e95ed26" Oct 08 22:48:53 crc kubenswrapper[4739]: I1008 22:48:53.598138 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 08 22:48:53 crc kubenswrapper[4739]: E1008 22:48:53.599229 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0df40e8c-0b70-48b2-84be-dec2ec7543cf" containerName="extract-content" Oct 08 22:48:53 crc kubenswrapper[4739]: I1008 22:48:53.599252 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="0df40e8c-0b70-48b2-84be-dec2ec7543cf" containerName="extract-content" Oct 08 22:48:53 crc kubenswrapper[4739]: E1008 22:48:53.599314 4739 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0df40e8c-0b70-48b2-84be-dec2ec7543cf" containerName="extract-utilities" Oct 08 22:48:53 crc kubenswrapper[4739]: I1008 22:48:53.599329 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="0df40e8c-0b70-48b2-84be-dec2ec7543cf" containerName="extract-utilities" Oct 08 22:48:53 crc kubenswrapper[4739]: E1008 22:48:53.599365 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0df40e8c-0b70-48b2-84be-dec2ec7543cf" containerName="registry-server" Oct 08 22:48:53 crc kubenswrapper[4739]: I1008 22:48:53.599378 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="0df40e8c-0b70-48b2-84be-dec2ec7543cf" containerName="registry-server" Oct 08 22:48:53 crc kubenswrapper[4739]: E1008 22:48:53.599410 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="def311ef-12ca-4d1a-972f-f2d72707a804" containerName="tempest-tests-tempest-tests-runner" Oct 08 22:48:53 crc kubenswrapper[4739]: I1008 22:48:53.599423 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="def311ef-12ca-4d1a-972f-f2d72707a804" containerName="tempest-tests-tempest-tests-runner" Oct 08 22:48:53 crc kubenswrapper[4739]: I1008 22:48:53.599797 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="0df40e8c-0b70-48b2-84be-dec2ec7543cf" containerName="registry-server" Oct 08 22:48:53 crc kubenswrapper[4739]: I1008 22:48:53.599820 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="def311ef-12ca-4d1a-972f-f2d72707a804" containerName="tempest-tests-tempest-tests-runner" Oct 08 22:48:53 crc kubenswrapper[4739]: I1008 22:48:53.601080 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 08 22:48:53 crc kubenswrapper[4739]: I1008 22:48:53.604888 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-db7xc" Oct 08 22:48:53 crc kubenswrapper[4739]: I1008 22:48:53.642798 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"cd1babc0-3114-448d-b296-2c2680e08553\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 08 22:48:53 crc kubenswrapper[4739]: I1008 22:48:53.642930 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcw87\" (UniqueName: \"kubernetes.io/projected/cd1babc0-3114-448d-b296-2c2680e08553-kube-api-access-mcw87\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"cd1babc0-3114-448d-b296-2c2680e08553\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 08 22:48:53 crc kubenswrapper[4739]: I1008 22:48:53.643314 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 08 22:48:53 crc kubenswrapper[4739]: I1008 22:48:53.744187 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcw87\" (UniqueName: \"kubernetes.io/projected/cd1babc0-3114-448d-b296-2c2680e08553-kube-api-access-mcw87\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"cd1babc0-3114-448d-b296-2c2680e08553\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 08 22:48:53 crc kubenswrapper[4739]: I1008 22:48:53.744306 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"cd1babc0-3114-448d-b296-2c2680e08553\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 08 22:48:53 crc kubenswrapper[4739]: I1008 22:48:53.744775 4739 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"cd1babc0-3114-448d-b296-2c2680e08553\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 08 22:48:53 crc kubenswrapper[4739]: I1008 22:48:53.766382 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcw87\" (UniqueName: \"kubernetes.io/projected/cd1babc0-3114-448d-b296-2c2680e08553-kube-api-access-mcw87\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"cd1babc0-3114-448d-b296-2c2680e08553\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 08 22:48:53 crc kubenswrapper[4739]: I1008 22:48:53.773695 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"cd1babc0-3114-448d-b296-2c2680e08553\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 08 22:48:53 crc kubenswrapper[4739]: I1008 22:48:53.943213 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 08 22:48:54 crc kubenswrapper[4739]: I1008 22:48:54.450280 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 08 22:48:54 crc kubenswrapper[4739]: I1008 22:48:54.455867 4739 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 22:48:54 crc kubenswrapper[4739]: I1008 22:48:54.525571 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"cd1babc0-3114-448d-b296-2c2680e08553","Type":"ContainerStarted","Data":"6d58192b550a5e30f3e1042f34159e2ea45e37a1f8e14172e89ac1b71d2120ca"} Oct 08 22:48:56 crc kubenswrapper[4739]: I1008 22:48:56.562880 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"cd1babc0-3114-448d-b296-2c2680e08553","Type":"ContainerStarted","Data":"736e0a3f4438705bab19ae7973b21aba003a99f12f9d98783a4bc9ada184acc3"} Oct 08 22:48:56 crc kubenswrapper[4739]: I1008 22:48:56.588789 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.532094828 podStartE2EDuration="3.58876328s" podCreationTimestamp="2025-10-08 22:48:53 +0000 UTC" firstStartedPulling="2025-10-08 22:48:54.455645478 +0000 UTC m=+3634.281031228" lastFinishedPulling="2025-10-08 22:48:55.51231392 +0000 UTC m=+3635.337699680" observedRunningTime="2025-10-08 22:48:56.57736118 +0000 UTC m=+3636.402746940" watchObservedRunningTime="2025-10-08 22:48:56.58876328 +0000 UTC m=+3636.414149040" Oct 08 22:49:14 crc kubenswrapper[4739]: I1008 22:49:14.288696 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7pnz9/must-gather-xcjps"] Oct 08 22:49:14 crc kubenswrapper[4739]: I1008 
22:49:14.294663 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7pnz9/must-gather-xcjps" Oct 08 22:49:14 crc kubenswrapper[4739]: I1008 22:49:14.304222 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-7pnz9"/"default-dockercfg-qtfxs" Oct 08 22:49:14 crc kubenswrapper[4739]: I1008 22:49:14.304518 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7pnz9"/"kube-root-ca.crt" Oct 08 22:49:14 crc kubenswrapper[4739]: I1008 22:49:14.306062 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7pnz9"/"openshift-service-ca.crt" Oct 08 22:49:14 crc kubenswrapper[4739]: I1008 22:49:14.328432 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7pnz9/must-gather-xcjps"] Oct 08 22:49:14 crc kubenswrapper[4739]: I1008 22:49:14.401770 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2w7n\" (UniqueName: \"kubernetes.io/projected/3640a123-c313-4e96-a7df-5d38f7fd34f3-kube-api-access-c2w7n\") pod \"must-gather-xcjps\" (UID: \"3640a123-c313-4e96-a7df-5d38f7fd34f3\") " pod="openshift-must-gather-7pnz9/must-gather-xcjps" Oct 08 22:49:14 crc kubenswrapper[4739]: I1008 22:49:14.401941 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3640a123-c313-4e96-a7df-5d38f7fd34f3-must-gather-output\") pod \"must-gather-xcjps\" (UID: \"3640a123-c313-4e96-a7df-5d38f7fd34f3\") " pod="openshift-must-gather-7pnz9/must-gather-xcjps" Oct 08 22:49:14 crc kubenswrapper[4739]: I1008 22:49:14.503871 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3640a123-c313-4e96-a7df-5d38f7fd34f3-must-gather-output\") pod \"must-gather-xcjps\" (UID: 
\"3640a123-c313-4e96-a7df-5d38f7fd34f3\") " pod="openshift-must-gather-7pnz9/must-gather-xcjps" Oct 08 22:49:14 crc kubenswrapper[4739]: I1008 22:49:14.504012 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2w7n\" (UniqueName: \"kubernetes.io/projected/3640a123-c313-4e96-a7df-5d38f7fd34f3-kube-api-access-c2w7n\") pod \"must-gather-xcjps\" (UID: \"3640a123-c313-4e96-a7df-5d38f7fd34f3\") " pod="openshift-must-gather-7pnz9/must-gather-xcjps" Oct 08 22:49:14 crc kubenswrapper[4739]: I1008 22:49:14.504924 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3640a123-c313-4e96-a7df-5d38f7fd34f3-must-gather-output\") pod \"must-gather-xcjps\" (UID: \"3640a123-c313-4e96-a7df-5d38f7fd34f3\") " pod="openshift-must-gather-7pnz9/must-gather-xcjps" Oct 08 22:49:14 crc kubenswrapper[4739]: I1008 22:49:14.521950 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2w7n\" (UniqueName: \"kubernetes.io/projected/3640a123-c313-4e96-a7df-5d38f7fd34f3-kube-api-access-c2w7n\") pod \"must-gather-xcjps\" (UID: \"3640a123-c313-4e96-a7df-5d38f7fd34f3\") " pod="openshift-must-gather-7pnz9/must-gather-xcjps" Oct 08 22:49:14 crc kubenswrapper[4739]: I1008 22:49:14.626721 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7pnz9/must-gather-xcjps" Oct 08 22:49:15 crc kubenswrapper[4739]: I1008 22:49:15.150385 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7pnz9/must-gather-xcjps"] Oct 08 22:49:15 crc kubenswrapper[4739]: I1008 22:49:15.781629 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pnz9/must-gather-xcjps" event={"ID":"3640a123-c313-4e96-a7df-5d38f7fd34f3","Type":"ContainerStarted","Data":"e318aa78fde1fde6d3925a573849d0b0e994e2064d2a06430a459794968d10a1"} Oct 08 22:49:20 crc kubenswrapper[4739]: I1008 22:49:20.842172 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pnz9/must-gather-xcjps" event={"ID":"3640a123-c313-4e96-a7df-5d38f7fd34f3","Type":"ContainerStarted","Data":"dfaa773969f741a2671b519ad58d8b624fd4e49ec993e21ddac9e50b46be6ef0"} Oct 08 22:49:20 crc kubenswrapper[4739]: I1008 22:49:20.842652 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pnz9/must-gather-xcjps" event={"ID":"3640a123-c313-4e96-a7df-5d38f7fd34f3","Type":"ContainerStarted","Data":"23a52af77751f657e141c4b042fc211145d3a7d38a7b36eca4be2cf92349085a"} Oct 08 22:49:20 crc kubenswrapper[4739]: I1008 22:49:20.863498 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7pnz9/must-gather-xcjps" podStartSLOduration=2.187034094 podStartE2EDuration="6.863475821s" podCreationTimestamp="2025-10-08 22:49:14 +0000 UTC" firstStartedPulling="2025-10-08 22:49:15.159609931 +0000 UTC m=+3654.984995671" lastFinishedPulling="2025-10-08 22:49:19.836051648 +0000 UTC m=+3659.661437398" observedRunningTime="2025-10-08 22:49:20.859606487 +0000 UTC m=+3660.684992247" watchObservedRunningTime="2025-10-08 22:49:20.863475821 +0000 UTC m=+3660.688861571" Oct 08 22:49:25 crc kubenswrapper[4739]: I1008 22:49:25.230407 4739 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-7pnz9/crc-debug-cdxs4"] Oct 08 22:49:25 crc kubenswrapper[4739]: I1008 22:49:25.233311 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7pnz9/crc-debug-cdxs4" Oct 08 22:49:25 crc kubenswrapper[4739]: I1008 22:49:25.363330 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/80cbe76f-63a1-4c25-8f24-78fc747291a4-host\") pod \"crc-debug-cdxs4\" (UID: \"80cbe76f-63a1-4c25-8f24-78fc747291a4\") " pod="openshift-must-gather-7pnz9/crc-debug-cdxs4" Oct 08 22:49:25 crc kubenswrapper[4739]: I1008 22:49:25.363542 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xbf9\" (UniqueName: \"kubernetes.io/projected/80cbe76f-63a1-4c25-8f24-78fc747291a4-kube-api-access-2xbf9\") pod \"crc-debug-cdxs4\" (UID: \"80cbe76f-63a1-4c25-8f24-78fc747291a4\") " pod="openshift-must-gather-7pnz9/crc-debug-cdxs4" Oct 08 22:49:25 crc kubenswrapper[4739]: I1008 22:49:25.465035 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/80cbe76f-63a1-4c25-8f24-78fc747291a4-host\") pod \"crc-debug-cdxs4\" (UID: \"80cbe76f-63a1-4c25-8f24-78fc747291a4\") " pod="openshift-must-gather-7pnz9/crc-debug-cdxs4" Oct 08 22:49:25 crc kubenswrapper[4739]: I1008 22:49:25.465119 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/80cbe76f-63a1-4c25-8f24-78fc747291a4-host\") pod \"crc-debug-cdxs4\" (UID: \"80cbe76f-63a1-4c25-8f24-78fc747291a4\") " pod="openshift-must-gather-7pnz9/crc-debug-cdxs4" Oct 08 22:49:25 crc kubenswrapper[4739]: I1008 22:49:25.466258 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xbf9\" (UniqueName: 
\"kubernetes.io/projected/80cbe76f-63a1-4c25-8f24-78fc747291a4-kube-api-access-2xbf9\") pod \"crc-debug-cdxs4\" (UID: \"80cbe76f-63a1-4c25-8f24-78fc747291a4\") " pod="openshift-must-gather-7pnz9/crc-debug-cdxs4" Oct 08 22:49:25 crc kubenswrapper[4739]: I1008 22:49:25.486090 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xbf9\" (UniqueName: \"kubernetes.io/projected/80cbe76f-63a1-4c25-8f24-78fc747291a4-kube-api-access-2xbf9\") pod \"crc-debug-cdxs4\" (UID: \"80cbe76f-63a1-4c25-8f24-78fc747291a4\") " pod="openshift-must-gather-7pnz9/crc-debug-cdxs4" Oct 08 22:49:25 crc kubenswrapper[4739]: I1008 22:49:25.554523 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7pnz9/crc-debug-cdxs4" Oct 08 22:49:25 crc kubenswrapper[4739]: I1008 22:49:25.899525 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pnz9/crc-debug-cdxs4" event={"ID":"80cbe76f-63a1-4c25-8f24-78fc747291a4","Type":"ContainerStarted","Data":"8ec10295a7068c7e74e50c27aae065bf7a874896325285671a4c4e683b50f85a"} Oct 08 22:49:43 crc kubenswrapper[4739]: E1008 22:49:43.071614 4739 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296" Oct 08 22:49:43 crc kubenswrapper[4739]: E1008 22:49:43.072543 4739 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:container-00,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296,Command:[chroot /host bash -c echo 'TOOLBOX_NAME=toolbox-osp' > /root/.toolboxrc ; rm -rf \"/var/tmp/sos-osp\" && mkdir -p \"/var/tmp/sos-osp\" && sudo podman rm --force toolbox-osp; sudo --preserve-env podman pull --authfile /var/lib/kubelet/config.json 
registry.redhat.io/rhel9/support-tools && toolbox sos report --batch --all-logs --only-plugins block,cifs,crio,devicemapper,devices,firewall_tables,firewalld,iscsi,lvm2,memory,multipath,nfs,nis,nvme,podman,process,processor,selinux,scsi,udev,logs,crypto --tmp-dir=\"/var/tmp/sos-osp\" && if [[ \"$(ls /var/log/pods/*/{*.log.*,*/*.log.*} 2>/dev/null)\" != '' ]]; then tar --ignore-failed-read --warning=no-file-changed -cJf \"/var/tmp/sos-osp/podlogs.tar.xz\" --transform 's,^,podlogs/,' /var/log/pods/*/{*.log.*,*/*.log.*} || true; fi],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:TMOUT,Value:900,ValueFrom:nil,},EnvVar{Name:HOST,Value:/host,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host,ReadOnly:false,MountPath:/host,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2xbf9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod crc-debug-cdxs4_openshift-must-gather-7pnz9(80cbe76f-63a1-4c25-8f24-78fc747291a4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 22:49:43 crc kubenswrapper[4739]: E1008 22:49:43.073689 4739 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-must-gather-7pnz9/crc-debug-cdxs4" podUID="80cbe76f-63a1-4c25-8f24-78fc747291a4" Oct 08 22:49:44 crc kubenswrapper[4739]: E1008 22:49:44.100823 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296\\\"\"" pod="openshift-must-gather-7pnz9/crc-debug-cdxs4" podUID="80cbe76f-63a1-4c25-8f24-78fc747291a4" Oct 08 22:49:53 crc kubenswrapper[4739]: I1008 22:49:53.062296 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7v7kf"] Oct 08 22:49:53 crc kubenswrapper[4739]: I1008 22:49:53.065016 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7v7kf" Oct 08 22:49:53 crc kubenswrapper[4739]: I1008 22:49:53.077246 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7v7kf"] Oct 08 22:49:53 crc kubenswrapper[4739]: I1008 22:49:53.178094 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b642b9a4-daa6-42aa-98a7-231154e8190b-catalog-content\") pod \"community-operators-7v7kf\" (UID: \"b642b9a4-daa6-42aa-98a7-231154e8190b\") " pod="openshift-marketplace/community-operators-7v7kf" Oct 08 22:49:53 crc kubenswrapper[4739]: I1008 22:49:53.178278 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dm65\" (UniqueName: \"kubernetes.io/projected/b642b9a4-daa6-42aa-98a7-231154e8190b-kube-api-access-7dm65\") pod \"community-operators-7v7kf\" (UID: \"b642b9a4-daa6-42aa-98a7-231154e8190b\") " pod="openshift-marketplace/community-operators-7v7kf" Oct 08 22:49:53 crc kubenswrapper[4739]: I1008 22:49:53.178339 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b642b9a4-daa6-42aa-98a7-231154e8190b-utilities\") pod \"community-operators-7v7kf\" (UID: \"b642b9a4-daa6-42aa-98a7-231154e8190b\") " pod="openshift-marketplace/community-operators-7v7kf" Oct 08 22:49:53 crc kubenswrapper[4739]: I1008 22:49:53.280177 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dm65\" (UniqueName: \"kubernetes.io/projected/b642b9a4-daa6-42aa-98a7-231154e8190b-kube-api-access-7dm65\") pod \"community-operators-7v7kf\" (UID: \"b642b9a4-daa6-42aa-98a7-231154e8190b\") " pod="openshift-marketplace/community-operators-7v7kf" Oct 08 22:49:53 crc kubenswrapper[4739]: I1008 22:49:53.280275 4739 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b642b9a4-daa6-42aa-98a7-231154e8190b-utilities\") pod \"community-operators-7v7kf\" (UID: \"b642b9a4-daa6-42aa-98a7-231154e8190b\") " pod="openshift-marketplace/community-operators-7v7kf" Oct 08 22:49:53 crc kubenswrapper[4739]: I1008 22:49:53.280367 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b642b9a4-daa6-42aa-98a7-231154e8190b-catalog-content\") pod \"community-operators-7v7kf\" (UID: \"b642b9a4-daa6-42aa-98a7-231154e8190b\") " pod="openshift-marketplace/community-operators-7v7kf" Oct 08 22:49:53 crc kubenswrapper[4739]: I1008 22:49:53.280951 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b642b9a4-daa6-42aa-98a7-231154e8190b-catalog-content\") pod \"community-operators-7v7kf\" (UID: \"b642b9a4-daa6-42aa-98a7-231154e8190b\") " pod="openshift-marketplace/community-operators-7v7kf" Oct 08 22:49:53 crc kubenswrapper[4739]: I1008 22:49:53.281042 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b642b9a4-daa6-42aa-98a7-231154e8190b-utilities\") pod \"community-operators-7v7kf\" (UID: \"b642b9a4-daa6-42aa-98a7-231154e8190b\") " pod="openshift-marketplace/community-operators-7v7kf" Oct 08 22:49:53 crc kubenswrapper[4739]: I1008 22:49:53.307707 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dm65\" (UniqueName: \"kubernetes.io/projected/b642b9a4-daa6-42aa-98a7-231154e8190b-kube-api-access-7dm65\") pod \"community-operators-7v7kf\" (UID: \"b642b9a4-daa6-42aa-98a7-231154e8190b\") " pod="openshift-marketplace/community-operators-7v7kf" Oct 08 22:49:53 crc kubenswrapper[4739]: I1008 22:49:53.391287 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7v7kf" Oct 08 22:49:53 crc kubenswrapper[4739]: I1008 22:49:53.964415 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7v7kf"] Oct 08 22:49:54 crc kubenswrapper[4739]: I1008 22:49:54.220467 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7v7kf" event={"ID":"b642b9a4-daa6-42aa-98a7-231154e8190b","Type":"ContainerStarted","Data":"0c0cd1071bed306c9c8c1264c62363a7237d808d046729dc93fd080ecabe8e6d"} Oct 08 22:49:55 crc kubenswrapper[4739]: I1008 22:49:55.230782 4739 generic.go:334] "Generic (PLEG): container finished" podID="b642b9a4-daa6-42aa-98a7-231154e8190b" containerID="30ca5b3556c52d630b24e011f128138022f7ad53f556958b42d6718afa38556c" exitCode=0 Oct 08 22:49:55 crc kubenswrapper[4739]: I1008 22:49:55.231002 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7v7kf" event={"ID":"b642b9a4-daa6-42aa-98a7-231154e8190b","Type":"ContainerDied","Data":"30ca5b3556c52d630b24e011f128138022f7ad53f556958b42d6718afa38556c"} Oct 08 22:49:58 crc kubenswrapper[4739]: I1008 22:49:58.041764 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-create-6svpl"] Oct 08 22:49:58 crc kubenswrapper[4739]: I1008 22:49:58.049980 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-create-6svpl"] Oct 08 22:49:59 crc kubenswrapper[4739]: I1008 22:49:59.267735 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7v7kf" event={"ID":"b642b9a4-daa6-42aa-98a7-231154e8190b","Type":"ContainerStarted","Data":"b7a3f9c8ead937563c413c170d40f5a7c5b794885a4fc5dab30a7088adde42be"} Oct 08 22:49:59 crc kubenswrapper[4739]: I1008 22:49:59.833651 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2820d1f-6f24-4b4b-bde2-6474a45fdbff" 
path="/var/lib/kubelet/pods/b2820d1f-6f24-4b4b-bde2-6474a45fdbff/volumes" Oct 08 22:50:01 crc kubenswrapper[4739]: I1008 22:50:01.286813 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pnz9/crc-debug-cdxs4" event={"ID":"80cbe76f-63a1-4c25-8f24-78fc747291a4","Type":"ContainerStarted","Data":"640314485b2c1da5df0c9b336e101c5f16c459ec8fd289d79eb6a4b4d07de030"} Oct 08 22:50:01 crc kubenswrapper[4739]: I1008 22:50:01.303628 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7pnz9/crc-debug-cdxs4" podStartSLOduration=1.840021195 podStartE2EDuration="36.303605159s" podCreationTimestamp="2025-10-08 22:49:25 +0000 UTC" firstStartedPulling="2025-10-08 22:49:25.596729345 +0000 UTC m=+3665.422115095" lastFinishedPulling="2025-10-08 22:50:00.060313309 +0000 UTC m=+3699.885699059" observedRunningTime="2025-10-08 22:50:01.300629556 +0000 UTC m=+3701.126015316" watchObservedRunningTime="2025-10-08 22:50:01.303605159 +0000 UTC m=+3701.128990909" Oct 08 22:50:08 crc kubenswrapper[4739]: I1008 22:50:08.358787 4739 generic.go:334] "Generic (PLEG): container finished" podID="b642b9a4-daa6-42aa-98a7-231154e8190b" containerID="b7a3f9c8ead937563c413c170d40f5a7c5b794885a4fc5dab30a7088adde42be" exitCode=0 Oct 08 22:50:08 crc kubenswrapper[4739]: I1008 22:50:08.358851 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7v7kf" event={"ID":"b642b9a4-daa6-42aa-98a7-231154e8190b","Type":"ContainerDied","Data":"b7a3f9c8ead937563c413c170d40f5a7c5b794885a4fc5dab30a7088adde42be"} Oct 08 22:50:09 crc kubenswrapper[4739]: I1008 22:50:09.373698 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7v7kf" event={"ID":"b642b9a4-daa6-42aa-98a7-231154e8190b","Type":"ContainerStarted","Data":"98667d7a16b64914bca275e9830f7d58774359afcf187c707dd52f83fb956346"} Oct 08 22:50:09 crc kubenswrapper[4739]: I1008 22:50:09.406851 4739 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7v7kf" podStartSLOduration=2.658033231 podStartE2EDuration="16.406825046s" podCreationTimestamp="2025-10-08 22:49:53 +0000 UTC" firstStartedPulling="2025-10-08 22:49:55.232733607 +0000 UTC m=+3695.058119357" lastFinishedPulling="2025-10-08 22:50:08.981525422 +0000 UTC m=+3708.806911172" observedRunningTime="2025-10-08 22:50:09.397701592 +0000 UTC m=+3709.223087342" watchObservedRunningTime="2025-10-08 22:50:09.406825046 +0000 UTC m=+3709.232210796" Oct 08 22:50:13 crc kubenswrapper[4739]: I1008 22:50:13.391555 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7v7kf" Oct 08 22:50:13 crc kubenswrapper[4739]: I1008 22:50:13.392260 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7v7kf" Oct 08 22:50:14 crc kubenswrapper[4739]: I1008 22:50:14.467728 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-7v7kf" podUID="b642b9a4-daa6-42aa-98a7-231154e8190b" containerName="registry-server" probeResult="failure" output=< Oct 08 22:50:14 crc kubenswrapper[4739]: timeout: failed to connect service ":50051" within 1s Oct 08 22:50:14 crc kubenswrapper[4739]: > Oct 08 22:50:23 crc kubenswrapper[4739]: I1008 22:50:23.452410 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7v7kf" Oct 08 22:50:23 crc kubenswrapper[4739]: I1008 22:50:23.503417 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7v7kf" Oct 08 22:50:24 crc kubenswrapper[4739]: I1008 22:50:24.262344 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7v7kf"] Oct 08 22:50:24 crc kubenswrapper[4739]: I1008 22:50:24.515324 4739 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7v7kf" podUID="b642b9a4-daa6-42aa-98a7-231154e8190b" containerName="registry-server" containerID="cri-o://98667d7a16b64914bca275e9830f7d58774359afcf187c707dd52f83fb956346" gracePeriod=2 Oct 08 22:50:25 crc kubenswrapper[4739]: I1008 22:50:25.145351 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7v7kf" Oct 08 22:50:25 crc kubenswrapper[4739]: I1008 22:50:25.269196 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b642b9a4-daa6-42aa-98a7-231154e8190b-utilities\") pod \"b642b9a4-daa6-42aa-98a7-231154e8190b\" (UID: \"b642b9a4-daa6-42aa-98a7-231154e8190b\") " Oct 08 22:50:25 crc kubenswrapper[4739]: I1008 22:50:25.269283 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dm65\" (UniqueName: \"kubernetes.io/projected/b642b9a4-daa6-42aa-98a7-231154e8190b-kube-api-access-7dm65\") pod \"b642b9a4-daa6-42aa-98a7-231154e8190b\" (UID: \"b642b9a4-daa6-42aa-98a7-231154e8190b\") " Oct 08 22:50:25 crc kubenswrapper[4739]: I1008 22:50:25.269411 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b642b9a4-daa6-42aa-98a7-231154e8190b-catalog-content\") pod \"b642b9a4-daa6-42aa-98a7-231154e8190b\" (UID: \"b642b9a4-daa6-42aa-98a7-231154e8190b\") " Oct 08 22:50:25 crc kubenswrapper[4739]: I1008 22:50:25.270826 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b642b9a4-daa6-42aa-98a7-231154e8190b-utilities" (OuterVolumeSpecName: "utilities") pod "b642b9a4-daa6-42aa-98a7-231154e8190b" (UID: "b642b9a4-daa6-42aa-98a7-231154e8190b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:50:25 crc kubenswrapper[4739]: I1008 22:50:25.280394 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b642b9a4-daa6-42aa-98a7-231154e8190b-kube-api-access-7dm65" (OuterVolumeSpecName: "kube-api-access-7dm65") pod "b642b9a4-daa6-42aa-98a7-231154e8190b" (UID: "b642b9a4-daa6-42aa-98a7-231154e8190b"). InnerVolumeSpecName "kube-api-access-7dm65". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:50:25 crc kubenswrapper[4739]: I1008 22:50:25.330592 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b642b9a4-daa6-42aa-98a7-231154e8190b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b642b9a4-daa6-42aa-98a7-231154e8190b" (UID: "b642b9a4-daa6-42aa-98a7-231154e8190b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:50:25 crc kubenswrapper[4739]: I1008 22:50:25.371794 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b642b9a4-daa6-42aa-98a7-231154e8190b-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:50:25 crc kubenswrapper[4739]: I1008 22:50:25.371825 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dm65\" (UniqueName: \"kubernetes.io/projected/b642b9a4-daa6-42aa-98a7-231154e8190b-kube-api-access-7dm65\") on node \"crc\" DevicePath \"\"" Oct 08 22:50:25 crc kubenswrapper[4739]: I1008 22:50:25.371838 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b642b9a4-daa6-42aa-98a7-231154e8190b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:50:25 crc kubenswrapper[4739]: I1008 22:50:25.526594 4739 generic.go:334] "Generic (PLEG): container finished" podID="b642b9a4-daa6-42aa-98a7-231154e8190b" 
containerID="98667d7a16b64914bca275e9830f7d58774359afcf187c707dd52f83fb956346" exitCode=0 Oct 08 22:50:25 crc kubenswrapper[4739]: I1008 22:50:25.526635 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7v7kf" event={"ID":"b642b9a4-daa6-42aa-98a7-231154e8190b","Type":"ContainerDied","Data":"98667d7a16b64914bca275e9830f7d58774359afcf187c707dd52f83fb956346"} Oct 08 22:50:25 crc kubenswrapper[4739]: I1008 22:50:25.526687 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7v7kf" event={"ID":"b642b9a4-daa6-42aa-98a7-231154e8190b","Type":"ContainerDied","Data":"0c0cd1071bed306c9c8c1264c62363a7237d808d046729dc93fd080ecabe8e6d"} Oct 08 22:50:25 crc kubenswrapper[4739]: I1008 22:50:25.526685 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7v7kf" Oct 08 22:50:25 crc kubenswrapper[4739]: I1008 22:50:25.526703 4739 scope.go:117] "RemoveContainer" containerID="98667d7a16b64914bca275e9830f7d58774359afcf187c707dd52f83fb956346" Oct 08 22:50:25 crc kubenswrapper[4739]: I1008 22:50:25.553018 4739 scope.go:117] "RemoveContainer" containerID="b7a3f9c8ead937563c413c170d40f5a7c5b794885a4fc5dab30a7088adde42be" Oct 08 22:50:25 crc kubenswrapper[4739]: I1008 22:50:25.562967 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7v7kf"] Oct 08 22:50:25 crc kubenswrapper[4739]: I1008 22:50:25.572882 4739 scope.go:117] "RemoveContainer" containerID="30ca5b3556c52d630b24e011f128138022f7ad53f556958b42d6718afa38556c" Oct 08 22:50:25 crc kubenswrapper[4739]: I1008 22:50:25.577845 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7v7kf"] Oct 08 22:50:25 crc kubenswrapper[4739]: I1008 22:50:25.622314 4739 scope.go:117] "RemoveContainer" containerID="98667d7a16b64914bca275e9830f7d58774359afcf187c707dd52f83fb956346" Oct 08 
22:50:25 crc kubenswrapper[4739]: E1008 22:50:25.622853 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98667d7a16b64914bca275e9830f7d58774359afcf187c707dd52f83fb956346\": container with ID starting with 98667d7a16b64914bca275e9830f7d58774359afcf187c707dd52f83fb956346 not found: ID does not exist" containerID="98667d7a16b64914bca275e9830f7d58774359afcf187c707dd52f83fb956346" Oct 08 22:50:25 crc kubenswrapper[4739]: I1008 22:50:25.622904 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98667d7a16b64914bca275e9830f7d58774359afcf187c707dd52f83fb956346"} err="failed to get container status \"98667d7a16b64914bca275e9830f7d58774359afcf187c707dd52f83fb956346\": rpc error: code = NotFound desc = could not find container \"98667d7a16b64914bca275e9830f7d58774359afcf187c707dd52f83fb956346\": container with ID starting with 98667d7a16b64914bca275e9830f7d58774359afcf187c707dd52f83fb956346 not found: ID does not exist" Oct 08 22:50:25 crc kubenswrapper[4739]: I1008 22:50:25.622937 4739 scope.go:117] "RemoveContainer" containerID="b7a3f9c8ead937563c413c170d40f5a7c5b794885a4fc5dab30a7088adde42be" Oct 08 22:50:25 crc kubenswrapper[4739]: E1008 22:50:25.623311 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7a3f9c8ead937563c413c170d40f5a7c5b794885a4fc5dab30a7088adde42be\": container with ID starting with b7a3f9c8ead937563c413c170d40f5a7c5b794885a4fc5dab30a7088adde42be not found: ID does not exist" containerID="b7a3f9c8ead937563c413c170d40f5a7c5b794885a4fc5dab30a7088adde42be" Oct 08 22:50:25 crc kubenswrapper[4739]: I1008 22:50:25.623351 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7a3f9c8ead937563c413c170d40f5a7c5b794885a4fc5dab30a7088adde42be"} err="failed to get container status 
\"b7a3f9c8ead937563c413c170d40f5a7c5b794885a4fc5dab30a7088adde42be\": rpc error: code = NotFound desc = could not find container \"b7a3f9c8ead937563c413c170d40f5a7c5b794885a4fc5dab30a7088adde42be\": container with ID starting with b7a3f9c8ead937563c413c170d40f5a7c5b794885a4fc5dab30a7088adde42be not found: ID does not exist" Oct 08 22:50:25 crc kubenswrapper[4739]: I1008 22:50:25.623374 4739 scope.go:117] "RemoveContainer" containerID="30ca5b3556c52d630b24e011f128138022f7ad53f556958b42d6718afa38556c" Oct 08 22:50:25 crc kubenswrapper[4739]: E1008 22:50:25.623747 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30ca5b3556c52d630b24e011f128138022f7ad53f556958b42d6718afa38556c\": container with ID starting with 30ca5b3556c52d630b24e011f128138022f7ad53f556958b42d6718afa38556c not found: ID does not exist" containerID="30ca5b3556c52d630b24e011f128138022f7ad53f556958b42d6718afa38556c" Oct 08 22:50:25 crc kubenswrapper[4739]: I1008 22:50:25.623779 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30ca5b3556c52d630b24e011f128138022f7ad53f556958b42d6718afa38556c"} err="failed to get container status \"30ca5b3556c52d630b24e011f128138022f7ad53f556958b42d6718afa38556c\": rpc error: code = NotFound desc = could not find container \"30ca5b3556c52d630b24e011f128138022f7ad53f556958b42d6718afa38556c\": container with ID starting with 30ca5b3556c52d630b24e011f128138022f7ad53f556958b42d6718afa38556c not found: ID does not exist" Oct 08 22:50:25 crc kubenswrapper[4739]: I1008 22:50:25.852204 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b642b9a4-daa6-42aa-98a7-231154e8190b" path="/var/lib/kubelet/pods/b642b9a4-daa6-42aa-98a7-231154e8190b/volumes" Oct 08 22:50:32 crc kubenswrapper[4739]: I1008 22:50:32.040034 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-bcf2-account-create-ssjjf"] Oct 08 
22:50:32 crc kubenswrapper[4739]: I1008 22:50:32.048658 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-bcf2-account-create-ssjjf"] Oct 08 22:50:33 crc kubenswrapper[4739]: I1008 22:50:33.832137 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02b36e40-d0aa-4580-b3af-ca16a64477d4" path="/var/lib/kubelet/pods/02b36e40-d0aa-4580-b3af-ca16a64477d4/volumes" Oct 08 22:50:34 crc kubenswrapper[4739]: I1008 22:50:34.297475 4739 scope.go:117] "RemoveContainer" containerID="1687f0c5c9ebf819674a04069568e88f62ea5dd74b1a42fec37aa5cf9c2a9156" Oct 08 22:50:34 crc kubenswrapper[4739]: I1008 22:50:34.319014 4739 scope.go:117] "RemoveContainer" containerID="299c25e643c0b91c5f4c3824f5fbfa4a7d6df8eb6235330680f74f9eece4daaa" Oct 08 22:50:39 crc kubenswrapper[4739]: I1008 22:50:39.880236 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a/init-config-reloader/0.log" Oct 08 22:50:40 crc kubenswrapper[4739]: I1008 22:50:40.035832 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a/init-config-reloader/0.log" Oct 08 22:50:40 crc kubenswrapper[4739]: I1008 22:50:40.111975 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a/alertmanager/0.log" Oct 08 22:50:40 crc kubenswrapper[4739]: I1008 22:50:40.139916 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a/config-reloader/0.log" Oct 08 22:50:40 crc kubenswrapper[4739]: I1008 22:50:40.326962 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-56dcfd46c8-rpb55_9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59/barbican-api/0.log" Oct 08 22:50:40 crc kubenswrapper[4739]: I1008 22:50:40.364961 4739 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_barbican-api-56dcfd46c8-rpb55_9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59/barbican-api-log/0.log" Oct 08 22:50:40 crc kubenswrapper[4739]: I1008 22:50:40.596457 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-648fb84fdb-qmfb7_a8e9e5fe-49e3-4fea-8a8e-b853c479ce94/barbican-keystone-listener/0.log" Oct 08 22:50:40 crc kubenswrapper[4739]: I1008 22:50:40.724683 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-648fb84fdb-qmfb7_a8e9e5fe-49e3-4fea-8a8e-b853c479ce94/barbican-keystone-listener-log/0.log" Oct 08 22:50:40 crc kubenswrapper[4739]: I1008 22:50:40.805042 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-66b4c9b85f-r8lds_1802465b-168a-449f-b8db-224a426d90ad/barbican-worker/0.log" Oct 08 22:50:40 crc kubenswrapper[4739]: I1008 22:50:40.944309 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-66b4c9b85f-r8lds_1802465b-168a-449f-b8db-224a426d90ad/barbican-worker-log/0.log" Oct 08 22:50:41 crc kubenswrapper[4739]: I1008 22:50:41.318507 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7_bd1f9e00-5ba4-4aa0-b38c-8610f396af0b/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:50:41 crc kubenswrapper[4739]: I1008 22:50:41.550682 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e7a33944-ef03-44c5-91a6-45cf63c795f8/ceilometer-central-agent/0.log" Oct 08 22:50:41 crc kubenswrapper[4739]: I1008 22:50:41.568919 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e7a33944-ef03-44c5-91a6-45cf63c795f8/ceilometer-notification-agent/0.log" Oct 08 22:50:41 crc kubenswrapper[4739]: I1008 22:50:41.614701 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_e7a33944-ef03-44c5-91a6-45cf63c795f8/proxy-httpd/0.log" Oct 08 22:50:41 crc kubenswrapper[4739]: I1008 22:50:41.741941 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e7a33944-ef03-44c5-91a6-45cf63c795f8/sg-core/0.log" Oct 08 22:50:41 crc kubenswrapper[4739]: I1008 22:50:41.890513 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9/cinder-api/0.log" Oct 08 22:50:42 crc kubenswrapper[4739]: I1008 22:50:42.030532 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9/cinder-api-log/0.log" Oct 08 22:50:42 crc kubenswrapper[4739]: I1008 22:50:42.080095 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_2c5f9170-35c8-4e75-ba48-955a58e56e3f/cinder-scheduler/0.log" Oct 08 22:50:42 crc kubenswrapper[4739]: I1008 22:50:42.304970 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_2c5f9170-35c8-4e75-ba48-955a58e56e3f/probe/0.log" Oct 08 22:50:42 crc kubenswrapper[4739]: I1008 22:50:42.332273 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_43526663-2258-4b39-909c-1c52b4e217de/cloudkitty-api/0.log" Oct 08 22:50:42 crc kubenswrapper[4739]: I1008 22:50:42.514095 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-db-sync-db9l2_303c9b70-1368-4372-824e-36bca64d2aff/cloudkitty-db-sync/0.log" Oct 08 22:50:42 crc kubenswrapper[4739]: I1008 22:50:42.524312 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_43526663-2258-4b39-909c-1c52b4e217de/cloudkitty-api-log/0.log" Oct 08 22:50:42 crc kubenswrapper[4739]: I1008 22:50:42.784877 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cloudkitty-lokistack-compactor-0_02de38f3-1c70-4314-8dd7-4b5612c4348f/loki-compactor/0.log" Oct 08 22:50:42 crc kubenswrapper[4739]: I1008 22:50:42.924026 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-distributor-56cd74f89f-5dccl_cd624632-67d1-48e1-8c43-fa58f5d2e5ea/loki-distributor/0.log" Oct 08 22:50:42 crc kubenswrapper[4739]: I1008 22:50:42.958404 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-76cc998948-czjfz_4b1ae118-cd96-4e60-997d-9594acff7531/gateway/0.log" Oct 08 22:50:43 crc kubenswrapper[4739]: I1008 22:50:43.090678 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-76cc998948-nf2wl_0663f463-0160-4cc2-bad3-389baee708da/gateway/0.log" Oct 08 22:50:43 crc kubenswrapper[4739]: I1008 22:50:43.165001 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-index-gateway-0_30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4/loki-index-gateway/0.log" Oct 08 22:50:43 crc kubenswrapper[4739]: I1008 22:50:43.297000 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-ingester-0_9d41d47c-0875-4283-908d-559995e5069e/loki-ingester/0.log" Oct 08 22:50:43 crc kubenswrapper[4739]: I1008 22:50:43.347125 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-querier-68bbd7984c-65fx4_f8016740-3857-4c88-81a3-6ee47b7e2a75/loki-querier/0.log" Oct 08 22:50:43 crc kubenswrapper[4739]: I1008 22:50:43.497937 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-query-frontend-779849886d-v9qwb_137f65de-3030-4c4b-a087-c547dd183105/loki-query-frontend/0.log" Oct 08 22:50:43 crc kubenswrapper[4739]: I1008 22:50:43.783354 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cloudkitty-storageinit-p7krf_e8c02797-6de9-48e2-9193-cd0ad90b93fa/cloudkitty-storageinit/0.log" Oct 08 22:50:43 crc kubenswrapper[4739]: I1008 22:50:43.928100 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg_b87a15bb-7744-4904-91b9-9f8052912033/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:50:43 crc kubenswrapper[4739]: I1008 22:50:43.958890 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-proc-0_5de0557c-aa06-41d3-8d90-76d22496c164/cloudkitty-proc/0.log" Oct 08 22:50:44 crc kubenswrapper[4739]: I1008 22:50:44.255407 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-82kg2_23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:50:44 crc kubenswrapper[4739]: I1008 22:50:44.418974 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-grlzd_a3f415ab-75ff-469e-84f4-5d2e9f4053e2/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:50:44 crc kubenswrapper[4739]: I1008 22:50:44.456527 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-nnbsx_b7adc6ab-b111-4d2a-a0f3-a1b50e53df52/init/0.log" Oct 08 22:50:44 crc kubenswrapper[4739]: I1008 22:50:44.639614 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-nnbsx_b7adc6ab-b111-4d2a-a0f3-a1b50e53df52/init/0.log" Oct 08 22:50:44 crc kubenswrapper[4739]: I1008 22:50:44.685568 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-nnbsx_b7adc6ab-b111-4d2a-a0f3-a1b50e53df52/dnsmasq-dns/0.log" Oct 08 22:50:44 crc kubenswrapper[4739]: I1008 22:50:44.728078 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9_b62c229c-107a-42de-8501-b52ae4c47f9f/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:50:44 crc kubenswrapper[4739]: I1008 22:50:44.869183 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_66c23737-b27f-4ba2-9291-b2d0f3aa5020/glance-httpd/0.log" Oct 08 22:50:44 crc kubenswrapper[4739]: I1008 22:50:44.894056 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_66c23737-b27f-4ba2-9291-b2d0f3aa5020/glance-log/0.log" Oct 08 22:50:45 crc kubenswrapper[4739]: I1008 22:50:45.081396 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_a00e6724-633b-4d60-9781-206e078a6dca/glance-httpd/0.log" Oct 08 22:50:45 crc kubenswrapper[4739]: I1008 22:50:45.113914 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_a00e6724-633b-4d60-9781-206e078a6dca/glance-log/0.log" Oct 08 22:50:45 crc kubenswrapper[4739]: I1008 22:50:45.124521 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s_f2642ecf-dc6d-4f4e-94e7-2f76db914748/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:50:45 crc kubenswrapper[4739]: I1008 22:50:45.346215 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-4kldb_9b1fcab8-e84d-433d-ac57-62a00dc6f557/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:50:45 crc kubenswrapper[4739]: I1008 22:50:45.614989 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-55468d9c4f-z8pn5_7923886e-2cbf-489b-aabd-aa49c710fbf0/keystone-api/0.log" Oct 08 22:50:45 crc kubenswrapper[4739]: I1008 22:50:45.658236 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-wthng_076ace7f-41ce-4825-9d2f-e49471648888/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:50:46 crc kubenswrapper[4739]: I1008 22:50:46.021758 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dc6d4cfc5-7ks2h_f2a09a54-dd22-4b47-b5bd-49685c152d9f/neutron-httpd/0.log" Oct 08 22:50:46 crc kubenswrapper[4739]: I1008 22:50:46.055589 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dc6d4cfc5-7ks2h_f2a09a54-dd22-4b47-b5bd-49685c152d9f/neutron-api/0.log" Oct 08 22:50:46 crc kubenswrapper[4739]: I1008 22:50:46.251056 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f_40321558-aaa1-4ba3-8417-69c969745cfa/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:50:46 crc kubenswrapper[4739]: I1008 22:50:46.695400 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_16fa8c46-856e-465c-bd60-fb13b76e5079/nova-api-log/0.log" Oct 08 22:50:46 crc kubenswrapper[4739]: I1008 22:50:46.885503 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_16fa8c46-856e-465c-bd60-fb13b76e5079/nova-api-api/0.log" Oct 08 22:50:47 crc kubenswrapper[4739]: I1008 22:50:47.007029 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_383c8e08-0c0f-41fb-9574-cfa23aa2aad5/nova-cell0-conductor-conductor/0.log" Oct 08 22:50:47 crc kubenswrapper[4739]: I1008 22:50:47.207074 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_48ac9d79-441d-4277-bf99-a8dc4ec2213c/nova-cell1-conductor-conductor/0.log" Oct 08 22:50:47 crc kubenswrapper[4739]: I1008 22:50:47.431003 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_b2183def-3ac0-434f-bca8-dfd66210d7ab/nova-cell1-novncproxy-novncproxy/0.log" Oct 08 22:50:47 crc kubenswrapper[4739]: I1008 22:50:47.625723 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-s2cpg_02cc9be0-080a-4ef8-a438-18607a5c7da4/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:50:47 crc kubenswrapper[4739]: I1008 22:50:47.922467 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_99220499-d612-49e9-a7f1-622280a12221/nova-metadata-log/0.log" Oct 08 22:50:48 crc kubenswrapper[4739]: I1008 22:50:48.297941 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_08b3472b-4c6f-4c1a-83b2-6c176dbbcbf3/nova-scheduler-scheduler/0.log" Oct 08 22:50:48 crc kubenswrapper[4739]: I1008 22:50:48.395016 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e2532902-2058-4c79-b612-fd2737190f3e/mysql-bootstrap/0.log" Oct 08 22:50:48 crc kubenswrapper[4739]: I1008 22:50:48.596952 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e2532902-2058-4c79-b612-fd2737190f3e/galera/0.log" Oct 08 22:50:48 crc kubenswrapper[4739]: I1008 22:50:48.644067 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e2532902-2058-4c79-b612-fd2737190f3e/mysql-bootstrap/0.log" Oct 08 22:50:48 crc kubenswrapper[4739]: I1008 22:50:48.837729 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3c6fc5d3-c48a-4d83-97f8-38d56264d769/mysql-bootstrap/0.log" Oct 08 22:50:49 crc kubenswrapper[4739]: I1008 22:50:49.068041 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_99220499-d612-49e9-a7f1-622280a12221/nova-metadata-metadata/0.log" Oct 08 22:50:49 crc kubenswrapper[4739]: I1008 22:50:49.132933 4739 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3c6fc5d3-c48a-4d83-97f8-38d56264d769/mysql-bootstrap/0.log" Oct 08 22:50:49 crc kubenswrapper[4739]: I1008 22:50:49.139206 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3c6fc5d3-c48a-4d83-97f8-38d56264d769/galera/0.log" Oct 08 22:50:49 crc kubenswrapper[4739]: I1008 22:50:49.354762 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_9e972dc2-2718-4dcd-a49a-9d3199e95d61/openstackclient/0.log" Oct 08 22:50:49 crc kubenswrapper[4739]: I1008 22:50:49.469971 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-hld4g_bc4ea068-4061-435b-8e62-11b14a3e1ec4/openstack-network-exporter/0.log" Oct 08 22:50:49 crc kubenswrapper[4739]: I1008 22:50:49.630873 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-mj9gb_0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa/ovn-controller/0.log" Oct 08 22:50:49 crc kubenswrapper[4739]: I1008 22:50:49.771355 4739 generic.go:334] "Generic (PLEG): container finished" podID="80cbe76f-63a1-4c25-8f24-78fc747291a4" containerID="640314485b2c1da5df0c9b336e101c5f16c459ec8fd289d79eb6a4b4d07de030" exitCode=0 Oct 08 22:50:49 crc kubenswrapper[4739]: I1008 22:50:49.771639 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pnz9/crc-debug-cdxs4" event={"ID":"80cbe76f-63a1-4c25-8f24-78fc747291a4","Type":"ContainerDied","Data":"640314485b2c1da5df0c9b336e101c5f16c459ec8fd289d79eb6a4b4d07de030"} Oct 08 22:50:49 crc kubenswrapper[4739]: I1008 22:50:49.835091 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-92kk7_eacfa01f-eb31-40c2-a163-3356c30772e3/ovsdb-server-init/0.log" Oct 08 22:50:49 crc kubenswrapper[4739]: I1008 22:50:49.970080 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-92kk7_eacfa01f-eb31-40c2-a163-3356c30772e3/ovs-vswitchd/0.log" Oct 08 22:50:49 crc kubenswrapper[4739]: I1008 22:50:49.981371 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-92kk7_eacfa01f-eb31-40c2-a163-3356c30772e3/ovsdb-server-init/0.log" Oct 08 22:50:50 crc kubenswrapper[4739]: I1008 22:50:50.020786 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-92kk7_eacfa01f-eb31-40c2-a163-3356c30772e3/ovsdb-server/0.log" Oct 08 22:50:50 crc kubenswrapper[4739]: I1008 22:50:50.203946 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-zh4p8_97d1ee4d-475f-4607-b01d-3d51e6ab179e/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:50:50 crc kubenswrapper[4739]: I1008 22:50:50.373759 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_9c19423c-cec2-4fbf-b2bf-97a99db03043/openstack-network-exporter/0.log" Oct 08 22:50:50 crc kubenswrapper[4739]: I1008 22:50:50.393941 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_9c19423c-cec2-4fbf-b2bf-97a99db03043/ovn-northd/0.log" Oct 08 22:50:50 crc kubenswrapper[4739]: I1008 22:50:50.564724 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_00537745-c30b-4fa9-be09-0edb09ff7138/openstack-network-exporter/0.log" Oct 08 22:50:50 crc kubenswrapper[4739]: I1008 22:50:50.627830 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_00537745-c30b-4fa9-be09-0edb09ff7138/ovsdbserver-nb/0.log" Oct 08 22:50:50 crc kubenswrapper[4739]: I1008 22:50:50.781689 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_15d5a814-0c23-4e0f-b750-9f886dc130b6/openstack-network-exporter/0.log" Oct 08 22:50:50 crc kubenswrapper[4739]: I1008 22:50:50.873818 4739 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_15d5a814-0c23-4e0f-b750-9f886dc130b6/ovsdbserver-sb/0.log" Oct 08 22:50:50 crc kubenswrapper[4739]: I1008 22:50:50.908663 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7pnz9/crc-debug-cdxs4" Oct 08 22:50:50 crc kubenswrapper[4739]: I1008 22:50:50.952178 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7pnz9/crc-debug-cdxs4"] Oct 08 22:50:50 crc kubenswrapper[4739]: I1008 22:50:50.965074 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7pnz9/crc-debug-cdxs4"] Oct 08 22:50:50 crc kubenswrapper[4739]: I1008 22:50:50.986097 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xbf9\" (UniqueName: \"kubernetes.io/projected/80cbe76f-63a1-4c25-8f24-78fc747291a4-kube-api-access-2xbf9\") pod \"80cbe76f-63a1-4c25-8f24-78fc747291a4\" (UID: \"80cbe76f-63a1-4c25-8f24-78fc747291a4\") " Oct 08 22:50:50 crc kubenswrapper[4739]: I1008 22:50:50.986251 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/80cbe76f-63a1-4c25-8f24-78fc747291a4-host\") pod \"80cbe76f-63a1-4c25-8f24-78fc747291a4\" (UID: \"80cbe76f-63a1-4c25-8f24-78fc747291a4\") " Oct 08 22:50:50 crc kubenswrapper[4739]: I1008 22:50:50.986411 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/80cbe76f-63a1-4c25-8f24-78fc747291a4-host" (OuterVolumeSpecName: "host") pod "80cbe76f-63a1-4c25-8f24-78fc747291a4" (UID: "80cbe76f-63a1-4c25-8f24-78fc747291a4"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:50:50 crc kubenswrapper[4739]: I1008 22:50:50.986913 4739 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/80cbe76f-63a1-4c25-8f24-78fc747291a4-host\") on node \"crc\" DevicePath \"\"" Oct 08 22:50:51 crc kubenswrapper[4739]: I1008 22:50:51.005885 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80cbe76f-63a1-4c25-8f24-78fc747291a4-kube-api-access-2xbf9" (OuterVolumeSpecName: "kube-api-access-2xbf9") pod "80cbe76f-63a1-4c25-8f24-78fc747291a4" (UID: "80cbe76f-63a1-4c25-8f24-78fc747291a4"). InnerVolumeSpecName "kube-api-access-2xbf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:50:51 crc kubenswrapper[4739]: I1008 22:50:51.088262 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xbf9\" (UniqueName: \"kubernetes.io/projected/80cbe76f-63a1-4c25-8f24-78fc747291a4-kube-api-access-2xbf9\") on node \"crc\" DevicePath \"\"" Oct 08 22:50:51 crc kubenswrapper[4739]: I1008 22:50:51.094755 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6c88568bb8-rh6ln_4de51af0-00c3-4a08-a13b-819a118cb604/placement-api/0.log" Oct 08 22:50:51 crc kubenswrapper[4739]: I1008 22:50:51.154740 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6c88568bb8-rh6ln_4de51af0-00c3-4a08-a13b-819a118cb604/placement-log/0.log" Oct 08 22:50:51 crc kubenswrapper[4739]: I1008 22:50:51.303002 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_0bc93384-c08a-4c7f-9dc4-318126297a8b/init-config-reloader/0.log" Oct 08 22:50:51 crc kubenswrapper[4739]: I1008 22:50:51.490073 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_0bc93384-c08a-4c7f-9dc4-318126297a8b/init-config-reloader/0.log" Oct 08 22:50:51 crc kubenswrapper[4739]: I1008 
22:50:51.493401 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_0bc93384-c08a-4c7f-9dc4-318126297a8b/config-reloader/0.log" Oct 08 22:50:51 crc kubenswrapper[4739]: I1008 22:50:51.559916 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_0bc93384-c08a-4c7f-9dc4-318126297a8b/prometheus/0.log" Oct 08 22:50:51 crc kubenswrapper[4739]: I1008 22:50:51.664342 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_0bc93384-c08a-4c7f-9dc4-318126297a8b/thanos-sidecar/0.log" Oct 08 22:50:51 crc kubenswrapper[4739]: I1008 22:50:51.782546 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_744c6598-a814-45c7-bf47-5fe0b5b48c5e/setup-container/0.log" Oct 08 22:50:51 crc kubenswrapper[4739]: I1008 22:50:51.794942 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ec10295a7068c7e74e50c27aae065bf7a874896325285671a4c4e683b50f85a" Oct 08 22:50:51 crc kubenswrapper[4739]: I1008 22:50:51.795057 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7pnz9/crc-debug-cdxs4" Oct 08 22:50:51 crc kubenswrapper[4739]: I1008 22:50:51.839763 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80cbe76f-63a1-4c25-8f24-78fc747291a4" path="/var/lib/kubelet/pods/80cbe76f-63a1-4c25-8f24-78fc747291a4/volumes" Oct 08 22:50:51 crc kubenswrapper[4739]: I1008 22:50:51.965822 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_744c6598-a814-45c7-bf47-5fe0b5b48c5e/setup-container/0.log" Oct 08 22:50:51 crc kubenswrapper[4739]: I1008 22:50:51.987938 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_744c6598-a814-45c7-bf47-5fe0b5b48c5e/rabbitmq/0.log" Oct 08 22:50:52 crc kubenswrapper[4739]: I1008 22:50:52.183266 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb/setup-container/0.log" Oct 08 22:50:52 crc kubenswrapper[4739]: I1008 22:50:52.214076 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7pnz9/crc-debug-nkppv"] Oct 08 22:50:52 crc kubenswrapper[4739]: E1008 22:50:52.214532 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80cbe76f-63a1-4c25-8f24-78fc747291a4" containerName="container-00" Oct 08 22:50:52 crc kubenswrapper[4739]: I1008 22:50:52.214550 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="80cbe76f-63a1-4c25-8f24-78fc747291a4" containerName="container-00" Oct 08 22:50:52 crc kubenswrapper[4739]: E1008 22:50:52.214564 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b642b9a4-daa6-42aa-98a7-231154e8190b" containerName="extract-utilities" Oct 08 22:50:52 crc kubenswrapper[4739]: I1008 22:50:52.214570 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="b642b9a4-daa6-42aa-98a7-231154e8190b" containerName="extract-utilities" Oct 08 22:50:52 crc kubenswrapper[4739]: E1008 22:50:52.214579 
4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b642b9a4-daa6-42aa-98a7-231154e8190b" containerName="registry-server" Oct 08 22:50:52 crc kubenswrapper[4739]: I1008 22:50:52.214585 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="b642b9a4-daa6-42aa-98a7-231154e8190b" containerName="registry-server" Oct 08 22:50:52 crc kubenswrapper[4739]: E1008 22:50:52.214601 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b642b9a4-daa6-42aa-98a7-231154e8190b" containerName="extract-content" Oct 08 22:50:52 crc kubenswrapper[4739]: I1008 22:50:52.214608 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="b642b9a4-daa6-42aa-98a7-231154e8190b" containerName="extract-content" Oct 08 22:50:52 crc kubenswrapper[4739]: I1008 22:50:52.214806 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="80cbe76f-63a1-4c25-8f24-78fc747291a4" containerName="container-00" Oct 08 22:50:52 crc kubenswrapper[4739]: I1008 22:50:52.214834 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="b642b9a4-daa6-42aa-98a7-231154e8190b" containerName="registry-server" Oct 08 22:50:52 crc kubenswrapper[4739]: I1008 22:50:52.215548 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7pnz9/crc-debug-nkppv" Oct 08 22:50:52 crc kubenswrapper[4739]: I1008 22:50:52.307028 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8fw6\" (UniqueName: \"kubernetes.io/projected/ce3897c4-d3b8-41f6-9ff5-b80500339667-kube-api-access-p8fw6\") pod \"crc-debug-nkppv\" (UID: \"ce3897c4-d3b8-41f6-9ff5-b80500339667\") " pod="openshift-must-gather-7pnz9/crc-debug-nkppv" Oct 08 22:50:52 crc kubenswrapper[4739]: I1008 22:50:52.307075 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce3897c4-d3b8-41f6-9ff5-b80500339667-host\") pod \"crc-debug-nkppv\" (UID: \"ce3897c4-d3b8-41f6-9ff5-b80500339667\") " pod="openshift-must-gather-7pnz9/crc-debug-nkppv" Oct 08 22:50:52 crc kubenswrapper[4739]: I1008 22:50:52.409372 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8fw6\" (UniqueName: \"kubernetes.io/projected/ce3897c4-d3b8-41f6-9ff5-b80500339667-kube-api-access-p8fw6\") pod \"crc-debug-nkppv\" (UID: \"ce3897c4-d3b8-41f6-9ff5-b80500339667\") " pod="openshift-must-gather-7pnz9/crc-debug-nkppv" Oct 08 22:50:52 crc kubenswrapper[4739]: I1008 22:50:52.409421 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce3897c4-d3b8-41f6-9ff5-b80500339667-host\") pod \"crc-debug-nkppv\" (UID: \"ce3897c4-d3b8-41f6-9ff5-b80500339667\") " pod="openshift-must-gather-7pnz9/crc-debug-nkppv" Oct 08 22:50:52 crc kubenswrapper[4739]: I1008 22:50:52.409586 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce3897c4-d3b8-41f6-9ff5-b80500339667-host\") pod \"crc-debug-nkppv\" (UID: \"ce3897c4-d3b8-41f6-9ff5-b80500339667\") " pod="openshift-must-gather-7pnz9/crc-debug-nkppv" Oct 08 22:50:52 crc 
kubenswrapper[4739]: I1008 22:50:52.412736 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb/setup-container/0.log" Oct 08 22:50:52 crc kubenswrapper[4739]: I1008 22:50:52.414419 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb/rabbitmq/0.log" Oct 08 22:50:52 crc kubenswrapper[4739]: I1008 22:50:52.430835 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8fw6\" (UniqueName: \"kubernetes.io/projected/ce3897c4-d3b8-41f6-9ff5-b80500339667-kube-api-access-p8fw6\") pod \"crc-debug-nkppv\" (UID: \"ce3897c4-d3b8-41f6-9ff5-b80500339667\") " pod="openshift-must-gather-7pnz9/crc-debug-nkppv" Oct 08 22:50:52 crc kubenswrapper[4739]: I1008 22:50:52.539104 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7pnz9/crc-debug-nkppv" Oct 08 22:50:52 crc kubenswrapper[4739]: I1008 22:50:52.657595 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-m9wkc_e04c955f-d97c-4cc9-a01e-3d4d2b59de12/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:50:52 crc kubenswrapper[4739]: I1008 22:50:52.687648 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48_8ad88d67-b089-4777-be25-7c61f66c18c7/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:50:52 crc kubenswrapper[4739]: I1008 22:50:52.808418 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pnz9/crc-debug-nkppv" event={"ID":"ce3897c4-d3b8-41f6-9ff5-b80500339667","Type":"ContainerStarted","Data":"2002eaf2e2b4ab6965576c4d07c202cf23592fd2105e95f2df7ac9c3839042e9"} Oct 08 22:50:52 crc kubenswrapper[4739]: I1008 22:50:52.808460 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-7pnz9/crc-debug-nkppv" event={"ID":"ce3897c4-d3b8-41f6-9ff5-b80500339667","Type":"ContainerStarted","Data":"4fee40852808272973995eddd9929c87eb4e031359eb2b8d1696accee891fc4c"} Oct 08 22:50:52 crc kubenswrapper[4739]: I1008 22:50:52.828732 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7pnz9/crc-debug-nkppv" podStartSLOduration=0.828714886 podStartE2EDuration="828.714886ms" podCreationTimestamp="2025-10-08 22:50:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:50:52.824509902 +0000 UTC m=+3752.649895652" watchObservedRunningTime="2025-10-08 22:50:52.828714886 +0000 UTC m=+3752.654100636" Oct 08 22:50:52 crc kubenswrapper[4739]: I1008 22:50:52.862753 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr_5536fb34-0051-4845-98e8-050b8870274d/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:50:53 crc kubenswrapper[4739]: I1008 22:50:53.286171 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-8nvgs_373291e0-4568-47e1-a71f-b2f005e5e557/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:50:53 crc kubenswrapper[4739]: I1008 22:50:53.380470 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-p2w6r_3b70364c-a814-4e53-afb8-693faa5063ec/ssh-known-hosts-edpm-deployment/0.log" Oct 08 22:50:53 crc kubenswrapper[4739]: I1008 22:50:53.627449 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5c89ccbcd7-dlxrn_04e3fccb-ef13-4d04-9310-e1aec36adefe/proxy-server/0.log" Oct 08 22:50:53 crc kubenswrapper[4739]: I1008 22:50:53.708046 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-proxy-5c89ccbcd7-dlxrn_04e3fccb-ef13-4d04-9310-e1aec36adefe/proxy-httpd/0.log" Oct 08 22:50:53 crc kubenswrapper[4739]: I1008 22:50:53.824642 4739 generic.go:334] "Generic (PLEG): container finished" podID="ce3897c4-d3b8-41f6-9ff5-b80500339667" containerID="2002eaf2e2b4ab6965576c4d07c202cf23592fd2105e95f2df7ac9c3839042e9" exitCode=0 Oct 08 22:50:53 crc kubenswrapper[4739]: I1008 22:50:53.831621 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pnz9/crc-debug-nkppv" event={"ID":"ce3897c4-d3b8-41f6-9ff5-b80500339667","Type":"ContainerDied","Data":"2002eaf2e2b4ab6965576c4d07c202cf23592fd2105e95f2df7ac9c3839042e9"} Oct 08 22:50:53 crc kubenswrapper[4739]: I1008 22:50:53.851925 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-djtdv_2b6bdd10-ace2-453a-b0c9-d89051620215/swift-ring-rebalance/0.log" Oct 08 22:50:54 crc kubenswrapper[4739]: I1008 22:50:54.077798 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_229f0c98-b6d6-415b-b34a-6ffcd2a0ed52/account-auditor/0.log" Oct 08 22:50:54 crc kubenswrapper[4739]: I1008 22:50:54.089072 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_229f0c98-b6d6-415b-b34a-6ffcd2a0ed52/account-reaper/0.log" Oct 08 22:50:54 crc kubenswrapper[4739]: I1008 22:50:54.266706 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_229f0c98-b6d6-415b-b34a-6ffcd2a0ed52/account-replicator/0.log" Oct 08 22:50:54 crc kubenswrapper[4739]: I1008 22:50:54.292665 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_229f0c98-b6d6-415b-b34a-6ffcd2a0ed52/container-auditor/0.log" Oct 08 22:50:54 crc kubenswrapper[4739]: I1008 22:50:54.318434 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_229f0c98-b6d6-415b-b34a-6ffcd2a0ed52/account-server/0.log" Oct 08 
22:50:54 crc kubenswrapper[4739]: I1008 22:50:54.503213 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_229f0c98-b6d6-415b-b34a-6ffcd2a0ed52/container-server/0.log" Oct 08 22:50:54 crc kubenswrapper[4739]: I1008 22:50:54.506164 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_229f0c98-b6d6-415b-b34a-6ffcd2a0ed52/container-replicator/0.log" Oct 08 22:50:54 crc kubenswrapper[4739]: I1008 22:50:54.526308 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_229f0c98-b6d6-415b-b34a-6ffcd2a0ed52/container-updater/0.log" Oct 08 22:50:54 crc kubenswrapper[4739]: I1008 22:50:54.714683 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_229f0c98-b6d6-415b-b34a-6ffcd2a0ed52/object-auditor/0.log" Oct 08 22:50:54 crc kubenswrapper[4739]: I1008 22:50:54.735347 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_229f0c98-b6d6-415b-b34a-6ffcd2a0ed52/object-expirer/0.log" Oct 08 22:50:54 crc kubenswrapper[4739]: I1008 22:50:54.750703 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_229f0c98-b6d6-415b-b34a-6ffcd2a0ed52/object-replicator/0.log" Oct 08 22:50:54 crc kubenswrapper[4739]: I1008 22:50:54.907809 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_229f0c98-b6d6-415b-b34a-6ffcd2a0ed52/object-server/0.log" Oct 08 22:50:54 crc kubenswrapper[4739]: I1008 22:50:54.927252 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_229f0c98-b6d6-415b-b34a-6ffcd2a0ed52/rsync/0.log" Oct 08 22:50:54 crc kubenswrapper[4739]: I1008 22:50:54.970344 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_229f0c98-b6d6-415b-b34a-6ffcd2a0ed52/object-updater/0.log" Oct 08 22:50:55 crc kubenswrapper[4739]: I1008 22:50:55.453635 4739 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7pnz9/crc-debug-nkppv" Oct 08 22:50:55 crc kubenswrapper[4739]: I1008 22:50:55.486997 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7pnz9/crc-debug-nkppv"] Oct 08 22:50:55 crc kubenswrapper[4739]: I1008 22:50:55.494369 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7pnz9/crc-debug-nkppv"] Oct 08 22:50:55 crc kubenswrapper[4739]: I1008 22:50:55.577347 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce3897c4-d3b8-41f6-9ff5-b80500339667-host\") pod \"ce3897c4-d3b8-41f6-9ff5-b80500339667\" (UID: \"ce3897c4-d3b8-41f6-9ff5-b80500339667\") " Oct 08 22:50:55 crc kubenswrapper[4739]: I1008 22:50:55.577466 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ce3897c4-d3b8-41f6-9ff5-b80500339667-host" (OuterVolumeSpecName: "host") pod "ce3897c4-d3b8-41f6-9ff5-b80500339667" (UID: "ce3897c4-d3b8-41f6-9ff5-b80500339667"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:50:55 crc kubenswrapper[4739]: I1008 22:50:55.578343 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8fw6\" (UniqueName: \"kubernetes.io/projected/ce3897c4-d3b8-41f6-9ff5-b80500339667-kube-api-access-p8fw6\") pod \"ce3897c4-d3b8-41f6-9ff5-b80500339667\" (UID: \"ce3897c4-d3b8-41f6-9ff5-b80500339667\") " Oct 08 22:50:55 crc kubenswrapper[4739]: I1008 22:50:55.579096 4739 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce3897c4-d3b8-41f6-9ff5-b80500339667-host\") on node \"crc\" DevicePath \"\"" Oct 08 22:50:55 crc kubenswrapper[4739]: I1008 22:50:55.584385 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce3897c4-d3b8-41f6-9ff5-b80500339667-kube-api-access-p8fw6" (OuterVolumeSpecName: "kube-api-access-p8fw6") pod "ce3897c4-d3b8-41f6-9ff5-b80500339667" (UID: "ce3897c4-d3b8-41f6-9ff5-b80500339667"). InnerVolumeSpecName "kube-api-access-p8fw6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:50:55 crc kubenswrapper[4739]: I1008 22:50:55.611857 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_229f0c98-b6d6-415b-b34a-6ffcd2a0ed52/swift-recon-cron/0.log" Oct 08 22:50:55 crc kubenswrapper[4739]: I1008 22:50:55.648973 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-br4sd_76b5c31e-7a34-42b9-9ad1-4f21fc560df3/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:50:55 crc kubenswrapper[4739]: I1008 22:50:55.680889 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8fw6\" (UniqueName: \"kubernetes.io/projected/ce3897c4-d3b8-41f6-9ff5-b80500339667-kube-api-access-p8fw6\") on node \"crc\" DevicePath \"\"" Oct 08 22:50:55 crc kubenswrapper[4739]: I1008 22:50:55.838088 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce3897c4-d3b8-41f6-9ff5-b80500339667" path="/var/lib/kubelet/pods/ce3897c4-d3b8-41f6-9ff5-b80500339667/volumes" Oct 08 22:50:55 crc kubenswrapper[4739]: I1008 22:50:55.842783 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_def311ef-12ca-4d1a-972f-f2d72707a804/tempest-tests-tempest-tests-runner/0.log" Oct 08 22:50:55 crc kubenswrapper[4739]: I1008 22:50:55.849032 4739 scope.go:117] "RemoveContainer" containerID="2002eaf2e2b4ab6965576c4d07c202cf23592fd2105e95f2df7ac9c3839042e9" Oct 08 22:50:55 crc kubenswrapper[4739]: I1008 22:50:55.849104 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7pnz9/crc-debug-nkppv" Oct 08 22:50:56 crc kubenswrapper[4739]: I1008 22:50:56.078430 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb_4b344b99-c3a5-4d79-ad85-b8589d6489b0/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:50:56 crc kubenswrapper[4739]: I1008 22:50:56.094704 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_cd1babc0-3114-448d-b296-2c2680e08553/test-operator-logs-container/0.log" Oct 08 22:50:56 crc kubenswrapper[4739]: I1008 22:50:56.702429 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7pnz9/crc-debug-phgn4"] Oct 08 22:50:56 crc kubenswrapper[4739]: E1008 22:50:56.703153 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce3897c4-d3b8-41f6-9ff5-b80500339667" containerName="container-00" Oct 08 22:50:56 crc kubenswrapper[4739]: I1008 22:50:56.703169 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce3897c4-d3b8-41f6-9ff5-b80500339667" containerName="container-00" Oct 08 22:50:56 crc kubenswrapper[4739]: I1008 22:50:56.703363 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce3897c4-d3b8-41f6-9ff5-b80500339667" containerName="container-00" Oct 08 22:50:56 crc kubenswrapper[4739]: I1008 22:50:56.704073 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7pnz9/crc-debug-phgn4" Oct 08 22:50:56 crc kubenswrapper[4739]: I1008 22:50:56.799476 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c264c17-c65a-432c-ae00-52ccb5a52202-host\") pod \"crc-debug-phgn4\" (UID: \"8c264c17-c65a-432c-ae00-52ccb5a52202\") " pod="openshift-must-gather-7pnz9/crc-debug-phgn4" Oct 08 22:50:56 crc kubenswrapper[4739]: I1008 22:50:56.799679 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf8xd\" (UniqueName: \"kubernetes.io/projected/8c264c17-c65a-432c-ae00-52ccb5a52202-kube-api-access-sf8xd\") pod \"crc-debug-phgn4\" (UID: \"8c264c17-c65a-432c-ae00-52ccb5a52202\") " pod="openshift-must-gather-7pnz9/crc-debug-phgn4" Oct 08 22:50:56 crc kubenswrapper[4739]: I1008 22:50:56.902174 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf8xd\" (UniqueName: \"kubernetes.io/projected/8c264c17-c65a-432c-ae00-52ccb5a52202-kube-api-access-sf8xd\") pod \"crc-debug-phgn4\" (UID: \"8c264c17-c65a-432c-ae00-52ccb5a52202\") " pod="openshift-must-gather-7pnz9/crc-debug-phgn4" Oct 08 22:50:56 crc kubenswrapper[4739]: I1008 22:50:56.902218 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c264c17-c65a-432c-ae00-52ccb5a52202-host\") pod \"crc-debug-phgn4\" (UID: \"8c264c17-c65a-432c-ae00-52ccb5a52202\") " pod="openshift-must-gather-7pnz9/crc-debug-phgn4" Oct 08 22:50:56 crc kubenswrapper[4739]: I1008 22:50:56.902330 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c264c17-c65a-432c-ae00-52ccb5a52202-host\") pod \"crc-debug-phgn4\" (UID: \"8c264c17-c65a-432c-ae00-52ccb5a52202\") " pod="openshift-must-gather-7pnz9/crc-debug-phgn4" Oct 08 22:50:57 crc 
kubenswrapper[4739]: I1008 22:50:57.257856 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf8xd\" (UniqueName: \"kubernetes.io/projected/8c264c17-c65a-432c-ae00-52ccb5a52202-kube-api-access-sf8xd\") pod \"crc-debug-phgn4\" (UID: \"8c264c17-c65a-432c-ae00-52ccb5a52202\") " pod="openshift-must-gather-7pnz9/crc-debug-phgn4" Oct 08 22:50:57 crc kubenswrapper[4739]: I1008 22:50:57.327810 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7pnz9/crc-debug-phgn4" Oct 08 22:50:57 crc kubenswrapper[4739]: W1008 22:50:57.362884 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c264c17_c65a_432c_ae00_52ccb5a52202.slice/crio-31356ba869255c0bc8e0295aa31cb5cf0c62306a9895e9f1e383b5d63d044d7b WatchSource:0}: Error finding container 31356ba869255c0bc8e0295aa31cb5cf0c62306a9895e9f1e383b5d63d044d7b: Status 404 returned error can't find the container with id 31356ba869255c0bc8e0295aa31cb5cf0c62306a9895e9f1e383b5d63d044d7b Oct 08 22:50:57 crc kubenswrapper[4739]: I1008 22:50:57.869406 4739 generic.go:334] "Generic (PLEG): container finished" podID="8c264c17-c65a-432c-ae00-52ccb5a52202" containerID="60824653ba47a202284ef887d44666c50dd8aca531f427b0adc7b23f5ef0967d" exitCode=0 Oct 08 22:50:57 crc kubenswrapper[4739]: I1008 22:50:57.869446 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pnz9/crc-debug-phgn4" event={"ID":"8c264c17-c65a-432c-ae00-52ccb5a52202","Type":"ContainerDied","Data":"60824653ba47a202284ef887d44666c50dd8aca531f427b0adc7b23f5ef0967d"} Oct 08 22:50:57 crc kubenswrapper[4739]: I1008 22:50:57.869472 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pnz9/crc-debug-phgn4" event={"ID":"8c264c17-c65a-432c-ae00-52ccb5a52202","Type":"ContainerStarted","Data":"31356ba869255c0bc8e0295aa31cb5cf0c62306a9895e9f1e383b5d63d044d7b"} Oct 08 
22:50:57 crc kubenswrapper[4739]: I1008 22:50:57.925257 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7pnz9/crc-debug-phgn4"] Oct 08 22:50:57 crc kubenswrapper[4739]: I1008 22:50:57.931883 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7pnz9/crc-debug-phgn4"] Oct 08 22:50:59 crc kubenswrapper[4739]: I1008 22:50:59.000089 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7pnz9/crc-debug-phgn4" Oct 08 22:50:59 crc kubenswrapper[4739]: I1008 22:50:59.039840 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sf8xd\" (UniqueName: \"kubernetes.io/projected/8c264c17-c65a-432c-ae00-52ccb5a52202-kube-api-access-sf8xd\") pod \"8c264c17-c65a-432c-ae00-52ccb5a52202\" (UID: \"8c264c17-c65a-432c-ae00-52ccb5a52202\") " Oct 08 22:50:59 crc kubenswrapper[4739]: I1008 22:50:59.040134 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c264c17-c65a-432c-ae00-52ccb5a52202-host\") pod \"8c264c17-c65a-432c-ae00-52ccb5a52202\" (UID: \"8c264c17-c65a-432c-ae00-52ccb5a52202\") " Oct 08 22:50:59 crc kubenswrapper[4739]: I1008 22:50:59.040676 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8c264c17-c65a-432c-ae00-52ccb5a52202-host" (OuterVolumeSpecName: "host") pod "8c264c17-c65a-432c-ae00-52ccb5a52202" (UID: "8c264c17-c65a-432c-ae00-52ccb5a52202"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:50:59 crc kubenswrapper[4739]: I1008 22:50:59.047404 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c264c17-c65a-432c-ae00-52ccb5a52202-kube-api-access-sf8xd" (OuterVolumeSpecName: "kube-api-access-sf8xd") pod "8c264c17-c65a-432c-ae00-52ccb5a52202" (UID: "8c264c17-c65a-432c-ae00-52ccb5a52202"). InnerVolumeSpecName "kube-api-access-sf8xd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:50:59 crc kubenswrapper[4739]: I1008 22:50:59.143127 4739 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c264c17-c65a-432c-ae00-52ccb5a52202-host\") on node \"crc\" DevicePath \"\"" Oct 08 22:50:59 crc kubenswrapper[4739]: I1008 22:50:59.143173 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sf8xd\" (UniqueName: \"kubernetes.io/projected/8c264c17-c65a-432c-ae00-52ccb5a52202-kube-api-access-sf8xd\") on node \"crc\" DevicePath \"\"" Oct 08 22:50:59 crc kubenswrapper[4739]: I1008 22:50:59.832589 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c264c17-c65a-432c-ae00-52ccb5a52202" path="/var/lib/kubelet/pods/8c264c17-c65a-432c-ae00-52ccb5a52202/volumes" Oct 08 22:50:59 crc kubenswrapper[4739]: I1008 22:50:59.914037 4739 scope.go:117] "RemoveContainer" containerID="60824653ba47a202284ef887d44666c50dd8aca531f427b0adc7b23f5ef0967d" Oct 08 22:50:59 crc kubenswrapper[4739]: I1008 22:50:59.914253 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7pnz9/crc-debug-phgn4" Oct 08 22:51:01 crc kubenswrapper[4739]: I1008 22:51:01.607309 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_b6da1726-555b-4905-b565-611392fb8e67/memcached/0.log" Oct 08 22:51:21 crc kubenswrapper[4739]: I1008 22:51:21.045376 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-db9l2"] Oct 08 22:51:21 crc kubenswrapper[4739]: I1008 22:51:21.057716 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-db9l2"] Oct 08 22:51:21 crc kubenswrapper[4739]: I1008 22:51:21.766505 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:51:21 crc kubenswrapper[4739]: I1008 22:51:21.766570 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:51:21 crc kubenswrapper[4739]: I1008 22:51:21.841750 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="303c9b70-1368-4372-824e-36bca64d2aff" path="/var/lib/kubelet/pods/303c9b70-1368-4372-824e-36bca64d2aff/volumes" Oct 08 22:51:22 crc kubenswrapper[4739]: I1008 22:51:22.342682 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-2c9sc_4ecad090-144b-491d-9307-dd0d2db07490/kube-rbac-proxy/0.log" Oct 08 22:51:22 crc kubenswrapper[4739]: I1008 22:51:22.343340 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-2c9sc_4ecad090-144b-491d-9307-dd0d2db07490/manager/0.log" Oct 08 22:51:22 crc kubenswrapper[4739]: I1008 22:51:22.490650 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-zk8hf_5aa8c8d1-9588-4e0f-87e2-b44b072bef76/kube-rbac-proxy/0.log" Oct 08 22:51:22 crc kubenswrapper[4739]: I1008 22:51:22.578507 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-zk8hf_5aa8c8d1-9588-4e0f-87e2-b44b072bef76/manager/0.log" Oct 08 22:51:22 crc kubenswrapper[4739]: I1008 22:51:22.651853 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-qhqv6_17ff0c7d-5595-4d2f-b77d-0f6114746fae/kube-rbac-proxy/0.log" Oct 08 22:51:22 crc kubenswrapper[4739]: I1008 22:51:22.747236 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-qhqv6_17ff0c7d-5595-4d2f-b77d-0f6114746fae/manager/0.log" Oct 08 22:51:22 crc kubenswrapper[4739]: I1008 22:51:22.785197 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt_5bf9c162-c45e-48c6-9415-4f6e218895c0/util/0.log" Oct 08 22:51:23 crc kubenswrapper[4739]: I1008 22:51:23.011162 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt_5bf9c162-c45e-48c6-9415-4f6e218895c0/pull/0.log" Oct 08 22:51:23 crc kubenswrapper[4739]: I1008 22:51:23.019525 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt_5bf9c162-c45e-48c6-9415-4f6e218895c0/pull/0.log" Oct 08 22:51:23 crc kubenswrapper[4739]: I1008 
22:51:23.059315 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt_5bf9c162-c45e-48c6-9415-4f6e218895c0/util/0.log" Oct 08 22:51:23 crc kubenswrapper[4739]: I1008 22:51:23.187708 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt_5bf9c162-c45e-48c6-9415-4f6e218895c0/extract/0.log" Oct 08 22:51:23 crc kubenswrapper[4739]: I1008 22:51:23.201817 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt_5bf9c162-c45e-48c6-9415-4f6e218895c0/util/0.log" Oct 08 22:51:23 crc kubenswrapper[4739]: I1008 22:51:23.231902 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt_5bf9c162-c45e-48c6-9415-4f6e218895c0/pull/0.log" Oct 08 22:51:23 crc kubenswrapper[4739]: I1008 22:51:23.398827 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-wqmv7_8302b913-c934-4911-8c78-72d139019f33/kube-rbac-proxy/0.log" Oct 08 22:51:23 crc kubenswrapper[4739]: I1008 22:51:23.476604 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-wqmv7_8302b913-c934-4911-8c78-72d139019f33/manager/0.log" Oct 08 22:51:23 crc kubenswrapper[4739]: I1008 22:51:23.495462 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-xnfkp_fae90c53-9891-4664-8767-98bfab1e021a/kube-rbac-proxy/0.log" Oct 08 22:51:23 crc kubenswrapper[4739]: I1008 22:51:23.665814 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-xnfkp_fae90c53-9891-4664-8767-98bfab1e021a/manager/0.log" Oct 08 22:51:23 crc kubenswrapper[4739]: I1008 22:51:23.679116 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-8djd9_dcd83df0-3381-4d08-9818-e7f91ba6f77b/kube-rbac-proxy/0.log" Oct 08 22:51:23 crc kubenswrapper[4739]: I1008 22:51:23.761641 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-8djd9_dcd83df0-3381-4d08-9818-e7f91ba6f77b/manager/0.log" Oct 08 22:51:23 crc kubenswrapper[4739]: I1008 22:51:23.919487 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-f7hv4_c11fdec0-87d4-41db-b5d3-66155b578abe/kube-rbac-proxy/0.log" Oct 08 22:51:24 crc kubenswrapper[4739]: I1008 22:51:24.062612 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-f7hv4_c11fdec0-87d4-41db-b5d3-66155b578abe/manager/0.log" Oct 08 22:51:24 crc kubenswrapper[4739]: I1008 22:51:24.084365 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-6kp4x_f038b58d-e69a-481d-a0df-65211386c9da/kube-rbac-proxy/0.log" Oct 08 22:51:24 crc kubenswrapper[4739]: I1008 22:51:24.128303 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-6kp4x_f038b58d-e69a-481d-a0df-65211386c9da/manager/0.log" Oct 08 22:51:24 crc kubenswrapper[4739]: I1008 22:51:24.311552 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-9ls4w_58ceaa51-6704-4f1d-8aa0-2053f1c7c89d/kube-rbac-proxy/0.log" Oct 08 22:51:24 crc kubenswrapper[4739]: I1008 22:51:24.327386 4739 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-9ls4w_58ceaa51-6704-4f1d-8aa0-2053f1c7c89d/manager/0.log" Oct 08 22:51:24 crc kubenswrapper[4739]: I1008 22:51:24.492689 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-l66d4_cf67473b-9a22-492a-844d-552fc946605d/kube-rbac-proxy/0.log" Oct 08 22:51:24 crc kubenswrapper[4739]: I1008 22:51:24.570538 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-2sv26_102e8f33-2000-4a71-a337-4fa304d59e93/kube-rbac-proxy/0.log" Oct 08 22:51:24 crc kubenswrapper[4739]: I1008 22:51:24.571523 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-l66d4_cf67473b-9a22-492a-844d-552fc946605d/manager/0.log" Oct 08 22:51:24 crc kubenswrapper[4739]: I1008 22:51:24.703399 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-2sv26_102e8f33-2000-4a71-a337-4fa304d59e93/manager/0.log" Oct 08 22:51:24 crc kubenswrapper[4739]: I1008 22:51:24.803324 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-fbhjp_7e483c3d-debb-4f41-a968-0d19d337e771/kube-rbac-proxy/0.log" Oct 08 22:51:24 crc kubenswrapper[4739]: I1008 22:51:24.805742 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-fbhjp_7e483c3d-debb-4f41-a968-0d19d337e771/manager/0.log" Oct 08 22:51:25 crc kubenswrapper[4739]: I1008 22:51:25.001723 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-8d2bb_23d030c7-6a61-4ba5-9b00-018f7370ea5d/kube-rbac-proxy/0.log" Oct 08 22:51:25 crc 
kubenswrapper[4739]: I1008 22:51:25.111973 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-wz5dg_d0a97575-d460-410d-84aa-887e6d809bba/kube-rbac-proxy/0.log" Oct 08 22:51:25 crc kubenswrapper[4739]: I1008 22:51:25.130884 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-8d2bb_23d030c7-6a61-4ba5-9b00-018f7370ea5d/manager/0.log" Oct 08 22:51:25 crc kubenswrapper[4739]: I1008 22:51:25.262318 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-wz5dg_d0a97575-d460-410d-84aa-887e6d809bba/manager/0.log" Oct 08 22:51:25 crc kubenswrapper[4739]: I1008 22:51:25.319751 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67_c165e9bc-4624-4227-8a87-835cbfe8a970/kube-rbac-proxy/0.log" Oct 08 22:51:25 crc kubenswrapper[4739]: I1008 22:51:25.339404 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67_c165e9bc-4624-4227-8a87-835cbfe8a970/manager/0.log" Oct 08 22:51:25 crc kubenswrapper[4739]: I1008 22:51:25.619615 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5cb8b8594d-rkq5g_effe458d-330b-4b50-9a32-bb44bc0008ca/kube-rbac-proxy/0.log" Oct 08 22:51:25 crc kubenswrapper[4739]: I1008 22:51:25.867840 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7fbb97f6f4-lh6lp_f65a9137-93d0-424e-a839-3429f141ffa7/kube-rbac-proxy/0.log" Oct 08 22:51:26 crc kubenswrapper[4739]: I1008 22:51:26.030283 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-p7krf"] Oct 08 22:51:26 crc 
kubenswrapper[4739]: I1008 22:51:26.039096 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-p7krf"] Oct 08 22:51:26 crc kubenswrapper[4739]: I1008 22:51:26.078019 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7fbb97f6f4-lh6lp_f65a9137-93d0-424e-a839-3429f141ffa7/operator/0.log" Oct 08 22:51:26 crc kubenswrapper[4739]: I1008 22:51:26.080883 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-wfp9g_99e801cf-5f88-48ef-8193-516d2cc2bf14/registry-server/0.log" Oct 08 22:51:26 crc kubenswrapper[4739]: I1008 22:51:26.141325 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6f96f8c84-qn2t5_41df7676-d0f5-47a1-a90c-2bc3bc01e18d/kube-rbac-proxy/0.log" Oct 08 22:51:26 crc kubenswrapper[4739]: I1008 22:51:26.310883 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6f96f8c84-qn2t5_41df7676-d0f5-47a1-a90c-2bc3bc01e18d/manager/0.log" Oct 08 22:51:26 crc kubenswrapper[4739]: I1008 22:51:26.369585 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-9vz2h_e51cba46-23fd-4f5f-819d-c2e0ee77a743/manager/0.log" Oct 08 22:51:26 crc kubenswrapper[4739]: I1008 22:51:26.400574 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-9vz2h_e51cba46-23fd-4f5f-819d-c2e0ee77a743/kube-rbac-proxy/0.log" Oct 08 22:51:26 crc kubenswrapper[4739]: I1008 22:51:26.582703 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-4msd5_0c0df1f2-5ae8-40e5-8aa8-893d2e0081cf/operator/0.log" Oct 08 22:51:26 crc kubenswrapper[4739]: I1008 22:51:26.629965 4739 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-7gjhx_dd2b6037-9b5d-47cb-b057-d33546b8e74c/kube-rbac-proxy/0.log" Oct 08 22:51:26 crc kubenswrapper[4739]: I1008 22:51:26.753640 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5cb8b8594d-rkq5g_effe458d-330b-4b50-9a32-bb44bc0008ca/manager/0.log" Oct 08 22:51:26 crc kubenswrapper[4739]: I1008 22:51:26.822871 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-75d7f5797c-czmc9_aff0dabb-b21e-4507-8a13-1d391b8c4f52/kube-rbac-proxy/0.log" Oct 08 22:51:26 crc kubenswrapper[4739]: I1008 22:51:26.884335 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-7gjhx_dd2b6037-9b5d-47cb-b057-d33546b8e74c/manager/0.log" Oct 08 22:51:27 crc kubenswrapper[4739]: I1008 22:51:27.003825 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-74665f6cdc-c6dtc_18bf5f4d-f183-41c2-b1c9-a965baab8f5d/kube-rbac-proxy/0.log" Oct 08 22:51:27 crc kubenswrapper[4739]: I1008 22:51:27.051177 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-75d7f5797c-czmc9_aff0dabb-b21e-4507-8a13-1d391b8c4f52/manager/0.log" Oct 08 22:51:27 crc kubenswrapper[4739]: I1008 22:51:27.084912 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-74665f6cdc-c6dtc_18bf5f4d-f183-41c2-b1c9-a965baab8f5d/manager/0.log" Oct 08 22:51:27 crc kubenswrapper[4739]: I1008 22:51:27.167403 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5dd4499c96-l9vl5_04c34d21-ba2d-4418-83e2-ba162c64cc1e/kube-rbac-proxy/0.log" Oct 08 22:51:27 crc kubenswrapper[4739]: 
I1008 22:51:27.215595 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5dd4499c96-l9vl5_04c34d21-ba2d-4418-83e2-ba162c64cc1e/manager/0.log" Oct 08 22:51:27 crc kubenswrapper[4739]: I1008 22:51:27.837221 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8c02797-6de9-48e2-9193-cd0ad90b93fa" path="/var/lib/kubelet/pods/e8c02797-6de9-48e2-9193-cd0ad90b93fa/volumes" Oct 08 22:51:34 crc kubenswrapper[4739]: I1008 22:51:34.455698 4739 scope.go:117] "RemoveContainer" containerID="109a7b447b374b852953165b59b15ec371cf4b79af399612648bf1e1cbca2192" Oct 08 22:51:34 crc kubenswrapper[4739]: I1008 22:51:34.501267 4739 scope.go:117] "RemoveContainer" containerID="c0bb71bd47c86038304f8dfe193ffcb8d7d57e7dca9b077a602ed13486b21d4f" Oct 08 22:51:44 crc kubenswrapper[4739]: I1008 22:51:44.175295 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-rb8dx_a0840280-c534-4e58-9095-a87e9acb799a/control-plane-machine-set-operator/0.log" Oct 08 22:51:44 crc kubenswrapper[4739]: I1008 22:51:44.339783 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8679r_abd1c1de-f12b-48d3-9687-54025a7daa56/kube-rbac-proxy/0.log" Oct 08 22:51:44 crc kubenswrapper[4739]: I1008 22:51:44.356872 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8679r_abd1c1de-f12b-48d3-9687-54025a7daa56/machine-api-operator/0.log" Oct 08 22:51:51 crc kubenswrapper[4739]: I1008 22:51:51.766385 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:51:51 crc kubenswrapper[4739]: I1008 
22:51:51.767802 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:51:56 crc kubenswrapper[4739]: I1008 22:51:56.066849 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-69d28_1b95f6e0-c1ca-4d45-82df-49d302f081ec/cert-manager-controller/0.log" Oct 08 22:51:56 crc kubenswrapper[4739]: I1008 22:51:56.219681 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-qqgbk_40bc8c1c-ef2f-4374-b80d-f402929336c3/cert-manager-cainjector/0.log" Oct 08 22:51:56 crc kubenswrapper[4739]: I1008 22:51:56.236975 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-hmtjz_551e5827-2d65-48ab-90a0-6e46341e2292/cert-manager-webhook/0.log" Oct 08 22:52:07 crc kubenswrapper[4739]: I1008 22:52:07.443341 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-fcg68_d26121b1-736e-49d5-9241-bd8e8e7706c5/nmstate-console-plugin/0.log" Oct 08 22:52:07 crc kubenswrapper[4739]: I1008 22:52:07.655928 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-lj8qk_64c2a4a6-267c-484e-b36f-95d7540531ef/nmstate-handler/0.log" Oct 08 22:52:07 crc kubenswrapper[4739]: I1008 22:52:07.660740 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-pz8z2_ad883073-96a0-4558-9517-2f59f2e1472e/kube-rbac-proxy/0.log" Oct 08 22:52:07 crc kubenswrapper[4739]: I1008 22:52:07.690844 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-pz8z2_ad883073-96a0-4558-9517-2f59f2e1472e/nmstate-metrics/0.log" Oct 08 22:52:07 crc kubenswrapper[4739]: I1008 22:52:07.840121 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-mt6zh_d8c93273-ced1-4664-9585-47ce49a29326/nmstate-operator/0.log" Oct 08 22:52:07 crc kubenswrapper[4739]: I1008 22:52:07.908284 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-pjkfn_dd0f36c5-926b-4678-b2d0-342a3f2f1d1f/nmstate-webhook/0.log" Oct 08 22:52:21 crc kubenswrapper[4739]: I1008 22:52:21.475090 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-649bd47b54-74dfs_ab6cc895-0aa3-49a5-bec3-38efa4dd348f/kube-rbac-proxy/0.log" Oct 08 22:52:21 crc kubenswrapper[4739]: I1008 22:52:21.531253 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-649bd47b54-74dfs_ab6cc895-0aa3-49a5-bec3-38efa4dd348f/manager/0.log" Oct 08 22:52:21 crc kubenswrapper[4739]: I1008 22:52:21.765948 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:52:21 crc kubenswrapper[4739]: I1008 22:52:21.766013 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:52:21 crc kubenswrapper[4739]: I1008 22:52:21.766060 4739 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" Oct 08 22:52:21 crc kubenswrapper[4739]: I1008 22:52:21.767017 4739 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"798afc98cb7bcde18fde2f74c09601ba5107495b2a2a7e64ab302b80311e0904"} pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 22:52:21 crc kubenswrapper[4739]: I1008 22:52:21.767087 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" containerID="cri-o://798afc98cb7bcde18fde2f74c09601ba5107495b2a2a7e64ab302b80311e0904" gracePeriod=600 Oct 08 22:52:21 crc kubenswrapper[4739]: E1008 22:52:21.896087 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:52:22 crc kubenswrapper[4739]: I1008 22:52:22.700249 4739 generic.go:334] "Generic (PLEG): container finished" podID="9707b708-016c-4e06-86db-0332e2ca37db" containerID="798afc98cb7bcde18fde2f74c09601ba5107495b2a2a7e64ab302b80311e0904" exitCode=0 Oct 08 22:52:22 crc kubenswrapper[4739]: I1008 22:52:22.700445 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerDied","Data":"798afc98cb7bcde18fde2f74c09601ba5107495b2a2a7e64ab302b80311e0904"} 
Oct 08 22:52:22 crc kubenswrapper[4739]: I1008 22:52:22.700640 4739 scope.go:117] "RemoveContainer" containerID="7de8ba5fe610ee36041a9b93b47373e73c6abc507c69b97a48a00864c74b8105" Oct 08 22:52:22 crc kubenswrapper[4739]: I1008 22:52:22.701530 4739 scope.go:117] "RemoveContainer" containerID="798afc98cb7bcde18fde2f74c09601ba5107495b2a2a7e64ab302b80311e0904" Oct 08 22:52:22 crc kubenswrapper[4739]: E1008 22:52:22.701867 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:52:33 crc kubenswrapper[4739]: I1008 22:52:33.824011 4739 scope.go:117] "RemoveContainer" containerID="798afc98cb7bcde18fde2f74c09601ba5107495b2a2a7e64ab302b80311e0904" Oct 08 22:52:33 crc kubenswrapper[4739]: E1008 22:52:33.824863 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:52:35 crc kubenswrapper[4739]: I1008 22:52:35.332336 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-ps6c2_714c1b10-3e7c-4a8a-a346-8e37f9f476e6/kube-rbac-proxy/0.log" Oct 08 22:52:35 crc kubenswrapper[4739]: I1008 22:52:35.539356 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-ps6c2_714c1b10-3e7c-4a8a-a346-8e37f9f476e6/controller/0.log" Oct 08 22:52:35 crc 
kubenswrapper[4739]: I1008 22:52:35.591303 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/cp-frr-files/0.log" Oct 08 22:52:35 crc kubenswrapper[4739]: I1008 22:52:35.756528 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/cp-frr-files/0.log" Oct 08 22:52:35 crc kubenswrapper[4739]: I1008 22:52:35.779964 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/cp-metrics/0.log" Oct 08 22:52:35 crc kubenswrapper[4739]: I1008 22:52:35.788183 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/cp-reloader/0.log" Oct 08 22:52:35 crc kubenswrapper[4739]: I1008 22:52:35.793296 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/cp-reloader/0.log" Oct 08 22:52:36 crc kubenswrapper[4739]: I1008 22:52:36.007573 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/cp-frr-files/0.log" Oct 08 22:52:36 crc kubenswrapper[4739]: I1008 22:52:36.041865 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/cp-reloader/0.log" Oct 08 22:52:36 crc kubenswrapper[4739]: I1008 22:52:36.049265 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/cp-metrics/0.log" Oct 08 22:52:36 crc kubenswrapper[4739]: I1008 22:52:36.058967 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/cp-metrics/0.log" Oct 08 22:52:36 crc kubenswrapper[4739]: I1008 22:52:36.213746 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/cp-frr-files/0.log" Oct 08 22:52:36 crc kubenswrapper[4739]: I1008 22:52:36.245198 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/controller/0.log" Oct 08 22:52:36 crc kubenswrapper[4739]: I1008 22:52:36.245635 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/cp-reloader/0.log" Oct 08 22:52:36 crc kubenswrapper[4739]: I1008 22:52:36.290134 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/cp-metrics/0.log" Oct 08 22:52:36 crc kubenswrapper[4739]: I1008 22:52:36.440866 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/frr-metrics/0.log" Oct 08 22:52:36 crc kubenswrapper[4739]: I1008 22:52:36.449904 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/kube-rbac-proxy/0.log" Oct 08 22:52:36 crc kubenswrapper[4739]: I1008 22:52:36.520739 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/kube-rbac-proxy-frr/0.log" Oct 08 22:52:36 crc kubenswrapper[4739]: I1008 22:52:36.660025 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/reloader/0.log" Oct 08 22:52:36 crc kubenswrapper[4739]: I1008 22:52:36.746735 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-rmj6d_d4c4cac2-1e41-4504-8620-7ccda1212854/frr-k8s-webhook-server/0.log" Oct 08 22:52:36 crc kubenswrapper[4739]: I1008 22:52:36.934070 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6cb897566c-v8wrp_091a7a04-1c08-4327-8d95-e63d3b526055/manager/0.log" Oct 08 22:52:37 crc kubenswrapper[4739]: I1008 22:52:37.192122 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5944674dc5-rrrsh_1b3a65cd-e578-4a5b-acfe-47ec21816d80/webhook-server/0.log" Oct 08 22:52:37 crc kubenswrapper[4739]: I1008 22:52:37.228065 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-kn5nb_6f0c8acb-ceae-4aea-861e-396755963f03/kube-rbac-proxy/0.log" Oct 08 22:52:38 crc kubenswrapper[4739]: I1008 22:52:38.020509 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-kn5nb_6f0c8acb-ceae-4aea-861e-396755963f03/speaker/0.log" Oct 08 22:52:38 crc kubenswrapper[4739]: I1008 22:52:38.051709 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/frr/0.log" Oct 08 22:52:45 crc kubenswrapper[4739]: I1008 22:52:45.822191 4739 scope.go:117] "RemoveContainer" containerID="798afc98cb7bcde18fde2f74c09601ba5107495b2a2a7e64ab302b80311e0904" Oct 08 22:52:45 crc kubenswrapper[4739]: E1008 22:52:45.822994 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:52:51 crc kubenswrapper[4739]: I1008 22:52:51.431309 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c_ebd8cd89-f73e-48cc-99b9-59f14f0d9d54/util/0.log" Oct 08 22:52:51 crc kubenswrapper[4739]: I1008 
22:52:51.627465 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c_ebd8cd89-f73e-48cc-99b9-59f14f0d9d54/pull/0.log" Oct 08 22:52:51 crc kubenswrapper[4739]: I1008 22:52:51.631393 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c_ebd8cd89-f73e-48cc-99b9-59f14f0d9d54/util/0.log" Oct 08 22:52:51 crc kubenswrapper[4739]: I1008 22:52:51.662585 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c_ebd8cd89-f73e-48cc-99b9-59f14f0d9d54/pull/0.log" Oct 08 22:52:51 crc kubenswrapper[4739]: I1008 22:52:51.794390 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c_ebd8cd89-f73e-48cc-99b9-59f14f0d9d54/util/0.log" Oct 08 22:52:51 crc kubenswrapper[4739]: I1008 22:52:51.815611 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c_ebd8cd89-f73e-48cc-99b9-59f14f0d9d54/pull/0.log" Oct 08 22:52:51 crc kubenswrapper[4739]: I1008 22:52:51.860086 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c_ebd8cd89-f73e-48cc-99b9-59f14f0d9d54/extract/0.log" Oct 08 22:52:51 crc kubenswrapper[4739]: I1008 22:52:51.987061 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x_722cc9ee-6835-4563-a73d-9312179a7901/util/0.log" Oct 08 22:52:52 crc kubenswrapper[4739]: I1008 22:52:52.126309 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x_722cc9ee-6835-4563-a73d-9312179a7901/util/0.log" Oct 08 22:52:52 crc kubenswrapper[4739]: I1008 22:52:52.128825 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x_722cc9ee-6835-4563-a73d-9312179a7901/pull/0.log" Oct 08 22:52:52 crc kubenswrapper[4739]: I1008 22:52:52.187523 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x_722cc9ee-6835-4563-a73d-9312179a7901/pull/0.log" Oct 08 22:52:52 crc kubenswrapper[4739]: I1008 22:52:52.338661 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x_722cc9ee-6835-4563-a73d-9312179a7901/util/0.log" Oct 08 22:52:52 crc kubenswrapper[4739]: I1008 22:52:52.341403 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x_722cc9ee-6835-4563-a73d-9312179a7901/pull/0.log" Oct 08 22:52:52 crc kubenswrapper[4739]: I1008 22:52:52.372245 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x_722cc9ee-6835-4563-a73d-9312179a7901/extract/0.log" Oct 08 22:52:52 crc kubenswrapper[4739]: I1008 22:52:52.499057 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8_ca4c31bf-494f-4278-97be-ef83f58c5c1b/util/0.log" Oct 08 22:52:52 crc kubenswrapper[4739]: I1008 22:52:52.673182 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8_ca4c31bf-494f-4278-97be-ef83f58c5c1b/pull/0.log" Oct 08 
22:52:52 crc kubenswrapper[4739]: I1008 22:52:52.683207 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8_ca4c31bf-494f-4278-97be-ef83f58c5c1b/util/0.log" Oct 08 22:52:52 crc kubenswrapper[4739]: I1008 22:52:52.686396 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8_ca4c31bf-494f-4278-97be-ef83f58c5c1b/pull/0.log" Oct 08 22:52:53 crc kubenswrapper[4739]: I1008 22:52:53.306134 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8_ca4c31bf-494f-4278-97be-ef83f58c5c1b/extract/0.log" Oct 08 22:52:53 crc kubenswrapper[4739]: I1008 22:52:53.316309 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8_ca4c31bf-494f-4278-97be-ef83f58c5c1b/util/0.log" Oct 08 22:52:53 crc kubenswrapper[4739]: I1008 22:52:53.317004 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8_ca4c31bf-494f-4278-97be-ef83f58c5c1b/pull/0.log" Oct 08 22:52:53 crc kubenswrapper[4739]: I1008 22:52:53.472473 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf_174d772a-ebc6-46bf-ab5f-02cdc6564283/util/0.log" Oct 08 22:52:53 crc kubenswrapper[4739]: I1008 22:52:53.671189 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf_174d772a-ebc6-46bf-ab5f-02cdc6564283/pull/0.log" Oct 08 22:52:53 crc kubenswrapper[4739]: I1008 22:52:53.701173 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf_174d772a-ebc6-46bf-ab5f-02cdc6564283/pull/0.log" Oct 08 22:52:53 crc kubenswrapper[4739]: I1008 22:52:53.741102 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf_174d772a-ebc6-46bf-ab5f-02cdc6564283/util/0.log" Oct 08 22:52:53 crc kubenswrapper[4739]: I1008 22:52:53.852661 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf_174d772a-ebc6-46bf-ab5f-02cdc6564283/util/0.log" Oct 08 22:52:53 crc kubenswrapper[4739]: I1008 22:52:53.859232 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf_174d772a-ebc6-46bf-ab5f-02cdc6564283/pull/0.log" Oct 08 22:52:53 crc kubenswrapper[4739]: I1008 22:52:53.870867 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf_174d772a-ebc6-46bf-ab5f-02cdc6564283/extract/0.log" Oct 08 22:52:54 crc kubenswrapper[4739]: I1008 22:52:54.031098 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qlrzs_9243f10e-b903-4d49-9ef7-d447cf6459fd/extract-utilities/0.log" Oct 08 22:52:54 crc kubenswrapper[4739]: I1008 22:52:54.223779 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qlrzs_9243f10e-b903-4d49-9ef7-d447cf6459fd/extract-content/0.log" Oct 08 22:52:54 crc kubenswrapper[4739]: I1008 22:52:54.233676 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qlrzs_9243f10e-b903-4d49-9ef7-d447cf6459fd/extract-content/0.log" Oct 08 22:52:54 crc kubenswrapper[4739]: I1008 22:52:54.267469 4739 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qlrzs_9243f10e-b903-4d49-9ef7-d447cf6459fd/extract-utilities/0.log" Oct 08 22:52:54 crc kubenswrapper[4739]: I1008 22:52:54.416250 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qlrzs_9243f10e-b903-4d49-9ef7-d447cf6459fd/extract-utilities/0.log" Oct 08 22:52:54 crc kubenswrapper[4739]: I1008 22:52:54.423905 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qlrzs_9243f10e-b903-4d49-9ef7-d447cf6459fd/extract-content/0.log" Oct 08 22:52:54 crc kubenswrapper[4739]: I1008 22:52:54.569064 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mmlrg_cf066fed-0185-4563-9992-0474c1761110/extract-utilities/0.log" Oct 08 22:52:54 crc kubenswrapper[4739]: I1008 22:52:54.733500 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mmlrg_cf066fed-0185-4563-9992-0474c1761110/extract-content/0.log" Oct 08 22:52:54 crc kubenswrapper[4739]: I1008 22:52:54.770685 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mmlrg_cf066fed-0185-4563-9992-0474c1761110/extract-content/0.log" Oct 08 22:52:54 crc kubenswrapper[4739]: I1008 22:52:54.824840 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mmlrg_cf066fed-0185-4563-9992-0474c1761110/extract-utilities/0.log" Oct 08 22:52:54 crc kubenswrapper[4739]: I1008 22:52:54.885938 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qlrzs_9243f10e-b903-4d49-9ef7-d447cf6459fd/registry-server/0.log" Oct 08 22:52:54 crc kubenswrapper[4739]: I1008 22:52:54.978617 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-mmlrg_cf066fed-0185-4563-9992-0474c1761110/extract-utilities/0.log" Oct 08 22:52:55 crc kubenswrapper[4739]: I1008 22:52:55.033495 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mmlrg_cf066fed-0185-4563-9992-0474c1761110/extract-content/0.log" Oct 08 22:52:55 crc kubenswrapper[4739]: I1008 22:52:55.110605 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2_491673c0-7c8a-4f56-95c4-c06e79a87512/util/0.log" Oct 08 22:52:55 crc kubenswrapper[4739]: I1008 22:52:55.450970 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2_491673c0-7c8a-4f56-95c4-c06e79a87512/util/0.log" Oct 08 22:52:55 crc kubenswrapper[4739]: I1008 22:52:55.458718 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2_491673c0-7c8a-4f56-95c4-c06e79a87512/pull/0.log" Oct 08 22:52:55 crc kubenswrapper[4739]: I1008 22:52:55.510121 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2_491673c0-7c8a-4f56-95c4-c06e79a87512/pull/0.log" Oct 08 22:52:55 crc kubenswrapper[4739]: I1008 22:52:55.665943 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2_491673c0-7c8a-4f56-95c4-c06e79a87512/util/0.log" Oct 08 22:52:55 crc kubenswrapper[4739]: I1008 22:52:55.677967 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2_491673c0-7c8a-4f56-95c4-c06e79a87512/extract/0.log" Oct 08 22:52:55 crc kubenswrapper[4739]: I1008 
22:52:55.701520 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mmlrg_cf066fed-0185-4563-9992-0474c1761110/registry-server/0.log" Oct 08 22:52:55 crc kubenswrapper[4739]: I1008 22:52:55.739128 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2_491673c0-7c8a-4f56-95c4-c06e79a87512/pull/0.log" Oct 08 22:52:55 crc kubenswrapper[4739]: I1008 22:52:55.880310 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lbrj8_0921b50a-3ca2-4f07-a060-63d6078eac48/extract-utilities/0.log" Oct 08 22:52:55 crc kubenswrapper[4739]: I1008 22:52:55.892599 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-x54q2_6122ee5d-8e01-4fb7-b6bf-fc6d4ebcf669/marketplace-operator/0.log" Oct 08 22:52:56 crc kubenswrapper[4739]: I1008 22:52:56.078953 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lbrj8_0921b50a-3ca2-4f07-a060-63d6078eac48/extract-utilities/0.log" Oct 08 22:52:56 crc kubenswrapper[4739]: I1008 22:52:56.081921 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lbrj8_0921b50a-3ca2-4f07-a060-63d6078eac48/extract-content/0.log" Oct 08 22:52:56 crc kubenswrapper[4739]: I1008 22:52:56.098569 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lbrj8_0921b50a-3ca2-4f07-a060-63d6078eac48/extract-content/0.log" Oct 08 22:52:56 crc kubenswrapper[4739]: I1008 22:52:56.358467 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lbrj8_0921b50a-3ca2-4f07-a060-63d6078eac48/extract-content/0.log" Oct 08 22:52:56 crc kubenswrapper[4739]: I1008 22:52:56.410232 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-jhjjp_e4a5ca24-fef9-4cca-99c2-eb2c255ee795/extract-utilities/0.log" Oct 08 22:52:56 crc kubenswrapper[4739]: I1008 22:52:56.410257 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lbrj8_0921b50a-3ca2-4f07-a060-63d6078eac48/extract-utilities/0.log" Oct 08 22:52:56 crc kubenswrapper[4739]: I1008 22:52:56.423495 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lbrj8_0921b50a-3ca2-4f07-a060-63d6078eac48/registry-server/0.log" Oct 08 22:52:56 crc kubenswrapper[4739]: I1008 22:52:56.530559 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jhjjp_e4a5ca24-fef9-4cca-99c2-eb2c255ee795/extract-utilities/0.log" Oct 08 22:52:56 crc kubenswrapper[4739]: I1008 22:52:56.593612 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jhjjp_e4a5ca24-fef9-4cca-99c2-eb2c255ee795/extract-content/0.log" Oct 08 22:52:56 crc kubenswrapper[4739]: I1008 22:52:56.595561 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jhjjp_e4a5ca24-fef9-4cca-99c2-eb2c255ee795/extract-content/0.log" Oct 08 22:52:56 crc kubenswrapper[4739]: I1008 22:52:56.754654 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jhjjp_e4a5ca24-fef9-4cca-99c2-eb2c255ee795/extract-utilities/0.log" Oct 08 22:52:56 crc kubenswrapper[4739]: I1008 22:52:56.757377 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jhjjp_e4a5ca24-fef9-4cca-99c2-eb2c255ee795/extract-content/0.log" Oct 08 22:52:57 crc kubenswrapper[4739]: I1008 22:52:57.537007 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jhjjp_e4a5ca24-fef9-4cca-99c2-eb2c255ee795/registry-server/0.log" Oct 08 
22:52:58 crc kubenswrapper[4739]: I1008 22:52:58.821605 4739 scope.go:117] "RemoveContainer" containerID="798afc98cb7bcde18fde2f74c09601ba5107495b2a2a7e64ab302b80311e0904" Oct 08 22:52:58 crc kubenswrapper[4739]: E1008 22:52:58.822095 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:53:10 crc kubenswrapper[4739]: I1008 22:53:10.524953 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-7fkfg_bbafdc6e-b606-4274-aebb-eb1d38bf693e/prometheus-operator/0.log" Oct 08 22:53:10 crc kubenswrapper[4739]: I1008 22:53:10.698690 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs_711de91c-2cc4-4161-ac30-0de8e68283d5/prometheus-operator-admission-webhook/0.log" Oct 08 22:53:10 crc kubenswrapper[4739]: I1008 22:53:10.765377 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd_80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2/prometheus-operator-admission-webhook/0.log" Oct 08 22:53:10 crc kubenswrapper[4739]: I1008 22:53:10.940015 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-bpxcp_5c19c761-fcc4-474d-9d87-7c2e07755190/perses-operator/0.log" Oct 08 22:53:10 crc kubenswrapper[4739]: I1008 22:53:10.960733 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-6kpbp_4c13b744-b744-49f3-8ba5-241ab69fdab9/operator/0.log" Oct 08 22:53:13 crc 
kubenswrapper[4739]: I1008 22:53:13.822081 4739 scope.go:117] "RemoveContainer" containerID="798afc98cb7bcde18fde2f74c09601ba5107495b2a2a7e64ab302b80311e0904" Oct 08 22:53:13 crc kubenswrapper[4739]: E1008 22:53:13.822931 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:53:24 crc kubenswrapper[4739]: I1008 22:53:24.904645 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-649bd47b54-74dfs_ab6cc895-0aa3-49a5-bec3-38efa4dd348f/kube-rbac-proxy/0.log" Oct 08 22:53:24 crc kubenswrapper[4739]: I1008 22:53:24.964422 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-649bd47b54-74dfs_ab6cc895-0aa3-49a5-bec3-38efa4dd348f/manager/0.log" Oct 08 22:53:27 crc kubenswrapper[4739]: I1008 22:53:27.822721 4739 scope.go:117] "RemoveContainer" containerID="798afc98cb7bcde18fde2f74c09601ba5107495b2a2a7e64ab302b80311e0904" Oct 08 22:53:27 crc kubenswrapper[4739]: E1008 22:53:27.824119 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:53:35 crc kubenswrapper[4739]: I1008 22:53:35.640810 4739 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-p7pvx"] Oct 08 22:53:35 crc kubenswrapper[4739]: E1008 22:53:35.643125 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c264c17-c65a-432c-ae00-52ccb5a52202" containerName="container-00" Oct 08 22:53:35 crc kubenswrapper[4739]: I1008 22:53:35.643252 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c264c17-c65a-432c-ae00-52ccb5a52202" containerName="container-00" Oct 08 22:53:35 crc kubenswrapper[4739]: I1008 22:53:35.643559 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c264c17-c65a-432c-ae00-52ccb5a52202" containerName="container-00" Oct 08 22:53:35 crc kubenswrapper[4739]: I1008 22:53:35.645415 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p7pvx" Oct 08 22:53:35 crc kubenswrapper[4739]: I1008 22:53:35.654743 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p7pvx"] Oct 08 22:53:35 crc kubenswrapper[4739]: I1008 22:53:35.694591 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jcw9\" (UniqueName: \"kubernetes.io/projected/17cf8b45-a966-4225-bc12-ab94c006f2c9-kube-api-access-7jcw9\") pod \"redhat-operators-p7pvx\" (UID: \"17cf8b45-a966-4225-bc12-ab94c006f2c9\") " pod="openshift-marketplace/redhat-operators-p7pvx" Oct 08 22:53:35 crc kubenswrapper[4739]: I1008 22:53:35.694675 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17cf8b45-a966-4225-bc12-ab94c006f2c9-catalog-content\") pod \"redhat-operators-p7pvx\" (UID: \"17cf8b45-a966-4225-bc12-ab94c006f2c9\") " pod="openshift-marketplace/redhat-operators-p7pvx" Oct 08 22:53:35 crc kubenswrapper[4739]: I1008 22:53:35.694774 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17cf8b45-a966-4225-bc12-ab94c006f2c9-utilities\") pod \"redhat-operators-p7pvx\" (UID: \"17cf8b45-a966-4225-bc12-ab94c006f2c9\") " pod="openshift-marketplace/redhat-operators-p7pvx" Oct 08 22:53:35 crc kubenswrapper[4739]: I1008 22:53:35.796543 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jcw9\" (UniqueName: \"kubernetes.io/projected/17cf8b45-a966-4225-bc12-ab94c006f2c9-kube-api-access-7jcw9\") pod \"redhat-operators-p7pvx\" (UID: \"17cf8b45-a966-4225-bc12-ab94c006f2c9\") " pod="openshift-marketplace/redhat-operators-p7pvx" Oct 08 22:53:35 crc kubenswrapper[4739]: I1008 22:53:35.796594 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17cf8b45-a966-4225-bc12-ab94c006f2c9-catalog-content\") pod \"redhat-operators-p7pvx\" (UID: \"17cf8b45-a966-4225-bc12-ab94c006f2c9\") " pod="openshift-marketplace/redhat-operators-p7pvx" Oct 08 22:53:35 crc kubenswrapper[4739]: I1008 22:53:35.796624 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17cf8b45-a966-4225-bc12-ab94c006f2c9-utilities\") pod \"redhat-operators-p7pvx\" (UID: \"17cf8b45-a966-4225-bc12-ab94c006f2c9\") " pod="openshift-marketplace/redhat-operators-p7pvx" Oct 08 22:53:35 crc kubenswrapper[4739]: I1008 22:53:35.797432 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17cf8b45-a966-4225-bc12-ab94c006f2c9-catalog-content\") pod \"redhat-operators-p7pvx\" (UID: \"17cf8b45-a966-4225-bc12-ab94c006f2c9\") " pod="openshift-marketplace/redhat-operators-p7pvx" Oct 08 22:53:35 crc kubenswrapper[4739]: I1008 22:53:35.797534 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/17cf8b45-a966-4225-bc12-ab94c006f2c9-utilities\") pod \"redhat-operators-p7pvx\" (UID: \"17cf8b45-a966-4225-bc12-ab94c006f2c9\") " pod="openshift-marketplace/redhat-operators-p7pvx" Oct 08 22:53:35 crc kubenswrapper[4739]: I1008 22:53:35.829801 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jcw9\" (UniqueName: \"kubernetes.io/projected/17cf8b45-a966-4225-bc12-ab94c006f2c9-kube-api-access-7jcw9\") pod \"redhat-operators-p7pvx\" (UID: \"17cf8b45-a966-4225-bc12-ab94c006f2c9\") " pod="openshift-marketplace/redhat-operators-p7pvx" Oct 08 22:53:35 crc kubenswrapper[4739]: I1008 22:53:35.962545 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p7pvx" Oct 08 22:53:36 crc kubenswrapper[4739]: I1008 22:53:36.604212 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p7pvx"] Oct 08 22:53:37 crc kubenswrapper[4739]: I1008 22:53:37.466790 4739 generic.go:334] "Generic (PLEG): container finished" podID="17cf8b45-a966-4225-bc12-ab94c006f2c9" containerID="ab58bb8738661ff71b0c3a26d47351deb2de03f990c0077d17bbc038970e9a00" exitCode=0 Oct 08 22:53:37 crc kubenswrapper[4739]: I1008 22:53:37.467367 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7pvx" event={"ID":"17cf8b45-a966-4225-bc12-ab94c006f2c9","Type":"ContainerDied","Data":"ab58bb8738661ff71b0c3a26d47351deb2de03f990c0077d17bbc038970e9a00"} Oct 08 22:53:37 crc kubenswrapper[4739]: I1008 22:53:37.467393 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7pvx" event={"ID":"17cf8b45-a966-4225-bc12-ab94c006f2c9","Type":"ContainerStarted","Data":"e1d94f830d14faccb99e45a23836c6e045be1dcbfb416f942baa608082fd5c2d"} Oct 08 22:53:38 crc kubenswrapper[4739]: I1008 22:53:38.485271 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-p7pvx" event={"ID":"17cf8b45-a966-4225-bc12-ab94c006f2c9","Type":"ContainerStarted","Data":"cf9ae1e0a2c1052a66ecd1506cda853e71a885162181d78c4cb74718f9910782"} Oct 08 22:53:39 crc kubenswrapper[4739]: I1008 22:53:39.822235 4739 scope.go:117] "RemoveContainer" containerID="798afc98cb7bcde18fde2f74c09601ba5107495b2a2a7e64ab302b80311e0904" Oct 08 22:53:39 crc kubenswrapper[4739]: E1008 22:53:39.822820 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:53:44 crc kubenswrapper[4739]: I1008 22:53:44.544872 4739 generic.go:334] "Generic (PLEG): container finished" podID="17cf8b45-a966-4225-bc12-ab94c006f2c9" containerID="cf9ae1e0a2c1052a66ecd1506cda853e71a885162181d78c4cb74718f9910782" exitCode=0 Oct 08 22:53:44 crc kubenswrapper[4739]: I1008 22:53:44.544962 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7pvx" event={"ID":"17cf8b45-a966-4225-bc12-ab94c006f2c9","Type":"ContainerDied","Data":"cf9ae1e0a2c1052a66ecd1506cda853e71a885162181d78c4cb74718f9910782"} Oct 08 22:53:45 crc kubenswrapper[4739]: I1008 22:53:45.556231 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7pvx" event={"ID":"17cf8b45-a966-4225-bc12-ab94c006f2c9","Type":"ContainerStarted","Data":"a853d3eef6d468622c5bb98b041d33ff912a63b9088bfaab6869aeb7b470ad64"} Oct 08 22:53:45 crc kubenswrapper[4739]: I1008 22:53:45.589185 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p7pvx" podStartSLOduration=3.090888614 
podStartE2EDuration="10.589161706s" podCreationTimestamp="2025-10-08 22:53:35 +0000 UTC" firstStartedPulling="2025-10-08 22:53:37.469959191 +0000 UTC m=+3917.295344941" lastFinishedPulling="2025-10-08 22:53:44.968232283 +0000 UTC m=+3924.793618033" observedRunningTime="2025-10-08 22:53:45.581042247 +0000 UTC m=+3925.406428007" watchObservedRunningTime="2025-10-08 22:53:45.589161706 +0000 UTC m=+3925.414547456" Oct 08 22:53:45 crc kubenswrapper[4739]: I1008 22:53:45.963476 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p7pvx" Oct 08 22:53:45 crc kubenswrapper[4739]: I1008 22:53:45.963719 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p7pvx" Oct 08 22:53:47 crc kubenswrapper[4739]: I1008 22:53:47.011030 4739 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-p7pvx" podUID="17cf8b45-a966-4225-bc12-ab94c006f2c9" containerName="registry-server" probeResult="failure" output=< Oct 08 22:53:47 crc kubenswrapper[4739]: timeout: failed to connect service ":50051" within 1s Oct 08 22:53:47 crc kubenswrapper[4739]: > Oct 08 22:53:53 crc kubenswrapper[4739]: I1008 22:53:53.823022 4739 scope.go:117] "RemoveContainer" containerID="798afc98cb7bcde18fde2f74c09601ba5107495b2a2a7e64ab302b80311e0904" Oct 08 22:53:53 crc kubenswrapper[4739]: E1008 22:53:53.824657 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:53:56 crc kubenswrapper[4739]: I1008 22:53:56.053280 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-operators-p7pvx" Oct 08 22:53:56 crc kubenswrapper[4739]: I1008 22:53:56.105204 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p7pvx" Oct 08 22:53:56 crc kubenswrapper[4739]: I1008 22:53:56.302607 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p7pvx"] Oct 08 22:53:57 crc kubenswrapper[4739]: I1008 22:53:57.692574 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p7pvx" podUID="17cf8b45-a966-4225-bc12-ab94c006f2c9" containerName="registry-server" containerID="cri-o://a853d3eef6d468622c5bb98b041d33ff912a63b9088bfaab6869aeb7b470ad64" gracePeriod=2 Oct 08 22:53:58 crc kubenswrapper[4739]: I1008 22:53:58.315021 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p7pvx" Oct 08 22:53:58 crc kubenswrapper[4739]: I1008 22:53:58.401925 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17cf8b45-a966-4225-bc12-ab94c006f2c9-utilities\") pod \"17cf8b45-a966-4225-bc12-ab94c006f2c9\" (UID: \"17cf8b45-a966-4225-bc12-ab94c006f2c9\") " Oct 08 22:53:58 crc kubenswrapper[4739]: I1008 22:53:58.402042 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jcw9\" (UniqueName: \"kubernetes.io/projected/17cf8b45-a966-4225-bc12-ab94c006f2c9-kube-api-access-7jcw9\") pod \"17cf8b45-a966-4225-bc12-ab94c006f2c9\" (UID: \"17cf8b45-a966-4225-bc12-ab94c006f2c9\") " Oct 08 22:53:58 crc kubenswrapper[4739]: I1008 22:53:58.402291 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17cf8b45-a966-4225-bc12-ab94c006f2c9-catalog-content\") pod 
\"17cf8b45-a966-4225-bc12-ab94c006f2c9\" (UID: \"17cf8b45-a966-4225-bc12-ab94c006f2c9\") " Oct 08 22:53:58 crc kubenswrapper[4739]: I1008 22:53:58.402855 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17cf8b45-a966-4225-bc12-ab94c006f2c9-utilities" (OuterVolumeSpecName: "utilities") pod "17cf8b45-a966-4225-bc12-ab94c006f2c9" (UID: "17cf8b45-a966-4225-bc12-ab94c006f2c9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:53:58 crc kubenswrapper[4739]: I1008 22:53:58.408504 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17cf8b45-a966-4225-bc12-ab94c006f2c9-kube-api-access-7jcw9" (OuterVolumeSpecName: "kube-api-access-7jcw9") pod "17cf8b45-a966-4225-bc12-ab94c006f2c9" (UID: "17cf8b45-a966-4225-bc12-ab94c006f2c9"). InnerVolumeSpecName "kube-api-access-7jcw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:53:58 crc kubenswrapper[4739]: I1008 22:53:58.503154 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17cf8b45-a966-4225-bc12-ab94c006f2c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17cf8b45-a966-4225-bc12-ab94c006f2c9" (UID: "17cf8b45-a966-4225-bc12-ab94c006f2c9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:53:58 crc kubenswrapper[4739]: I1008 22:53:58.503810 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17cf8b45-a966-4225-bc12-ab94c006f2c9-catalog-content\") pod \"17cf8b45-a966-4225-bc12-ab94c006f2c9\" (UID: \"17cf8b45-a966-4225-bc12-ab94c006f2c9\") " Oct 08 22:53:58 crc kubenswrapper[4739]: W1008 22:53:58.504304 4739 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/17cf8b45-a966-4225-bc12-ab94c006f2c9/volumes/kubernetes.io~empty-dir/catalog-content Oct 08 22:53:58 crc kubenswrapper[4739]: I1008 22:53:58.504331 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17cf8b45-a966-4225-bc12-ab94c006f2c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17cf8b45-a966-4225-bc12-ab94c006f2c9" (UID: "17cf8b45-a966-4225-bc12-ab94c006f2c9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:53:58 crc kubenswrapper[4739]: I1008 22:53:58.504487 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jcw9\" (UniqueName: \"kubernetes.io/projected/17cf8b45-a966-4225-bc12-ab94c006f2c9-kube-api-access-7jcw9\") on node \"crc\" DevicePath \"\"" Oct 08 22:53:58 crc kubenswrapper[4739]: I1008 22:53:58.504506 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17cf8b45-a966-4225-bc12-ab94c006f2c9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:53:58 crc kubenswrapper[4739]: I1008 22:53:58.504515 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17cf8b45-a966-4225-bc12-ab94c006f2c9-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:53:58 crc kubenswrapper[4739]: I1008 22:53:58.705188 4739 generic.go:334] "Generic (PLEG): container finished" podID="17cf8b45-a966-4225-bc12-ab94c006f2c9" containerID="a853d3eef6d468622c5bb98b041d33ff912a63b9088bfaab6869aeb7b470ad64" exitCode=0 Oct 08 22:53:58 crc kubenswrapper[4739]: I1008 22:53:58.705231 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7pvx" event={"ID":"17cf8b45-a966-4225-bc12-ab94c006f2c9","Type":"ContainerDied","Data":"a853d3eef6d468622c5bb98b041d33ff912a63b9088bfaab6869aeb7b470ad64"} Oct 08 22:53:58 crc kubenswrapper[4739]: I1008 22:53:58.705265 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7pvx" event={"ID":"17cf8b45-a966-4225-bc12-ab94c006f2c9","Type":"ContainerDied","Data":"e1d94f830d14faccb99e45a23836c6e045be1dcbfb416f942baa608082fd5c2d"} Oct 08 22:53:58 crc kubenswrapper[4739]: I1008 22:53:58.705285 4739 scope.go:117] "RemoveContainer" containerID="a853d3eef6d468622c5bb98b041d33ff912a63b9088bfaab6869aeb7b470ad64" Oct 08 22:53:58 crc kubenswrapper[4739]: I1008 22:53:58.705446 
4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p7pvx" Oct 08 22:53:58 crc kubenswrapper[4739]: I1008 22:53:58.745932 4739 scope.go:117] "RemoveContainer" containerID="cf9ae1e0a2c1052a66ecd1506cda853e71a885162181d78c4cb74718f9910782" Oct 08 22:53:58 crc kubenswrapper[4739]: I1008 22:53:58.746974 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p7pvx"] Oct 08 22:53:58 crc kubenswrapper[4739]: I1008 22:53:58.758991 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p7pvx"] Oct 08 22:53:58 crc kubenswrapper[4739]: I1008 22:53:58.767909 4739 scope.go:117] "RemoveContainer" containerID="ab58bb8738661ff71b0c3a26d47351deb2de03f990c0077d17bbc038970e9a00" Oct 08 22:53:58 crc kubenswrapper[4739]: I1008 22:53:58.828762 4739 scope.go:117] "RemoveContainer" containerID="a853d3eef6d468622c5bb98b041d33ff912a63b9088bfaab6869aeb7b470ad64" Oct 08 22:53:58 crc kubenswrapper[4739]: E1008 22:53:58.829205 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a853d3eef6d468622c5bb98b041d33ff912a63b9088bfaab6869aeb7b470ad64\": container with ID starting with a853d3eef6d468622c5bb98b041d33ff912a63b9088bfaab6869aeb7b470ad64 not found: ID does not exist" containerID="a853d3eef6d468622c5bb98b041d33ff912a63b9088bfaab6869aeb7b470ad64" Oct 08 22:53:58 crc kubenswrapper[4739]: I1008 22:53:58.829248 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a853d3eef6d468622c5bb98b041d33ff912a63b9088bfaab6869aeb7b470ad64"} err="failed to get container status \"a853d3eef6d468622c5bb98b041d33ff912a63b9088bfaab6869aeb7b470ad64\": rpc error: code = NotFound desc = could not find container \"a853d3eef6d468622c5bb98b041d33ff912a63b9088bfaab6869aeb7b470ad64\": container with ID starting with 
a853d3eef6d468622c5bb98b041d33ff912a63b9088bfaab6869aeb7b470ad64 not found: ID does not exist" Oct 08 22:53:58 crc kubenswrapper[4739]: I1008 22:53:58.829274 4739 scope.go:117] "RemoveContainer" containerID="cf9ae1e0a2c1052a66ecd1506cda853e71a885162181d78c4cb74718f9910782" Oct 08 22:53:58 crc kubenswrapper[4739]: E1008 22:53:58.829576 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf9ae1e0a2c1052a66ecd1506cda853e71a885162181d78c4cb74718f9910782\": container with ID starting with cf9ae1e0a2c1052a66ecd1506cda853e71a885162181d78c4cb74718f9910782 not found: ID does not exist" containerID="cf9ae1e0a2c1052a66ecd1506cda853e71a885162181d78c4cb74718f9910782" Oct 08 22:53:58 crc kubenswrapper[4739]: I1008 22:53:58.829599 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf9ae1e0a2c1052a66ecd1506cda853e71a885162181d78c4cb74718f9910782"} err="failed to get container status \"cf9ae1e0a2c1052a66ecd1506cda853e71a885162181d78c4cb74718f9910782\": rpc error: code = NotFound desc = could not find container \"cf9ae1e0a2c1052a66ecd1506cda853e71a885162181d78c4cb74718f9910782\": container with ID starting with cf9ae1e0a2c1052a66ecd1506cda853e71a885162181d78c4cb74718f9910782 not found: ID does not exist" Oct 08 22:53:58 crc kubenswrapper[4739]: I1008 22:53:58.829635 4739 scope.go:117] "RemoveContainer" containerID="ab58bb8738661ff71b0c3a26d47351deb2de03f990c0077d17bbc038970e9a00" Oct 08 22:53:58 crc kubenswrapper[4739]: E1008 22:53:58.829875 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab58bb8738661ff71b0c3a26d47351deb2de03f990c0077d17bbc038970e9a00\": container with ID starting with ab58bb8738661ff71b0c3a26d47351deb2de03f990c0077d17bbc038970e9a00 not found: ID does not exist" containerID="ab58bb8738661ff71b0c3a26d47351deb2de03f990c0077d17bbc038970e9a00" Oct 08 22:53:58 crc 
kubenswrapper[4739]: I1008 22:53:58.829919 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab58bb8738661ff71b0c3a26d47351deb2de03f990c0077d17bbc038970e9a00"} err="failed to get container status \"ab58bb8738661ff71b0c3a26d47351deb2de03f990c0077d17bbc038970e9a00\": rpc error: code = NotFound desc = could not find container \"ab58bb8738661ff71b0c3a26d47351deb2de03f990c0077d17bbc038970e9a00\": container with ID starting with ab58bb8738661ff71b0c3a26d47351deb2de03f990c0077d17bbc038970e9a00 not found: ID does not exist" Oct 08 22:53:59 crc kubenswrapper[4739]: I1008 22:53:59.834348 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17cf8b45-a966-4225-bc12-ab94c006f2c9" path="/var/lib/kubelet/pods/17cf8b45-a966-4225-bc12-ab94c006f2c9/volumes" Oct 08 22:54:07 crc kubenswrapper[4739]: I1008 22:54:07.821402 4739 scope.go:117] "RemoveContainer" containerID="798afc98cb7bcde18fde2f74c09601ba5107495b2a2a7e64ab302b80311e0904" Oct 08 22:54:07 crc kubenswrapper[4739]: E1008 22:54:07.822114 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:54:22 crc kubenswrapper[4739]: I1008 22:54:22.821678 4739 scope.go:117] "RemoveContainer" containerID="798afc98cb7bcde18fde2f74c09601ba5107495b2a2a7e64ab302b80311e0904" Oct 08 22:54:22 crc kubenswrapper[4739]: E1008 22:54:22.822536 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:54:34 crc kubenswrapper[4739]: I1008 22:54:34.822388 4739 scope.go:117] "RemoveContainer" containerID="798afc98cb7bcde18fde2f74c09601ba5107495b2a2a7e64ab302b80311e0904" Oct 08 22:54:34 crc kubenswrapper[4739]: E1008 22:54:34.823318 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:54:46 crc kubenswrapper[4739]: I1008 22:54:46.823475 4739 scope.go:117] "RemoveContainer" containerID="798afc98cb7bcde18fde2f74c09601ba5107495b2a2a7e64ab302b80311e0904" Oct 08 22:54:46 crc kubenswrapper[4739]: E1008 22:54:46.824826 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:54:53 crc kubenswrapper[4739]: I1008 22:54:53.381302 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7vvkt"] Oct 08 22:54:53 crc kubenswrapper[4739]: E1008 22:54:53.383723 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17cf8b45-a966-4225-bc12-ab94c006f2c9" containerName="registry-server" Oct 08 22:54:53 crc kubenswrapper[4739]: I1008 
22:54:53.383827 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="17cf8b45-a966-4225-bc12-ab94c006f2c9" containerName="registry-server" Oct 08 22:54:53 crc kubenswrapper[4739]: E1008 22:54:53.383929 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17cf8b45-a966-4225-bc12-ab94c006f2c9" containerName="extract-utilities" Oct 08 22:54:53 crc kubenswrapper[4739]: I1008 22:54:53.384004 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="17cf8b45-a966-4225-bc12-ab94c006f2c9" containerName="extract-utilities" Oct 08 22:54:53 crc kubenswrapper[4739]: E1008 22:54:53.384159 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17cf8b45-a966-4225-bc12-ab94c006f2c9" containerName="extract-content" Oct 08 22:54:53 crc kubenswrapper[4739]: I1008 22:54:53.384242 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="17cf8b45-a966-4225-bc12-ab94c006f2c9" containerName="extract-content" Oct 08 22:54:53 crc kubenswrapper[4739]: I1008 22:54:53.385054 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="17cf8b45-a966-4225-bc12-ab94c006f2c9" containerName="registry-server" Oct 08 22:54:53 crc kubenswrapper[4739]: I1008 22:54:53.387209 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7vvkt" Oct 08 22:54:53 crc kubenswrapper[4739]: I1008 22:54:53.397560 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7vvkt"] Oct 08 22:54:53 crc kubenswrapper[4739]: I1008 22:54:53.574781 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/646ecfcf-49d3-4bc9-a508-b365231cc3a6-catalog-content\") pod \"certified-operators-7vvkt\" (UID: \"646ecfcf-49d3-4bc9-a508-b365231cc3a6\") " pod="openshift-marketplace/certified-operators-7vvkt" Oct 08 22:54:53 crc kubenswrapper[4739]: I1008 22:54:53.575425 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/646ecfcf-49d3-4bc9-a508-b365231cc3a6-utilities\") pod \"certified-operators-7vvkt\" (UID: \"646ecfcf-49d3-4bc9-a508-b365231cc3a6\") " pod="openshift-marketplace/certified-operators-7vvkt" Oct 08 22:54:53 crc kubenswrapper[4739]: I1008 22:54:53.575468 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw9bz\" (UniqueName: \"kubernetes.io/projected/646ecfcf-49d3-4bc9-a508-b365231cc3a6-kube-api-access-tw9bz\") pod \"certified-operators-7vvkt\" (UID: \"646ecfcf-49d3-4bc9-a508-b365231cc3a6\") " pod="openshift-marketplace/certified-operators-7vvkt" Oct 08 22:54:53 crc kubenswrapper[4739]: I1008 22:54:53.677105 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/646ecfcf-49d3-4bc9-a508-b365231cc3a6-catalog-content\") pod \"certified-operators-7vvkt\" (UID: \"646ecfcf-49d3-4bc9-a508-b365231cc3a6\") " pod="openshift-marketplace/certified-operators-7vvkt" Oct 08 22:54:53 crc kubenswrapper[4739]: I1008 22:54:53.677216 4739 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/646ecfcf-49d3-4bc9-a508-b365231cc3a6-utilities\") pod \"certified-operators-7vvkt\" (UID: \"646ecfcf-49d3-4bc9-a508-b365231cc3a6\") " pod="openshift-marketplace/certified-operators-7vvkt" Oct 08 22:54:53 crc kubenswrapper[4739]: I1008 22:54:53.677256 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw9bz\" (UniqueName: \"kubernetes.io/projected/646ecfcf-49d3-4bc9-a508-b365231cc3a6-kube-api-access-tw9bz\") pod \"certified-operators-7vvkt\" (UID: \"646ecfcf-49d3-4bc9-a508-b365231cc3a6\") " pod="openshift-marketplace/certified-operators-7vvkt" Oct 08 22:54:53 crc kubenswrapper[4739]: I1008 22:54:53.677960 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/646ecfcf-49d3-4bc9-a508-b365231cc3a6-catalog-content\") pod \"certified-operators-7vvkt\" (UID: \"646ecfcf-49d3-4bc9-a508-b365231cc3a6\") " pod="openshift-marketplace/certified-operators-7vvkt" Oct 08 22:54:53 crc kubenswrapper[4739]: I1008 22:54:53.678063 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/646ecfcf-49d3-4bc9-a508-b365231cc3a6-utilities\") pod \"certified-operators-7vvkt\" (UID: \"646ecfcf-49d3-4bc9-a508-b365231cc3a6\") " pod="openshift-marketplace/certified-operators-7vvkt" Oct 08 22:54:53 crc kubenswrapper[4739]: I1008 22:54:53.696238 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw9bz\" (UniqueName: \"kubernetes.io/projected/646ecfcf-49d3-4bc9-a508-b365231cc3a6-kube-api-access-tw9bz\") pod \"certified-operators-7vvkt\" (UID: \"646ecfcf-49d3-4bc9-a508-b365231cc3a6\") " pod="openshift-marketplace/certified-operators-7vvkt" Oct 08 22:54:53 crc kubenswrapper[4739]: I1008 22:54:53.723901 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7vvkt" Oct 08 22:54:54 crc kubenswrapper[4739]: I1008 22:54:54.253748 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7vvkt"] Oct 08 22:54:54 crc kubenswrapper[4739]: I1008 22:54:54.285515 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vvkt" event={"ID":"646ecfcf-49d3-4bc9-a508-b365231cc3a6","Type":"ContainerStarted","Data":"5b05aba000f8ac9150e24146cca451ae49fd4c545772679b29b1cebe22b901b2"} Oct 08 22:54:55 crc kubenswrapper[4739]: I1008 22:54:55.295440 4739 generic.go:334] "Generic (PLEG): container finished" podID="646ecfcf-49d3-4bc9-a508-b365231cc3a6" containerID="73c043eb99e713608e07abc33adfbe87adacf5133dc7aa2dc948c5bfa2b8f43a" exitCode=0 Oct 08 22:54:55 crc kubenswrapper[4739]: I1008 22:54:55.295586 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vvkt" event={"ID":"646ecfcf-49d3-4bc9-a508-b365231cc3a6","Type":"ContainerDied","Data":"73c043eb99e713608e07abc33adfbe87adacf5133dc7aa2dc948c5bfa2b8f43a"} Oct 08 22:54:55 crc kubenswrapper[4739]: I1008 22:54:55.297688 4739 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 22:54:57 crc kubenswrapper[4739]: I1008 22:54:57.317786 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vvkt" event={"ID":"646ecfcf-49d3-4bc9-a508-b365231cc3a6","Type":"ContainerStarted","Data":"91bb36bb9cb15f099c23759b85ce3d78116cda38d07f2643db89a7dbea2b3a71"} Oct 08 22:54:58 crc kubenswrapper[4739]: I1008 22:54:58.328532 4739 generic.go:334] "Generic (PLEG): container finished" podID="646ecfcf-49d3-4bc9-a508-b365231cc3a6" containerID="91bb36bb9cb15f099c23759b85ce3d78116cda38d07f2643db89a7dbea2b3a71" exitCode=0 Oct 08 22:54:58 crc kubenswrapper[4739]: I1008 22:54:58.328633 4739 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-7vvkt" event={"ID":"646ecfcf-49d3-4bc9-a508-b365231cc3a6","Type":"ContainerDied","Data":"91bb36bb9cb15f099c23759b85ce3d78116cda38d07f2643db89a7dbea2b3a71"} Oct 08 22:54:59 crc kubenswrapper[4739]: I1008 22:54:59.341662 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vvkt" event={"ID":"646ecfcf-49d3-4bc9-a508-b365231cc3a6","Type":"ContainerStarted","Data":"0f55297636a3b79757acf00195e86d0386c269281c984e872fec711f3c3955c2"} Oct 08 22:54:59 crc kubenswrapper[4739]: I1008 22:54:59.366602 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7vvkt" podStartSLOduration=2.953857762 podStartE2EDuration="6.366575004s" podCreationTimestamp="2025-10-08 22:54:53 +0000 UTC" firstStartedPulling="2025-10-08 22:54:55.297465879 +0000 UTC m=+3995.122851629" lastFinishedPulling="2025-10-08 22:54:58.710183121 +0000 UTC m=+3998.535568871" observedRunningTime="2025-10-08 22:54:59.363511439 +0000 UTC m=+3999.188897199" watchObservedRunningTime="2025-10-08 22:54:59.366575004 +0000 UTC m=+3999.191960764" Oct 08 22:55:00 crc kubenswrapper[4739]: I1008 22:55:00.822668 4739 scope.go:117] "RemoveContainer" containerID="798afc98cb7bcde18fde2f74c09601ba5107495b2a2a7e64ab302b80311e0904" Oct 08 22:55:00 crc kubenswrapper[4739]: E1008 22:55:00.823324 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:55:03 crc kubenswrapper[4739]: I1008 22:55:03.724760 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-7vvkt" Oct 08 22:55:03 crc kubenswrapper[4739]: I1008 22:55:03.725602 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7vvkt" Oct 08 22:55:03 crc kubenswrapper[4739]: I1008 22:55:03.771618 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7vvkt" Oct 08 22:55:04 crc kubenswrapper[4739]: I1008 22:55:04.465867 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7vvkt" Oct 08 22:55:04 crc kubenswrapper[4739]: I1008 22:55:04.510782 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7vvkt"] Oct 08 22:55:05 crc kubenswrapper[4739]: I1008 22:55:05.423181 4739 generic.go:334] "Generic (PLEG): container finished" podID="3640a123-c313-4e96-a7df-5d38f7fd34f3" containerID="23a52af77751f657e141c4b042fc211145d3a7d38a7b36eca4be2cf92349085a" exitCode=0 Oct 08 22:55:05 crc kubenswrapper[4739]: I1008 22:55:05.424162 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pnz9/must-gather-xcjps" event={"ID":"3640a123-c313-4e96-a7df-5d38f7fd34f3","Type":"ContainerDied","Data":"23a52af77751f657e141c4b042fc211145d3a7d38a7b36eca4be2cf92349085a"} Oct 08 22:55:05 crc kubenswrapper[4739]: I1008 22:55:05.424504 4739 scope.go:117] "RemoveContainer" containerID="23a52af77751f657e141c4b042fc211145d3a7d38a7b36eca4be2cf92349085a" Oct 08 22:55:05 crc kubenswrapper[4739]: I1008 22:55:05.646695 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7pnz9_must-gather-xcjps_3640a123-c313-4e96-a7df-5d38f7fd34f3/gather/0.log" Oct 08 22:55:06 crc kubenswrapper[4739]: I1008 22:55:06.438613 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7vvkt" 
podUID="646ecfcf-49d3-4bc9-a508-b365231cc3a6" containerName="registry-server" containerID="cri-o://0f55297636a3b79757acf00195e86d0386c269281c984e872fec711f3c3955c2" gracePeriod=2 Oct 08 22:55:06 crc kubenswrapper[4739]: I1008 22:55:06.970475 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7vvkt" Oct 08 22:55:07 crc kubenswrapper[4739]: I1008 22:55:07.066709 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw9bz\" (UniqueName: \"kubernetes.io/projected/646ecfcf-49d3-4bc9-a508-b365231cc3a6-kube-api-access-tw9bz\") pod \"646ecfcf-49d3-4bc9-a508-b365231cc3a6\" (UID: \"646ecfcf-49d3-4bc9-a508-b365231cc3a6\") " Oct 08 22:55:07 crc kubenswrapper[4739]: I1008 22:55:07.066938 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/646ecfcf-49d3-4bc9-a508-b365231cc3a6-catalog-content\") pod \"646ecfcf-49d3-4bc9-a508-b365231cc3a6\" (UID: \"646ecfcf-49d3-4bc9-a508-b365231cc3a6\") " Oct 08 22:55:07 crc kubenswrapper[4739]: I1008 22:55:07.066972 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/646ecfcf-49d3-4bc9-a508-b365231cc3a6-utilities\") pod \"646ecfcf-49d3-4bc9-a508-b365231cc3a6\" (UID: \"646ecfcf-49d3-4bc9-a508-b365231cc3a6\") " Oct 08 22:55:07 crc kubenswrapper[4739]: I1008 22:55:07.067920 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/646ecfcf-49d3-4bc9-a508-b365231cc3a6-utilities" (OuterVolumeSpecName: "utilities") pod "646ecfcf-49d3-4bc9-a508-b365231cc3a6" (UID: "646ecfcf-49d3-4bc9-a508-b365231cc3a6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:55:07 crc kubenswrapper[4739]: I1008 22:55:07.073951 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/646ecfcf-49d3-4bc9-a508-b365231cc3a6-kube-api-access-tw9bz" (OuterVolumeSpecName: "kube-api-access-tw9bz") pod "646ecfcf-49d3-4bc9-a508-b365231cc3a6" (UID: "646ecfcf-49d3-4bc9-a508-b365231cc3a6"). InnerVolumeSpecName "kube-api-access-tw9bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:55:07 crc kubenswrapper[4739]: I1008 22:55:07.122353 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/646ecfcf-49d3-4bc9-a508-b365231cc3a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "646ecfcf-49d3-4bc9-a508-b365231cc3a6" (UID: "646ecfcf-49d3-4bc9-a508-b365231cc3a6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:55:07 crc kubenswrapper[4739]: I1008 22:55:07.169362 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw9bz\" (UniqueName: \"kubernetes.io/projected/646ecfcf-49d3-4bc9-a508-b365231cc3a6-kube-api-access-tw9bz\") on node \"crc\" DevicePath \"\"" Oct 08 22:55:07 crc kubenswrapper[4739]: I1008 22:55:07.169398 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/646ecfcf-49d3-4bc9-a508-b365231cc3a6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:55:07 crc kubenswrapper[4739]: I1008 22:55:07.169410 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/646ecfcf-49d3-4bc9-a508-b365231cc3a6-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:55:07 crc kubenswrapper[4739]: I1008 22:55:07.454337 4739 generic.go:334] "Generic (PLEG): container finished" podID="646ecfcf-49d3-4bc9-a508-b365231cc3a6" 
containerID="0f55297636a3b79757acf00195e86d0386c269281c984e872fec711f3c3955c2" exitCode=0 Oct 08 22:55:07 crc kubenswrapper[4739]: I1008 22:55:07.454448 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7vvkt" Oct 08 22:55:07 crc kubenswrapper[4739]: I1008 22:55:07.454445 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vvkt" event={"ID":"646ecfcf-49d3-4bc9-a508-b365231cc3a6","Type":"ContainerDied","Data":"0f55297636a3b79757acf00195e86d0386c269281c984e872fec711f3c3955c2"} Oct 08 22:55:07 crc kubenswrapper[4739]: I1008 22:55:07.454937 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vvkt" event={"ID":"646ecfcf-49d3-4bc9-a508-b365231cc3a6","Type":"ContainerDied","Data":"5b05aba000f8ac9150e24146cca451ae49fd4c545772679b29b1cebe22b901b2"} Oct 08 22:55:07 crc kubenswrapper[4739]: I1008 22:55:07.454971 4739 scope.go:117] "RemoveContainer" containerID="0f55297636a3b79757acf00195e86d0386c269281c984e872fec711f3c3955c2" Oct 08 22:55:07 crc kubenswrapper[4739]: I1008 22:55:07.499212 4739 scope.go:117] "RemoveContainer" containerID="91bb36bb9cb15f099c23759b85ce3d78116cda38d07f2643db89a7dbea2b3a71" Oct 08 22:55:07 crc kubenswrapper[4739]: I1008 22:55:07.504745 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7vvkt"] Oct 08 22:55:07 crc kubenswrapper[4739]: I1008 22:55:07.513799 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7vvkt"] Oct 08 22:55:07 crc kubenswrapper[4739]: I1008 22:55:07.525104 4739 scope.go:117] "RemoveContainer" containerID="73c043eb99e713608e07abc33adfbe87adacf5133dc7aa2dc948c5bfa2b8f43a" Oct 08 22:55:07 crc kubenswrapper[4739]: I1008 22:55:07.598981 4739 scope.go:117] "RemoveContainer" containerID="0f55297636a3b79757acf00195e86d0386c269281c984e872fec711f3c3955c2" Oct 08 
22:55:07 crc kubenswrapper[4739]: E1008 22:55:07.599839 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f55297636a3b79757acf00195e86d0386c269281c984e872fec711f3c3955c2\": container with ID starting with 0f55297636a3b79757acf00195e86d0386c269281c984e872fec711f3c3955c2 not found: ID does not exist" containerID="0f55297636a3b79757acf00195e86d0386c269281c984e872fec711f3c3955c2" Oct 08 22:55:07 crc kubenswrapper[4739]: I1008 22:55:07.599881 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f55297636a3b79757acf00195e86d0386c269281c984e872fec711f3c3955c2"} err="failed to get container status \"0f55297636a3b79757acf00195e86d0386c269281c984e872fec711f3c3955c2\": rpc error: code = NotFound desc = could not find container \"0f55297636a3b79757acf00195e86d0386c269281c984e872fec711f3c3955c2\": container with ID starting with 0f55297636a3b79757acf00195e86d0386c269281c984e872fec711f3c3955c2 not found: ID does not exist" Oct 08 22:55:07 crc kubenswrapper[4739]: I1008 22:55:07.599908 4739 scope.go:117] "RemoveContainer" containerID="91bb36bb9cb15f099c23759b85ce3d78116cda38d07f2643db89a7dbea2b3a71" Oct 08 22:55:07 crc kubenswrapper[4739]: E1008 22:55:07.600380 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91bb36bb9cb15f099c23759b85ce3d78116cda38d07f2643db89a7dbea2b3a71\": container with ID starting with 91bb36bb9cb15f099c23759b85ce3d78116cda38d07f2643db89a7dbea2b3a71 not found: ID does not exist" containerID="91bb36bb9cb15f099c23759b85ce3d78116cda38d07f2643db89a7dbea2b3a71" Oct 08 22:55:07 crc kubenswrapper[4739]: I1008 22:55:07.600763 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91bb36bb9cb15f099c23759b85ce3d78116cda38d07f2643db89a7dbea2b3a71"} err="failed to get container status 
\"91bb36bb9cb15f099c23759b85ce3d78116cda38d07f2643db89a7dbea2b3a71\": rpc error: code = NotFound desc = could not find container \"91bb36bb9cb15f099c23759b85ce3d78116cda38d07f2643db89a7dbea2b3a71\": container with ID starting with 91bb36bb9cb15f099c23759b85ce3d78116cda38d07f2643db89a7dbea2b3a71 not found: ID does not exist" Oct 08 22:55:07 crc kubenswrapper[4739]: I1008 22:55:07.601028 4739 scope.go:117] "RemoveContainer" containerID="73c043eb99e713608e07abc33adfbe87adacf5133dc7aa2dc948c5bfa2b8f43a" Oct 08 22:55:07 crc kubenswrapper[4739]: E1008 22:55:07.605459 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73c043eb99e713608e07abc33adfbe87adacf5133dc7aa2dc948c5bfa2b8f43a\": container with ID starting with 73c043eb99e713608e07abc33adfbe87adacf5133dc7aa2dc948c5bfa2b8f43a not found: ID does not exist" containerID="73c043eb99e713608e07abc33adfbe87adacf5133dc7aa2dc948c5bfa2b8f43a" Oct 08 22:55:07 crc kubenswrapper[4739]: I1008 22:55:07.605571 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73c043eb99e713608e07abc33adfbe87adacf5133dc7aa2dc948c5bfa2b8f43a"} err="failed to get container status \"73c043eb99e713608e07abc33adfbe87adacf5133dc7aa2dc948c5bfa2b8f43a\": rpc error: code = NotFound desc = could not find container \"73c043eb99e713608e07abc33adfbe87adacf5133dc7aa2dc948c5bfa2b8f43a\": container with ID starting with 73c043eb99e713608e07abc33adfbe87adacf5133dc7aa2dc948c5bfa2b8f43a not found: ID does not exist" Oct 08 22:55:07 crc kubenswrapper[4739]: I1008 22:55:07.835065 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="646ecfcf-49d3-4bc9-a508-b365231cc3a6" path="/var/lib/kubelet/pods/646ecfcf-49d3-4bc9-a508-b365231cc3a6/volumes" Oct 08 22:55:11 crc kubenswrapper[4739]: I1008 22:55:11.828067 4739 scope.go:117] "RemoveContainer" containerID="798afc98cb7bcde18fde2f74c09601ba5107495b2a2a7e64ab302b80311e0904" Oct 08 
22:55:11 crc kubenswrapper[4739]: E1008 22:55:11.828933 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:55:13 crc kubenswrapper[4739]: I1008 22:55:13.671006 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7pnz9/must-gather-xcjps"] Oct 08 22:55:13 crc kubenswrapper[4739]: I1008 22:55:13.671483 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-7pnz9/must-gather-xcjps" podUID="3640a123-c313-4e96-a7df-5d38f7fd34f3" containerName="copy" containerID="cri-o://dfaa773969f741a2671b519ad58d8b624fd4e49ec993e21ddac9e50b46be6ef0" gracePeriod=2 Oct 08 22:55:13 crc kubenswrapper[4739]: I1008 22:55:13.688830 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7pnz9/must-gather-xcjps"] Oct 08 22:55:14 crc kubenswrapper[4739]: I1008 22:55:14.215340 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7pnz9_must-gather-xcjps_3640a123-c313-4e96-a7df-5d38f7fd34f3/copy/0.log" Oct 08 22:55:14 crc kubenswrapper[4739]: I1008 22:55:14.216339 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7pnz9/must-gather-xcjps" Oct 08 22:55:14 crc kubenswrapper[4739]: I1008 22:55:14.356715 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3640a123-c313-4e96-a7df-5d38f7fd34f3-must-gather-output\") pod \"3640a123-c313-4e96-a7df-5d38f7fd34f3\" (UID: \"3640a123-c313-4e96-a7df-5d38f7fd34f3\") " Oct 08 22:55:14 crc kubenswrapper[4739]: I1008 22:55:14.356880 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2w7n\" (UniqueName: \"kubernetes.io/projected/3640a123-c313-4e96-a7df-5d38f7fd34f3-kube-api-access-c2w7n\") pod \"3640a123-c313-4e96-a7df-5d38f7fd34f3\" (UID: \"3640a123-c313-4e96-a7df-5d38f7fd34f3\") " Oct 08 22:55:14 crc kubenswrapper[4739]: I1008 22:55:14.365293 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3640a123-c313-4e96-a7df-5d38f7fd34f3-kube-api-access-c2w7n" (OuterVolumeSpecName: "kube-api-access-c2w7n") pod "3640a123-c313-4e96-a7df-5d38f7fd34f3" (UID: "3640a123-c313-4e96-a7df-5d38f7fd34f3"). InnerVolumeSpecName "kube-api-access-c2w7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:55:14 crc kubenswrapper[4739]: I1008 22:55:14.460498 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2w7n\" (UniqueName: \"kubernetes.io/projected/3640a123-c313-4e96-a7df-5d38f7fd34f3-kube-api-access-c2w7n\") on node \"crc\" DevicePath \"\"" Oct 08 22:55:14 crc kubenswrapper[4739]: I1008 22:55:14.529976 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3640a123-c313-4e96-a7df-5d38f7fd34f3-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "3640a123-c313-4e96-a7df-5d38f7fd34f3" (UID: "3640a123-c313-4e96-a7df-5d38f7fd34f3"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:55:14 crc kubenswrapper[4739]: I1008 22:55:14.532963 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7pnz9_must-gather-xcjps_3640a123-c313-4e96-a7df-5d38f7fd34f3/copy/0.log" Oct 08 22:55:14 crc kubenswrapper[4739]: I1008 22:55:14.533492 4739 generic.go:334] "Generic (PLEG): container finished" podID="3640a123-c313-4e96-a7df-5d38f7fd34f3" containerID="dfaa773969f741a2671b519ad58d8b624fd4e49ec993e21ddac9e50b46be6ef0" exitCode=143 Oct 08 22:55:14 crc kubenswrapper[4739]: I1008 22:55:14.533547 4739 scope.go:117] "RemoveContainer" containerID="dfaa773969f741a2671b519ad58d8b624fd4e49ec993e21ddac9e50b46be6ef0" Oct 08 22:55:14 crc kubenswrapper[4739]: I1008 22:55:14.533671 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7pnz9/must-gather-xcjps" Oct 08 22:55:14 crc kubenswrapper[4739]: I1008 22:55:14.558251 4739 scope.go:117] "RemoveContainer" containerID="23a52af77751f657e141c4b042fc211145d3a7d38a7b36eca4be2cf92349085a" Oct 08 22:55:14 crc kubenswrapper[4739]: I1008 22:55:14.562572 4739 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3640a123-c313-4e96-a7df-5d38f7fd34f3-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 08 22:55:14 crc kubenswrapper[4739]: I1008 22:55:14.695052 4739 scope.go:117] "RemoveContainer" containerID="dfaa773969f741a2671b519ad58d8b624fd4e49ec993e21ddac9e50b46be6ef0" Oct 08 22:55:14 crc kubenswrapper[4739]: E1008 22:55:14.695611 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfaa773969f741a2671b519ad58d8b624fd4e49ec993e21ddac9e50b46be6ef0\": container with ID starting with dfaa773969f741a2671b519ad58d8b624fd4e49ec993e21ddac9e50b46be6ef0 not found: ID does not exist" 
containerID="dfaa773969f741a2671b519ad58d8b624fd4e49ec993e21ddac9e50b46be6ef0" Oct 08 22:55:14 crc kubenswrapper[4739]: I1008 22:55:14.695666 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfaa773969f741a2671b519ad58d8b624fd4e49ec993e21ddac9e50b46be6ef0"} err="failed to get container status \"dfaa773969f741a2671b519ad58d8b624fd4e49ec993e21ddac9e50b46be6ef0\": rpc error: code = NotFound desc = could not find container \"dfaa773969f741a2671b519ad58d8b624fd4e49ec993e21ddac9e50b46be6ef0\": container with ID starting with dfaa773969f741a2671b519ad58d8b624fd4e49ec993e21ddac9e50b46be6ef0 not found: ID does not exist" Oct 08 22:55:14 crc kubenswrapper[4739]: I1008 22:55:14.695698 4739 scope.go:117] "RemoveContainer" containerID="23a52af77751f657e141c4b042fc211145d3a7d38a7b36eca4be2cf92349085a" Oct 08 22:55:14 crc kubenswrapper[4739]: E1008 22:55:14.696375 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23a52af77751f657e141c4b042fc211145d3a7d38a7b36eca4be2cf92349085a\": container with ID starting with 23a52af77751f657e141c4b042fc211145d3a7d38a7b36eca4be2cf92349085a not found: ID does not exist" containerID="23a52af77751f657e141c4b042fc211145d3a7d38a7b36eca4be2cf92349085a" Oct 08 22:55:14 crc kubenswrapper[4739]: I1008 22:55:14.696399 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23a52af77751f657e141c4b042fc211145d3a7d38a7b36eca4be2cf92349085a"} err="failed to get container status \"23a52af77751f657e141c4b042fc211145d3a7d38a7b36eca4be2cf92349085a\": rpc error: code = NotFound desc = could not find container \"23a52af77751f657e141c4b042fc211145d3a7d38a7b36eca4be2cf92349085a\": container with ID starting with 23a52af77751f657e141c4b042fc211145d3a7d38a7b36eca4be2cf92349085a not found: ID does not exist" Oct 08 22:55:15 crc kubenswrapper[4739]: I1008 22:55:15.834292 4739 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3640a123-c313-4e96-a7df-5d38f7fd34f3" path="/var/lib/kubelet/pods/3640a123-c313-4e96-a7df-5d38f7fd34f3/volumes" Oct 08 22:55:24 crc kubenswrapper[4739]: I1008 22:55:24.821179 4739 scope.go:117] "RemoveContainer" containerID="798afc98cb7bcde18fde2f74c09601ba5107495b2a2a7e64ab302b80311e0904" Oct 08 22:55:24 crc kubenswrapper[4739]: E1008 22:55:24.822023 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:55:38 crc kubenswrapper[4739]: I1008 22:55:38.822291 4739 scope.go:117] "RemoveContainer" containerID="798afc98cb7bcde18fde2f74c09601ba5107495b2a2a7e64ab302b80311e0904" Oct 08 22:55:38 crc kubenswrapper[4739]: E1008 22:55:38.823028 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:55:51 crc kubenswrapper[4739]: I1008 22:55:51.068543 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7xwsg/must-gather-jh2q7"] Oct 08 22:55:51 crc kubenswrapper[4739]: E1008 22:55:51.069613 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="646ecfcf-49d3-4bc9-a508-b365231cc3a6" containerName="registry-server" Oct 08 22:55:51 crc kubenswrapper[4739]: I1008 22:55:51.069712 4739 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="646ecfcf-49d3-4bc9-a508-b365231cc3a6" containerName="registry-server" Oct 08 22:55:51 crc kubenswrapper[4739]: E1008 22:55:51.069748 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="646ecfcf-49d3-4bc9-a508-b365231cc3a6" containerName="extract-content" Oct 08 22:55:51 crc kubenswrapper[4739]: I1008 22:55:51.069756 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="646ecfcf-49d3-4bc9-a508-b365231cc3a6" containerName="extract-content" Oct 08 22:55:51 crc kubenswrapper[4739]: E1008 22:55:51.069774 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3640a123-c313-4e96-a7df-5d38f7fd34f3" containerName="copy" Oct 08 22:55:51 crc kubenswrapper[4739]: I1008 22:55:51.069783 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="3640a123-c313-4e96-a7df-5d38f7fd34f3" containerName="copy" Oct 08 22:55:51 crc kubenswrapper[4739]: E1008 22:55:51.069795 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3640a123-c313-4e96-a7df-5d38f7fd34f3" containerName="gather" Oct 08 22:55:51 crc kubenswrapper[4739]: I1008 22:55:51.069802 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="3640a123-c313-4e96-a7df-5d38f7fd34f3" containerName="gather" Oct 08 22:55:51 crc kubenswrapper[4739]: E1008 22:55:51.069822 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="646ecfcf-49d3-4bc9-a508-b365231cc3a6" containerName="extract-utilities" Oct 08 22:55:51 crc kubenswrapper[4739]: I1008 22:55:51.069829 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="646ecfcf-49d3-4bc9-a508-b365231cc3a6" containerName="extract-utilities" Oct 08 22:55:51 crc kubenswrapper[4739]: I1008 22:55:51.070077 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="3640a123-c313-4e96-a7df-5d38f7fd34f3" containerName="copy" Oct 08 22:55:51 crc kubenswrapper[4739]: I1008 22:55:51.070095 4739 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3640a123-c313-4e96-a7df-5d38f7fd34f3" containerName="gather" Oct 08 22:55:51 crc kubenswrapper[4739]: I1008 22:55:51.070113 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="646ecfcf-49d3-4bc9-a508-b365231cc3a6" containerName="registry-server" Oct 08 22:55:51 crc kubenswrapper[4739]: I1008 22:55:51.071548 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7xwsg/must-gather-jh2q7" Oct 08 22:55:51 crc kubenswrapper[4739]: I1008 22:55:51.073179 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7xwsg"/"kube-root-ca.crt" Oct 08 22:55:51 crc kubenswrapper[4739]: I1008 22:55:51.073398 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7xwsg"/"openshift-service-ca.crt" Oct 08 22:55:51 crc kubenswrapper[4739]: I1008 22:55:51.073890 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-7xwsg"/"default-dockercfg-bnvk8" Oct 08 22:55:51 crc kubenswrapper[4739]: I1008 22:55:51.100056 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7xwsg/must-gather-jh2q7"] Oct 08 22:55:51 crc kubenswrapper[4739]: I1008 22:55:51.212452 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpvbw\" (UniqueName: \"kubernetes.io/projected/a42172b8-81cc-43e2-9733-25b845571bf9-kube-api-access-hpvbw\") pod \"must-gather-jh2q7\" (UID: \"a42172b8-81cc-43e2-9733-25b845571bf9\") " pod="openshift-must-gather-7xwsg/must-gather-jh2q7" Oct 08 22:55:51 crc kubenswrapper[4739]: I1008 22:55:51.212971 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a42172b8-81cc-43e2-9733-25b845571bf9-must-gather-output\") pod \"must-gather-jh2q7\" (UID: \"a42172b8-81cc-43e2-9733-25b845571bf9\") " 
pod="openshift-must-gather-7xwsg/must-gather-jh2q7" Oct 08 22:55:51 crc kubenswrapper[4739]: I1008 22:55:51.314638 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a42172b8-81cc-43e2-9733-25b845571bf9-must-gather-output\") pod \"must-gather-jh2q7\" (UID: \"a42172b8-81cc-43e2-9733-25b845571bf9\") " pod="openshift-must-gather-7xwsg/must-gather-jh2q7" Oct 08 22:55:51 crc kubenswrapper[4739]: I1008 22:55:51.314697 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpvbw\" (UniqueName: \"kubernetes.io/projected/a42172b8-81cc-43e2-9733-25b845571bf9-kube-api-access-hpvbw\") pod \"must-gather-jh2q7\" (UID: \"a42172b8-81cc-43e2-9733-25b845571bf9\") " pod="openshift-must-gather-7xwsg/must-gather-jh2q7" Oct 08 22:55:51 crc kubenswrapper[4739]: I1008 22:55:51.315107 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a42172b8-81cc-43e2-9733-25b845571bf9-must-gather-output\") pod \"must-gather-jh2q7\" (UID: \"a42172b8-81cc-43e2-9733-25b845571bf9\") " pod="openshift-must-gather-7xwsg/must-gather-jh2q7" Oct 08 22:55:51 crc kubenswrapper[4739]: I1008 22:55:51.333773 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpvbw\" (UniqueName: \"kubernetes.io/projected/a42172b8-81cc-43e2-9733-25b845571bf9-kube-api-access-hpvbw\") pod \"must-gather-jh2q7\" (UID: \"a42172b8-81cc-43e2-9733-25b845571bf9\") " pod="openshift-must-gather-7xwsg/must-gather-jh2q7" Oct 08 22:55:51 crc kubenswrapper[4739]: I1008 22:55:51.398301 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7xwsg/must-gather-jh2q7" Oct 08 22:55:51 crc kubenswrapper[4739]: I1008 22:55:51.986649 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7xwsg/must-gather-jh2q7"] Oct 08 22:55:52 crc kubenswrapper[4739]: I1008 22:55:52.824380 4739 scope.go:117] "RemoveContainer" containerID="798afc98cb7bcde18fde2f74c09601ba5107495b2a2a7e64ab302b80311e0904" Oct 08 22:55:52 crc kubenswrapper[4739]: E1008 22:55:52.824952 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:55:52 crc kubenswrapper[4739]: I1008 22:55:52.965202 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7xwsg/must-gather-jh2q7" event={"ID":"a42172b8-81cc-43e2-9733-25b845571bf9","Type":"ContainerStarted","Data":"91f517e18e4050a09cab149d9deda946f3572e23f8134f1adf603701a2d02355"} Oct 08 22:55:52 crc kubenswrapper[4739]: I1008 22:55:52.965247 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7xwsg/must-gather-jh2q7" event={"ID":"a42172b8-81cc-43e2-9733-25b845571bf9","Type":"ContainerStarted","Data":"72933523cc3fde577b36eeeb6f81691b68093fd7babea27d9c72f4f7567f2837"} Oct 08 22:55:52 crc kubenswrapper[4739]: I1008 22:55:52.965257 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7xwsg/must-gather-jh2q7" event={"ID":"a42172b8-81cc-43e2-9733-25b845571bf9","Type":"ContainerStarted","Data":"6382f6de069935a2a85c91e27aaba14313dc4c9f9831dc61cfd4067468fb1a34"} Oct 08 22:55:52 crc kubenswrapper[4739]: I1008 22:55:52.983958 4739 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-7xwsg/must-gather-jh2q7" podStartSLOduration=1.9839388470000001 podStartE2EDuration="1.983938847s" podCreationTimestamp="2025-10-08 22:55:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:55:52.976790922 +0000 UTC m=+4052.802176682" watchObservedRunningTime="2025-10-08 22:55:52.983938847 +0000 UTC m=+4052.809324587" Oct 08 22:55:55 crc kubenswrapper[4739]: I1008 22:55:55.681353 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7xwsg/crc-debug-kcsct"] Oct 08 22:55:55 crc kubenswrapper[4739]: I1008 22:55:55.683847 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7xwsg/crc-debug-kcsct" Oct 08 22:55:55 crc kubenswrapper[4739]: I1008 22:55:55.811336 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/67e9c291-adf4-4d9d-b4ff-5123254a5632-host\") pod \"crc-debug-kcsct\" (UID: \"67e9c291-adf4-4d9d-b4ff-5123254a5632\") " pod="openshift-must-gather-7xwsg/crc-debug-kcsct" Oct 08 22:55:55 crc kubenswrapper[4739]: I1008 22:55:55.811632 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4f2l\" (UniqueName: \"kubernetes.io/projected/67e9c291-adf4-4d9d-b4ff-5123254a5632-kube-api-access-k4f2l\") pod \"crc-debug-kcsct\" (UID: \"67e9c291-adf4-4d9d-b4ff-5123254a5632\") " pod="openshift-must-gather-7xwsg/crc-debug-kcsct" Oct 08 22:55:55 crc kubenswrapper[4739]: I1008 22:55:55.913841 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/67e9c291-adf4-4d9d-b4ff-5123254a5632-host\") pod \"crc-debug-kcsct\" (UID: \"67e9c291-adf4-4d9d-b4ff-5123254a5632\") " pod="openshift-must-gather-7xwsg/crc-debug-kcsct" Oct 08 22:55:55 crc 
kubenswrapper[4739]: I1008 22:55:55.913994 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/67e9c291-adf4-4d9d-b4ff-5123254a5632-host\") pod \"crc-debug-kcsct\" (UID: \"67e9c291-adf4-4d9d-b4ff-5123254a5632\") " pod="openshift-must-gather-7xwsg/crc-debug-kcsct" Oct 08 22:55:55 crc kubenswrapper[4739]: I1008 22:55:55.914281 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4f2l\" (UniqueName: \"kubernetes.io/projected/67e9c291-adf4-4d9d-b4ff-5123254a5632-kube-api-access-k4f2l\") pod \"crc-debug-kcsct\" (UID: \"67e9c291-adf4-4d9d-b4ff-5123254a5632\") " pod="openshift-must-gather-7xwsg/crc-debug-kcsct" Oct 08 22:55:55 crc kubenswrapper[4739]: I1008 22:55:55.946245 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4f2l\" (UniqueName: \"kubernetes.io/projected/67e9c291-adf4-4d9d-b4ff-5123254a5632-kube-api-access-k4f2l\") pod \"crc-debug-kcsct\" (UID: \"67e9c291-adf4-4d9d-b4ff-5123254a5632\") " pod="openshift-must-gather-7xwsg/crc-debug-kcsct" Oct 08 22:55:56 crc kubenswrapper[4739]: I1008 22:55:56.003581 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7xwsg/crc-debug-kcsct" Oct 08 22:55:56 crc kubenswrapper[4739]: W1008 22:55:56.035652 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67e9c291_adf4_4d9d_b4ff_5123254a5632.slice/crio-9559c834ef286bc98761436464b1e80a1b212e01469039ed289acdd8bde2ccd6 WatchSource:0}: Error finding container 9559c834ef286bc98761436464b1e80a1b212e01469039ed289acdd8bde2ccd6: Status 404 returned error can't find the container with id 9559c834ef286bc98761436464b1e80a1b212e01469039ed289acdd8bde2ccd6 Oct 08 22:55:57 crc kubenswrapper[4739]: I1008 22:55:57.007930 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7xwsg/crc-debug-kcsct" event={"ID":"67e9c291-adf4-4d9d-b4ff-5123254a5632","Type":"ContainerStarted","Data":"07836bf39f714842b0ee7ce311ee0da88ce783fd64988b65d5ba17dfa1a2eedb"} Oct 08 22:55:57 crc kubenswrapper[4739]: I1008 22:55:57.008489 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7xwsg/crc-debug-kcsct" event={"ID":"67e9c291-adf4-4d9d-b4ff-5123254a5632","Type":"ContainerStarted","Data":"9559c834ef286bc98761436464b1e80a1b212e01469039ed289acdd8bde2ccd6"} Oct 08 22:55:57 crc kubenswrapper[4739]: I1008 22:55:57.031911 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7xwsg/crc-debug-kcsct" podStartSLOduration=2.031894464 podStartE2EDuration="2.031894464s" podCreationTimestamp="2025-10-08 22:55:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:55:57.02966924 +0000 UTC m=+4056.855054990" watchObservedRunningTime="2025-10-08 22:55:57.031894464 +0000 UTC m=+4056.857280214" Oct 08 22:56:06 crc kubenswrapper[4739]: I1008 22:56:06.821305 4739 scope.go:117] "RemoveContainer" 
containerID="798afc98cb7bcde18fde2f74c09601ba5107495b2a2a7e64ab302b80311e0904" Oct 08 22:56:06 crc kubenswrapper[4739]: E1008 22:56:06.822100 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:56:21 crc kubenswrapper[4739]: I1008 22:56:21.829452 4739 scope.go:117] "RemoveContainer" containerID="798afc98cb7bcde18fde2f74c09601ba5107495b2a2a7e64ab302b80311e0904" Oct 08 22:56:21 crc kubenswrapper[4739]: E1008 22:56:21.830363 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:56:32 crc kubenswrapper[4739]: I1008 22:56:32.176530 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-45zxr"] Oct 08 22:56:32 crc kubenswrapper[4739]: I1008 22:56:32.181966 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-45zxr" Oct 08 22:56:32 crc kubenswrapper[4739]: I1008 22:56:32.195346 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-45zxr"] Oct 08 22:56:32 crc kubenswrapper[4739]: I1008 22:56:32.255089 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2k5j\" (UniqueName: \"kubernetes.io/projected/091d8ecb-934d-4c4c-ae6c-5ab8da44d75b-kube-api-access-w2k5j\") pod \"redhat-marketplace-45zxr\" (UID: \"091d8ecb-934d-4c4c-ae6c-5ab8da44d75b\") " pod="openshift-marketplace/redhat-marketplace-45zxr" Oct 08 22:56:32 crc kubenswrapper[4739]: I1008 22:56:32.255432 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/091d8ecb-934d-4c4c-ae6c-5ab8da44d75b-catalog-content\") pod \"redhat-marketplace-45zxr\" (UID: \"091d8ecb-934d-4c4c-ae6c-5ab8da44d75b\") " pod="openshift-marketplace/redhat-marketplace-45zxr" Oct 08 22:56:32 crc kubenswrapper[4739]: I1008 22:56:32.255617 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/091d8ecb-934d-4c4c-ae6c-5ab8da44d75b-utilities\") pod \"redhat-marketplace-45zxr\" (UID: \"091d8ecb-934d-4c4c-ae6c-5ab8da44d75b\") " pod="openshift-marketplace/redhat-marketplace-45zxr" Oct 08 22:56:32 crc kubenswrapper[4739]: I1008 22:56:32.357446 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/091d8ecb-934d-4c4c-ae6c-5ab8da44d75b-utilities\") pod \"redhat-marketplace-45zxr\" (UID: \"091d8ecb-934d-4c4c-ae6c-5ab8da44d75b\") " pod="openshift-marketplace/redhat-marketplace-45zxr" Oct 08 22:56:32 crc kubenswrapper[4739]: I1008 22:56:32.357568 4739 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-w2k5j\" (UniqueName: \"kubernetes.io/projected/091d8ecb-934d-4c4c-ae6c-5ab8da44d75b-kube-api-access-w2k5j\") pod \"redhat-marketplace-45zxr\" (UID: \"091d8ecb-934d-4c4c-ae6c-5ab8da44d75b\") " pod="openshift-marketplace/redhat-marketplace-45zxr" Oct 08 22:56:32 crc kubenswrapper[4739]: I1008 22:56:32.357685 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/091d8ecb-934d-4c4c-ae6c-5ab8da44d75b-catalog-content\") pod \"redhat-marketplace-45zxr\" (UID: \"091d8ecb-934d-4c4c-ae6c-5ab8da44d75b\") " pod="openshift-marketplace/redhat-marketplace-45zxr" Oct 08 22:56:32 crc kubenswrapper[4739]: I1008 22:56:32.357878 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/091d8ecb-934d-4c4c-ae6c-5ab8da44d75b-utilities\") pod \"redhat-marketplace-45zxr\" (UID: \"091d8ecb-934d-4c4c-ae6c-5ab8da44d75b\") " pod="openshift-marketplace/redhat-marketplace-45zxr" Oct 08 22:56:32 crc kubenswrapper[4739]: I1008 22:56:32.357967 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/091d8ecb-934d-4c4c-ae6c-5ab8da44d75b-catalog-content\") pod \"redhat-marketplace-45zxr\" (UID: \"091d8ecb-934d-4c4c-ae6c-5ab8da44d75b\") " pod="openshift-marketplace/redhat-marketplace-45zxr" Oct 08 22:56:32 crc kubenswrapper[4739]: I1008 22:56:32.378405 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2k5j\" (UniqueName: \"kubernetes.io/projected/091d8ecb-934d-4c4c-ae6c-5ab8da44d75b-kube-api-access-w2k5j\") pod \"redhat-marketplace-45zxr\" (UID: \"091d8ecb-934d-4c4c-ae6c-5ab8da44d75b\") " pod="openshift-marketplace/redhat-marketplace-45zxr" Oct 08 22:56:32 crc kubenswrapper[4739]: I1008 22:56:32.507997 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-45zxr" Oct 08 22:56:33 crc kubenswrapper[4739]: I1008 22:56:33.046107 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-45zxr"] Oct 08 22:56:33 crc kubenswrapper[4739]: I1008 22:56:33.347040 4739 generic.go:334] "Generic (PLEG): container finished" podID="67e9c291-adf4-4d9d-b4ff-5123254a5632" containerID="07836bf39f714842b0ee7ce311ee0da88ce783fd64988b65d5ba17dfa1a2eedb" exitCode=0 Oct 08 22:56:33 crc kubenswrapper[4739]: I1008 22:56:33.347077 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7xwsg/crc-debug-kcsct" event={"ID":"67e9c291-adf4-4d9d-b4ff-5123254a5632","Type":"ContainerDied","Data":"07836bf39f714842b0ee7ce311ee0da88ce783fd64988b65d5ba17dfa1a2eedb"} Oct 08 22:56:33 crc kubenswrapper[4739]: I1008 22:56:33.349696 4739 generic.go:334] "Generic (PLEG): container finished" podID="091d8ecb-934d-4c4c-ae6c-5ab8da44d75b" containerID="47d79a7f9968d8ea999daba7798571c2a5dc70a3322c550862a16ed9b7ea1992" exitCode=0 Oct 08 22:56:33 crc kubenswrapper[4739]: I1008 22:56:33.349752 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-45zxr" event={"ID":"091d8ecb-934d-4c4c-ae6c-5ab8da44d75b","Type":"ContainerDied","Data":"47d79a7f9968d8ea999daba7798571c2a5dc70a3322c550862a16ed9b7ea1992"} Oct 08 22:56:33 crc kubenswrapper[4739]: I1008 22:56:33.349869 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-45zxr" event={"ID":"091d8ecb-934d-4c4c-ae6c-5ab8da44d75b","Type":"ContainerStarted","Data":"83545a0a097a16531a0dd0d44781c1fc4dc9c81b34c6a5c448ea5f6d5aed8bcb"} Oct 08 22:56:34 crc kubenswrapper[4739]: I1008 22:56:34.504259 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7xwsg/crc-debug-kcsct" Oct 08 22:56:34 crc kubenswrapper[4739]: I1008 22:56:34.538995 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7xwsg/crc-debug-kcsct"] Oct 08 22:56:34 crc kubenswrapper[4739]: I1008 22:56:34.549895 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7xwsg/crc-debug-kcsct"] Oct 08 22:56:34 crc kubenswrapper[4739]: I1008 22:56:34.600682 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4f2l\" (UniqueName: \"kubernetes.io/projected/67e9c291-adf4-4d9d-b4ff-5123254a5632-kube-api-access-k4f2l\") pod \"67e9c291-adf4-4d9d-b4ff-5123254a5632\" (UID: \"67e9c291-adf4-4d9d-b4ff-5123254a5632\") " Oct 08 22:56:34 crc kubenswrapper[4739]: I1008 22:56:34.600806 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/67e9c291-adf4-4d9d-b4ff-5123254a5632-host\") pod \"67e9c291-adf4-4d9d-b4ff-5123254a5632\" (UID: \"67e9c291-adf4-4d9d-b4ff-5123254a5632\") " Oct 08 22:56:34 crc kubenswrapper[4739]: I1008 22:56:34.600882 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67e9c291-adf4-4d9d-b4ff-5123254a5632-host" (OuterVolumeSpecName: "host") pod "67e9c291-adf4-4d9d-b4ff-5123254a5632" (UID: "67e9c291-adf4-4d9d-b4ff-5123254a5632"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:56:34 crc kubenswrapper[4739]: I1008 22:56:34.601495 4739 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/67e9c291-adf4-4d9d-b4ff-5123254a5632-host\") on node \"crc\" DevicePath \"\"" Oct 08 22:56:34 crc kubenswrapper[4739]: I1008 22:56:34.607404 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67e9c291-adf4-4d9d-b4ff-5123254a5632-kube-api-access-k4f2l" (OuterVolumeSpecName: "kube-api-access-k4f2l") pod "67e9c291-adf4-4d9d-b4ff-5123254a5632" (UID: "67e9c291-adf4-4d9d-b4ff-5123254a5632"). InnerVolumeSpecName "kube-api-access-k4f2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:56:34 crc kubenswrapper[4739]: I1008 22:56:34.703211 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4f2l\" (UniqueName: \"kubernetes.io/projected/67e9c291-adf4-4d9d-b4ff-5123254a5632-kube-api-access-k4f2l\") on node \"crc\" DevicePath \"\"" Oct 08 22:56:34 crc kubenswrapper[4739]: I1008 22:56:34.778671 4739 scope.go:117] "RemoveContainer" containerID="640314485b2c1da5df0c9b336e101c5f16c459ec8fd289d79eb6a4b4d07de030" Oct 08 22:56:35 crc kubenswrapper[4739]: I1008 22:56:35.366959 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9559c834ef286bc98761436464b1e80a1b212e01469039ed289acdd8bde2ccd6" Oct 08 22:56:35 crc kubenswrapper[4739]: I1008 22:56:35.366982 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7xwsg/crc-debug-kcsct" Oct 08 22:56:35 crc kubenswrapper[4739]: I1008 22:56:35.368869 4739 generic.go:334] "Generic (PLEG): container finished" podID="091d8ecb-934d-4c4c-ae6c-5ab8da44d75b" containerID="b1c2c37550a759f03c4d757fb7f5ec78424c8ed536a87fc51c16f5a774a6a547" exitCode=0 Oct 08 22:56:35 crc kubenswrapper[4739]: I1008 22:56:35.368900 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-45zxr" event={"ID":"091d8ecb-934d-4c4c-ae6c-5ab8da44d75b","Type":"ContainerDied","Data":"b1c2c37550a759f03c4d757fb7f5ec78424c8ed536a87fc51c16f5a774a6a547"} Oct 08 22:56:35 crc kubenswrapper[4739]: I1008 22:56:35.765970 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7xwsg/crc-debug-x54pr"] Oct 08 22:56:35 crc kubenswrapper[4739]: E1008 22:56:35.766701 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e9c291-adf4-4d9d-b4ff-5123254a5632" containerName="container-00" Oct 08 22:56:35 crc kubenswrapper[4739]: I1008 22:56:35.766714 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e9c291-adf4-4d9d-b4ff-5123254a5632" containerName="container-00" Oct 08 22:56:35 crc kubenswrapper[4739]: I1008 22:56:35.766949 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="67e9c291-adf4-4d9d-b4ff-5123254a5632" containerName="container-00" Oct 08 22:56:35 crc kubenswrapper[4739]: I1008 22:56:35.767723 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7xwsg/crc-debug-x54pr" Oct 08 22:56:35 crc kubenswrapper[4739]: I1008 22:56:35.823517 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/08edc61a-7633-4d23-8b9d-9045dd1e2766-host\") pod \"crc-debug-x54pr\" (UID: \"08edc61a-7633-4d23-8b9d-9045dd1e2766\") " pod="openshift-must-gather-7xwsg/crc-debug-x54pr" Oct 08 22:56:35 crc kubenswrapper[4739]: I1008 22:56:35.823688 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmcnq\" (UniqueName: \"kubernetes.io/projected/08edc61a-7633-4d23-8b9d-9045dd1e2766-kube-api-access-tmcnq\") pod \"crc-debug-x54pr\" (UID: \"08edc61a-7633-4d23-8b9d-9045dd1e2766\") " pod="openshift-must-gather-7xwsg/crc-debug-x54pr" Oct 08 22:56:35 crc kubenswrapper[4739]: I1008 22:56:35.833925 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67e9c291-adf4-4d9d-b4ff-5123254a5632" path="/var/lib/kubelet/pods/67e9c291-adf4-4d9d-b4ff-5123254a5632/volumes" Oct 08 22:56:35 crc kubenswrapper[4739]: I1008 22:56:35.925494 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmcnq\" (UniqueName: \"kubernetes.io/projected/08edc61a-7633-4d23-8b9d-9045dd1e2766-kube-api-access-tmcnq\") pod \"crc-debug-x54pr\" (UID: \"08edc61a-7633-4d23-8b9d-9045dd1e2766\") " pod="openshift-must-gather-7xwsg/crc-debug-x54pr" Oct 08 22:56:35 crc kubenswrapper[4739]: I1008 22:56:35.925701 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/08edc61a-7633-4d23-8b9d-9045dd1e2766-host\") pod \"crc-debug-x54pr\" (UID: \"08edc61a-7633-4d23-8b9d-9045dd1e2766\") " pod="openshift-must-gather-7xwsg/crc-debug-x54pr" Oct 08 22:56:35 crc kubenswrapper[4739]: I1008 22:56:35.925801 4739 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"host\" (UniqueName: \"kubernetes.io/host-path/08edc61a-7633-4d23-8b9d-9045dd1e2766-host\") pod \"crc-debug-x54pr\" (UID: \"08edc61a-7633-4d23-8b9d-9045dd1e2766\") " pod="openshift-must-gather-7xwsg/crc-debug-x54pr" Oct 08 22:56:35 crc kubenswrapper[4739]: I1008 22:56:35.948976 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmcnq\" (UniqueName: \"kubernetes.io/projected/08edc61a-7633-4d23-8b9d-9045dd1e2766-kube-api-access-tmcnq\") pod \"crc-debug-x54pr\" (UID: \"08edc61a-7633-4d23-8b9d-9045dd1e2766\") " pod="openshift-must-gather-7xwsg/crc-debug-x54pr" Oct 08 22:56:36 crc kubenswrapper[4739]: I1008 22:56:36.089957 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7xwsg/crc-debug-x54pr" Oct 08 22:56:36 crc kubenswrapper[4739]: W1008 22:56:36.143327 4739 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08edc61a_7633_4d23_8b9d_9045dd1e2766.slice/crio-76fccb5652114f34791a52a99be4755f3c708829f6cce6278291d88803d04a8f WatchSource:0}: Error finding container 76fccb5652114f34791a52a99be4755f3c708829f6cce6278291d88803d04a8f: Status 404 returned error can't find the container with id 76fccb5652114f34791a52a99be4755f3c708829f6cce6278291d88803d04a8f Oct 08 22:56:36 crc kubenswrapper[4739]: I1008 22:56:36.394504 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7xwsg/crc-debug-x54pr" event={"ID":"08edc61a-7633-4d23-8b9d-9045dd1e2766","Type":"ContainerStarted","Data":"76fccb5652114f34791a52a99be4755f3c708829f6cce6278291d88803d04a8f"} Oct 08 22:56:36 crc kubenswrapper[4739]: I1008 22:56:36.398440 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-45zxr" event={"ID":"091d8ecb-934d-4c4c-ae6c-5ab8da44d75b","Type":"ContainerStarted","Data":"acf4ad465eff054c6ac71a0ca3e94dbda6541ba77007903b60f7fceb88c5d2fb"} Oct 08 
22:56:36 crc kubenswrapper[4739]: I1008 22:56:36.420628 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-45zxr" podStartSLOduration=2.001791191 podStartE2EDuration="4.420609675s" podCreationTimestamp="2025-10-08 22:56:32 +0000 UTC" firstStartedPulling="2025-10-08 22:56:33.351503978 +0000 UTC m=+4093.176889728" lastFinishedPulling="2025-10-08 22:56:35.770322462 +0000 UTC m=+4095.595708212" observedRunningTime="2025-10-08 22:56:36.41467402 +0000 UTC m=+4096.240059770" watchObservedRunningTime="2025-10-08 22:56:36.420609675 +0000 UTC m=+4096.245995425" Oct 08 22:56:36 crc kubenswrapper[4739]: I1008 22:56:36.822508 4739 scope.go:117] "RemoveContainer" containerID="798afc98cb7bcde18fde2f74c09601ba5107495b2a2a7e64ab302b80311e0904" Oct 08 22:56:36 crc kubenswrapper[4739]: E1008 22:56:36.822798 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:56:37 crc kubenswrapper[4739]: I1008 22:56:37.428441 4739 generic.go:334] "Generic (PLEG): container finished" podID="08edc61a-7633-4d23-8b9d-9045dd1e2766" containerID="d1912d0288f35b6218fea395f5c838028541a9f3cd8210c5dc0ce3b738e0f1da" exitCode=0 Oct 08 22:56:37 crc kubenswrapper[4739]: I1008 22:56:37.429597 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7xwsg/crc-debug-x54pr" event={"ID":"08edc61a-7633-4d23-8b9d-9045dd1e2766","Type":"ContainerDied","Data":"d1912d0288f35b6218fea395f5c838028541a9f3cd8210c5dc0ce3b738e0f1da"} Oct 08 22:56:38 crc kubenswrapper[4739]: I1008 22:56:38.512480 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-must-gather-7xwsg/crc-debug-x54pr"] Oct 08 22:56:38 crc kubenswrapper[4739]: I1008 22:56:38.524989 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7xwsg/crc-debug-x54pr"] Oct 08 22:56:38 crc kubenswrapper[4739]: I1008 22:56:38.567031 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7xwsg/crc-debug-x54pr" Oct 08 22:56:38 crc kubenswrapper[4739]: I1008 22:56:38.677598 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/08edc61a-7633-4d23-8b9d-9045dd1e2766-host\") pod \"08edc61a-7633-4d23-8b9d-9045dd1e2766\" (UID: \"08edc61a-7633-4d23-8b9d-9045dd1e2766\") " Oct 08 22:56:38 crc kubenswrapper[4739]: I1008 22:56:38.677690 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08edc61a-7633-4d23-8b9d-9045dd1e2766-host" (OuterVolumeSpecName: "host") pod "08edc61a-7633-4d23-8b9d-9045dd1e2766" (UID: "08edc61a-7633-4d23-8b9d-9045dd1e2766"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:56:38 crc kubenswrapper[4739]: I1008 22:56:38.677841 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmcnq\" (UniqueName: \"kubernetes.io/projected/08edc61a-7633-4d23-8b9d-9045dd1e2766-kube-api-access-tmcnq\") pod \"08edc61a-7633-4d23-8b9d-9045dd1e2766\" (UID: \"08edc61a-7633-4d23-8b9d-9045dd1e2766\") " Oct 08 22:56:38 crc kubenswrapper[4739]: I1008 22:56:38.678470 4739 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/08edc61a-7633-4d23-8b9d-9045dd1e2766-host\") on node \"crc\" DevicePath \"\"" Oct 08 22:56:38 crc kubenswrapper[4739]: I1008 22:56:38.683062 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08edc61a-7633-4d23-8b9d-9045dd1e2766-kube-api-access-tmcnq" (OuterVolumeSpecName: "kube-api-access-tmcnq") pod "08edc61a-7633-4d23-8b9d-9045dd1e2766" (UID: "08edc61a-7633-4d23-8b9d-9045dd1e2766"). InnerVolumeSpecName "kube-api-access-tmcnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:56:38 crc kubenswrapper[4739]: I1008 22:56:38.780339 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmcnq\" (UniqueName: \"kubernetes.io/projected/08edc61a-7633-4d23-8b9d-9045dd1e2766-kube-api-access-tmcnq\") on node \"crc\" DevicePath \"\"" Oct 08 22:56:39 crc kubenswrapper[4739]: I1008 22:56:39.447971 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76fccb5652114f34791a52a99be4755f3c708829f6cce6278291d88803d04a8f" Oct 08 22:56:39 crc kubenswrapper[4739]: I1008 22:56:39.448293 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7xwsg/crc-debug-x54pr" Oct 08 22:56:39 crc kubenswrapper[4739]: I1008 22:56:39.756705 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7xwsg/crc-debug-h7hgk"] Oct 08 22:56:39 crc kubenswrapper[4739]: E1008 22:56:39.757154 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08edc61a-7633-4d23-8b9d-9045dd1e2766" containerName="container-00" Oct 08 22:56:39 crc kubenswrapper[4739]: I1008 22:56:39.757169 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="08edc61a-7633-4d23-8b9d-9045dd1e2766" containerName="container-00" Oct 08 22:56:39 crc kubenswrapper[4739]: I1008 22:56:39.757360 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="08edc61a-7633-4d23-8b9d-9045dd1e2766" containerName="container-00" Oct 08 22:56:39 crc kubenswrapper[4739]: I1008 22:56:39.758131 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7xwsg/crc-debug-h7hgk" Oct 08 22:56:39 crc kubenswrapper[4739]: I1008 22:56:39.812059 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38b98de0-5ba8-4c4e-aca5-538d9ca1ad95-host\") pod \"crc-debug-h7hgk\" (UID: \"38b98de0-5ba8-4c4e-aca5-538d9ca1ad95\") " pod="openshift-must-gather-7xwsg/crc-debug-h7hgk" Oct 08 22:56:39 crc kubenswrapper[4739]: I1008 22:56:39.812110 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-284g2\" (UniqueName: \"kubernetes.io/projected/38b98de0-5ba8-4c4e-aca5-538d9ca1ad95-kube-api-access-284g2\") pod \"crc-debug-h7hgk\" (UID: \"38b98de0-5ba8-4c4e-aca5-538d9ca1ad95\") " pod="openshift-must-gather-7xwsg/crc-debug-h7hgk" Oct 08 22:56:39 crc kubenswrapper[4739]: I1008 22:56:39.834113 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08edc61a-7633-4d23-8b9d-9045dd1e2766" 
path="/var/lib/kubelet/pods/08edc61a-7633-4d23-8b9d-9045dd1e2766/volumes" Oct 08 22:56:39 crc kubenswrapper[4739]: I1008 22:56:39.914581 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38b98de0-5ba8-4c4e-aca5-538d9ca1ad95-host\") pod \"crc-debug-h7hgk\" (UID: \"38b98de0-5ba8-4c4e-aca5-538d9ca1ad95\") " pod="openshift-must-gather-7xwsg/crc-debug-h7hgk" Oct 08 22:56:39 crc kubenswrapper[4739]: I1008 22:56:39.914644 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-284g2\" (UniqueName: \"kubernetes.io/projected/38b98de0-5ba8-4c4e-aca5-538d9ca1ad95-kube-api-access-284g2\") pod \"crc-debug-h7hgk\" (UID: \"38b98de0-5ba8-4c4e-aca5-538d9ca1ad95\") " pod="openshift-must-gather-7xwsg/crc-debug-h7hgk" Oct 08 22:56:39 crc kubenswrapper[4739]: I1008 22:56:39.915025 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38b98de0-5ba8-4c4e-aca5-538d9ca1ad95-host\") pod \"crc-debug-h7hgk\" (UID: \"38b98de0-5ba8-4c4e-aca5-538d9ca1ad95\") " pod="openshift-must-gather-7xwsg/crc-debug-h7hgk" Oct 08 22:56:39 crc kubenswrapper[4739]: I1008 22:56:39.940798 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-284g2\" (UniqueName: \"kubernetes.io/projected/38b98de0-5ba8-4c4e-aca5-538d9ca1ad95-kube-api-access-284g2\") pod \"crc-debug-h7hgk\" (UID: \"38b98de0-5ba8-4c4e-aca5-538d9ca1ad95\") " pod="openshift-must-gather-7xwsg/crc-debug-h7hgk" Oct 08 22:56:40 crc kubenswrapper[4739]: I1008 22:56:40.076260 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7xwsg/crc-debug-h7hgk" Oct 08 22:56:40 crc kubenswrapper[4739]: I1008 22:56:40.458664 4739 generic.go:334] "Generic (PLEG): container finished" podID="38b98de0-5ba8-4c4e-aca5-538d9ca1ad95" containerID="9a9f08bf10c07b7435a0a040b2ff8a2c2928498b3a5da2528bcc279dbb6ba523" exitCode=0 Oct 08 22:56:40 crc kubenswrapper[4739]: I1008 22:56:40.458773 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7xwsg/crc-debug-h7hgk" event={"ID":"38b98de0-5ba8-4c4e-aca5-538d9ca1ad95","Type":"ContainerDied","Data":"9a9f08bf10c07b7435a0a040b2ff8a2c2928498b3a5da2528bcc279dbb6ba523"} Oct 08 22:56:40 crc kubenswrapper[4739]: I1008 22:56:40.458985 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7xwsg/crc-debug-h7hgk" event={"ID":"38b98de0-5ba8-4c4e-aca5-538d9ca1ad95","Type":"ContainerStarted","Data":"0ba330f40beb0dc0ef38afc2a5543e86c2980b81106c619083ecd849158f8956"} Oct 08 22:56:40 crc kubenswrapper[4739]: I1008 22:56:40.504523 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7xwsg/crc-debug-h7hgk"] Oct 08 22:56:40 crc kubenswrapper[4739]: I1008 22:56:40.512604 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7xwsg/crc-debug-h7hgk"] Oct 08 22:56:41 crc kubenswrapper[4739]: I1008 22:56:41.591518 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7xwsg/crc-debug-h7hgk" Oct 08 22:56:41 crc kubenswrapper[4739]: I1008 22:56:41.647544 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38b98de0-5ba8-4c4e-aca5-538d9ca1ad95-host\") pod \"38b98de0-5ba8-4c4e-aca5-538d9ca1ad95\" (UID: \"38b98de0-5ba8-4c4e-aca5-538d9ca1ad95\") " Oct 08 22:56:41 crc kubenswrapper[4739]: I1008 22:56:41.647623 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-284g2\" (UniqueName: \"kubernetes.io/projected/38b98de0-5ba8-4c4e-aca5-538d9ca1ad95-kube-api-access-284g2\") pod \"38b98de0-5ba8-4c4e-aca5-538d9ca1ad95\" (UID: \"38b98de0-5ba8-4c4e-aca5-538d9ca1ad95\") " Oct 08 22:56:41 crc kubenswrapper[4739]: I1008 22:56:41.647733 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/38b98de0-5ba8-4c4e-aca5-538d9ca1ad95-host" (OuterVolumeSpecName: "host") pod "38b98de0-5ba8-4c4e-aca5-538d9ca1ad95" (UID: "38b98de0-5ba8-4c4e-aca5-538d9ca1ad95"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:56:41 crc kubenswrapper[4739]: I1008 22:56:41.648325 4739 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38b98de0-5ba8-4c4e-aca5-538d9ca1ad95-host\") on node \"crc\" DevicePath \"\"" Oct 08 22:56:41 crc kubenswrapper[4739]: I1008 22:56:41.652916 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38b98de0-5ba8-4c4e-aca5-538d9ca1ad95-kube-api-access-284g2" (OuterVolumeSpecName: "kube-api-access-284g2") pod "38b98de0-5ba8-4c4e-aca5-538d9ca1ad95" (UID: "38b98de0-5ba8-4c4e-aca5-538d9ca1ad95"). InnerVolumeSpecName "kube-api-access-284g2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:56:41 crc kubenswrapper[4739]: I1008 22:56:41.750404 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-284g2\" (UniqueName: \"kubernetes.io/projected/38b98de0-5ba8-4c4e-aca5-538d9ca1ad95-kube-api-access-284g2\") on node \"crc\" DevicePath \"\"" Oct 08 22:56:41 crc kubenswrapper[4739]: I1008 22:56:41.833868 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38b98de0-5ba8-4c4e-aca5-538d9ca1ad95" path="/var/lib/kubelet/pods/38b98de0-5ba8-4c4e-aca5-538d9ca1ad95/volumes" Oct 08 22:56:42 crc kubenswrapper[4739]: I1008 22:56:42.483156 4739 scope.go:117] "RemoveContainer" containerID="9a9f08bf10c07b7435a0a040b2ff8a2c2928498b3a5da2528bcc279dbb6ba523" Oct 08 22:56:42 crc kubenswrapper[4739]: I1008 22:56:42.483207 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7xwsg/crc-debug-h7hgk" Oct 08 22:56:42 crc kubenswrapper[4739]: I1008 22:56:42.509435 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-45zxr" Oct 08 22:56:42 crc kubenswrapper[4739]: I1008 22:56:42.509502 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-45zxr" Oct 08 22:56:42 crc kubenswrapper[4739]: I1008 22:56:42.578573 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-45zxr" Oct 08 22:56:43 crc kubenswrapper[4739]: I1008 22:56:43.542395 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-45zxr" Oct 08 22:56:43 crc kubenswrapper[4739]: I1008 22:56:43.599712 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-45zxr"] Oct 08 22:56:45 crc kubenswrapper[4739]: I1008 22:56:45.514444 4739 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/redhat-marketplace-45zxr" podUID="091d8ecb-934d-4c4c-ae6c-5ab8da44d75b" containerName="registry-server" containerID="cri-o://acf4ad465eff054c6ac71a0ca3e94dbda6541ba77007903b60f7fceb88c5d2fb" gracePeriod=2 Oct 08 22:56:46 crc kubenswrapper[4739]: I1008 22:56:46.126257 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-45zxr" Oct 08 22:56:46 crc kubenswrapper[4739]: I1008 22:56:46.241587 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2k5j\" (UniqueName: \"kubernetes.io/projected/091d8ecb-934d-4c4c-ae6c-5ab8da44d75b-kube-api-access-w2k5j\") pod \"091d8ecb-934d-4c4c-ae6c-5ab8da44d75b\" (UID: \"091d8ecb-934d-4c4c-ae6c-5ab8da44d75b\") " Oct 08 22:56:46 crc kubenswrapper[4739]: I1008 22:56:46.241753 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/091d8ecb-934d-4c4c-ae6c-5ab8da44d75b-utilities\") pod \"091d8ecb-934d-4c4c-ae6c-5ab8da44d75b\" (UID: \"091d8ecb-934d-4c4c-ae6c-5ab8da44d75b\") " Oct 08 22:56:46 crc kubenswrapper[4739]: I1008 22:56:46.241813 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/091d8ecb-934d-4c4c-ae6c-5ab8da44d75b-catalog-content\") pod \"091d8ecb-934d-4c4c-ae6c-5ab8da44d75b\" (UID: \"091d8ecb-934d-4c4c-ae6c-5ab8da44d75b\") " Oct 08 22:56:46 crc kubenswrapper[4739]: I1008 22:56:46.242929 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/091d8ecb-934d-4c4c-ae6c-5ab8da44d75b-utilities" (OuterVolumeSpecName: "utilities") pod "091d8ecb-934d-4c4c-ae6c-5ab8da44d75b" (UID: "091d8ecb-934d-4c4c-ae6c-5ab8da44d75b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:56:46 crc kubenswrapper[4739]: I1008 22:56:46.248405 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/091d8ecb-934d-4c4c-ae6c-5ab8da44d75b-kube-api-access-w2k5j" (OuterVolumeSpecName: "kube-api-access-w2k5j") pod "091d8ecb-934d-4c4c-ae6c-5ab8da44d75b" (UID: "091d8ecb-934d-4c4c-ae6c-5ab8da44d75b"). InnerVolumeSpecName "kube-api-access-w2k5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:56:46 crc kubenswrapper[4739]: I1008 22:56:46.264136 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/091d8ecb-934d-4c4c-ae6c-5ab8da44d75b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "091d8ecb-934d-4c4c-ae6c-5ab8da44d75b" (UID: "091d8ecb-934d-4c4c-ae6c-5ab8da44d75b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:56:46 crc kubenswrapper[4739]: I1008 22:56:46.343912 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2k5j\" (UniqueName: \"kubernetes.io/projected/091d8ecb-934d-4c4c-ae6c-5ab8da44d75b-kube-api-access-w2k5j\") on node \"crc\" DevicePath \"\"" Oct 08 22:56:46 crc kubenswrapper[4739]: I1008 22:56:46.343946 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/091d8ecb-934d-4c4c-ae6c-5ab8da44d75b-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:56:46 crc kubenswrapper[4739]: I1008 22:56:46.343956 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/091d8ecb-934d-4c4c-ae6c-5ab8da44d75b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:56:46 crc kubenswrapper[4739]: I1008 22:56:46.525381 4739 generic.go:334] "Generic (PLEG): container finished" podID="091d8ecb-934d-4c4c-ae6c-5ab8da44d75b" 
containerID="acf4ad465eff054c6ac71a0ca3e94dbda6541ba77007903b60f7fceb88c5d2fb" exitCode=0 Oct 08 22:56:46 crc kubenswrapper[4739]: I1008 22:56:46.525422 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-45zxr" event={"ID":"091d8ecb-934d-4c4c-ae6c-5ab8da44d75b","Type":"ContainerDied","Data":"acf4ad465eff054c6ac71a0ca3e94dbda6541ba77007903b60f7fceb88c5d2fb"} Oct 08 22:56:46 crc kubenswrapper[4739]: I1008 22:56:46.525448 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-45zxr" event={"ID":"091d8ecb-934d-4c4c-ae6c-5ab8da44d75b","Type":"ContainerDied","Data":"83545a0a097a16531a0dd0d44781c1fc4dc9c81b34c6a5c448ea5f6d5aed8bcb"} Oct 08 22:56:46 crc kubenswrapper[4739]: I1008 22:56:46.525457 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-45zxr" Oct 08 22:56:46 crc kubenswrapper[4739]: I1008 22:56:46.525464 4739 scope.go:117] "RemoveContainer" containerID="acf4ad465eff054c6ac71a0ca3e94dbda6541ba77007903b60f7fceb88c5d2fb" Oct 08 22:56:46 crc kubenswrapper[4739]: I1008 22:56:46.561054 4739 scope.go:117] "RemoveContainer" containerID="b1c2c37550a759f03c4d757fb7f5ec78424c8ed536a87fc51c16f5a774a6a547" Oct 08 22:56:46 crc kubenswrapper[4739]: I1008 22:56:46.567011 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-45zxr"] Oct 08 22:56:46 crc kubenswrapper[4739]: I1008 22:56:46.576833 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-45zxr"] Oct 08 22:56:46 crc kubenswrapper[4739]: I1008 22:56:46.584282 4739 scope.go:117] "RemoveContainer" containerID="47d79a7f9968d8ea999daba7798571c2a5dc70a3322c550862a16ed9b7ea1992" Oct 08 22:56:46 crc kubenswrapper[4739]: I1008 22:56:46.635910 4739 scope.go:117] "RemoveContainer" containerID="acf4ad465eff054c6ac71a0ca3e94dbda6541ba77007903b60f7fceb88c5d2fb" Oct 08 
22:56:46 crc kubenswrapper[4739]: E1008 22:56:46.636471 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acf4ad465eff054c6ac71a0ca3e94dbda6541ba77007903b60f7fceb88c5d2fb\": container with ID starting with acf4ad465eff054c6ac71a0ca3e94dbda6541ba77007903b60f7fceb88c5d2fb not found: ID does not exist" containerID="acf4ad465eff054c6ac71a0ca3e94dbda6541ba77007903b60f7fceb88c5d2fb" Oct 08 22:56:46 crc kubenswrapper[4739]: I1008 22:56:46.636525 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acf4ad465eff054c6ac71a0ca3e94dbda6541ba77007903b60f7fceb88c5d2fb"} err="failed to get container status \"acf4ad465eff054c6ac71a0ca3e94dbda6541ba77007903b60f7fceb88c5d2fb\": rpc error: code = NotFound desc = could not find container \"acf4ad465eff054c6ac71a0ca3e94dbda6541ba77007903b60f7fceb88c5d2fb\": container with ID starting with acf4ad465eff054c6ac71a0ca3e94dbda6541ba77007903b60f7fceb88c5d2fb not found: ID does not exist" Oct 08 22:56:46 crc kubenswrapper[4739]: I1008 22:56:46.636562 4739 scope.go:117] "RemoveContainer" containerID="b1c2c37550a759f03c4d757fb7f5ec78424c8ed536a87fc51c16f5a774a6a547" Oct 08 22:56:46 crc kubenswrapper[4739]: E1008 22:56:46.637039 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1c2c37550a759f03c4d757fb7f5ec78424c8ed536a87fc51c16f5a774a6a547\": container with ID starting with b1c2c37550a759f03c4d757fb7f5ec78424c8ed536a87fc51c16f5a774a6a547 not found: ID does not exist" containerID="b1c2c37550a759f03c4d757fb7f5ec78424c8ed536a87fc51c16f5a774a6a547" Oct 08 22:56:46 crc kubenswrapper[4739]: I1008 22:56:46.637078 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1c2c37550a759f03c4d757fb7f5ec78424c8ed536a87fc51c16f5a774a6a547"} err="failed to get container status 
\"b1c2c37550a759f03c4d757fb7f5ec78424c8ed536a87fc51c16f5a774a6a547\": rpc error: code = NotFound desc = could not find container \"b1c2c37550a759f03c4d757fb7f5ec78424c8ed536a87fc51c16f5a774a6a547\": container with ID starting with b1c2c37550a759f03c4d757fb7f5ec78424c8ed536a87fc51c16f5a774a6a547 not found: ID does not exist" Oct 08 22:56:46 crc kubenswrapper[4739]: I1008 22:56:46.637101 4739 scope.go:117] "RemoveContainer" containerID="47d79a7f9968d8ea999daba7798571c2a5dc70a3322c550862a16ed9b7ea1992" Oct 08 22:56:46 crc kubenswrapper[4739]: E1008 22:56:46.637423 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47d79a7f9968d8ea999daba7798571c2a5dc70a3322c550862a16ed9b7ea1992\": container with ID starting with 47d79a7f9968d8ea999daba7798571c2a5dc70a3322c550862a16ed9b7ea1992 not found: ID does not exist" containerID="47d79a7f9968d8ea999daba7798571c2a5dc70a3322c550862a16ed9b7ea1992" Oct 08 22:56:46 crc kubenswrapper[4739]: I1008 22:56:46.637494 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47d79a7f9968d8ea999daba7798571c2a5dc70a3322c550862a16ed9b7ea1992"} err="failed to get container status \"47d79a7f9968d8ea999daba7798571c2a5dc70a3322c550862a16ed9b7ea1992\": rpc error: code = NotFound desc = could not find container \"47d79a7f9968d8ea999daba7798571c2a5dc70a3322c550862a16ed9b7ea1992\": container with ID starting with 47d79a7f9968d8ea999daba7798571c2a5dc70a3322c550862a16ed9b7ea1992 not found: ID does not exist" Oct 08 22:56:47 crc kubenswrapper[4739]: I1008 22:56:47.826827 4739 scope.go:117] "RemoveContainer" containerID="798afc98cb7bcde18fde2f74c09601ba5107495b2a2a7e64ab302b80311e0904" Oct 08 22:56:47 crc kubenswrapper[4739]: E1008 22:56:47.830553 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:56:47 crc kubenswrapper[4739]: I1008 22:56:47.868085 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="091d8ecb-934d-4c4c-ae6c-5ab8da44d75b" path="/var/lib/kubelet/pods/091d8ecb-934d-4c4c-ae6c-5ab8da44d75b/volumes" Oct 08 22:57:02 crc kubenswrapper[4739]: I1008 22:57:02.338208 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a/init-config-reloader/0.log" Oct 08 22:57:02 crc kubenswrapper[4739]: I1008 22:57:02.552607 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a/init-config-reloader/0.log" Oct 08 22:57:02 crc kubenswrapper[4739]: I1008 22:57:02.597360 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a/alertmanager/0.log" Oct 08 22:57:02 crc kubenswrapper[4739]: I1008 22:57:02.603179 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_45e462d7-dbfe-4f05-a7d2-7bb97ca5a74a/config-reloader/0.log" Oct 08 22:57:02 crc kubenswrapper[4739]: I1008 22:57:02.761756 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-56dcfd46c8-rpb55_9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59/barbican-api/0.log" Oct 08 22:57:02 crc kubenswrapper[4739]: I1008 22:57:02.806208 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-56dcfd46c8-rpb55_9b1e3799-0c6d-4e4c-ac0d-f6b0d9241e59/barbican-api-log/0.log" Oct 08 22:57:02 crc kubenswrapper[4739]: I1008 22:57:02.822404 4739 scope.go:117] "RemoveContainer" 
containerID="798afc98cb7bcde18fde2f74c09601ba5107495b2a2a7e64ab302b80311e0904" Oct 08 22:57:02 crc kubenswrapper[4739]: E1008 22:57:02.822611 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:57:02 crc kubenswrapper[4739]: I1008 22:57:02.878304 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-648fb84fdb-qmfb7_a8e9e5fe-49e3-4fea-8a8e-b853c479ce94/barbican-keystone-listener/0.log" Oct 08 22:57:03 crc kubenswrapper[4739]: I1008 22:57:03.053923 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-66b4c9b85f-r8lds_1802465b-168a-449f-b8db-224a426d90ad/barbican-worker/0.log" Oct 08 22:57:03 crc kubenswrapper[4739]: I1008 22:57:03.055811 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-66b4c9b85f-r8lds_1802465b-168a-449f-b8db-224a426d90ad/barbican-worker-log/0.log" Oct 08 22:57:03 crc kubenswrapper[4739]: I1008 22:57:03.068648 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-648fb84fdb-qmfb7_a8e9e5fe-49e3-4fea-8a8e-b853c479ce94/barbican-keystone-listener-log/0.log" Oct 08 22:57:03 crc kubenswrapper[4739]: I1008 22:57:03.278311 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-jbzm7_bd1f9e00-5ba4-4aa0-b38c-8610f396af0b/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:57:03 crc kubenswrapper[4739]: I1008 22:57:03.331587 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_e7a33944-ef03-44c5-91a6-45cf63c795f8/ceilometer-central-agent/0.log" Oct 08 22:57:03 crc kubenswrapper[4739]: I1008 22:57:03.423707 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e7a33944-ef03-44c5-91a6-45cf63c795f8/ceilometer-notification-agent/0.log" Oct 08 22:57:03 crc kubenswrapper[4739]: I1008 22:57:03.446400 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e7a33944-ef03-44c5-91a6-45cf63c795f8/proxy-httpd/0.log" Oct 08 22:57:03 crc kubenswrapper[4739]: I1008 22:57:03.483539 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e7a33944-ef03-44c5-91a6-45cf63c795f8/sg-core/0.log" Oct 08 22:57:03 crc kubenswrapper[4739]: I1008 22:57:03.627286 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9/cinder-api-log/0.log" Oct 08 22:57:03 crc kubenswrapper[4739]: I1008 22:57:03.691997 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_60e3dd3d-1e46-4b2a-b71b-fbfb5d7a17a9/cinder-api/0.log" Oct 08 22:57:03 crc kubenswrapper[4739]: I1008 22:57:03.798558 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_2c5f9170-35c8-4e75-ba48-955a58e56e3f/cinder-scheduler/0.log" Oct 08 22:57:03 crc kubenswrapper[4739]: I1008 22:57:03.861135 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_2c5f9170-35c8-4e75-ba48-955a58e56e3f/probe/0.log" Oct 08 22:57:03 crc kubenswrapper[4739]: I1008 22:57:03.998908 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_43526663-2258-4b39-909c-1c52b4e217de/cloudkitty-api-log/0.log" Oct 08 22:57:04 crc kubenswrapper[4739]: I1008 22:57:04.045366 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cloudkitty-api-0_43526663-2258-4b39-909c-1c52b4e217de/cloudkitty-api/0.log" Oct 08 22:57:04 crc kubenswrapper[4739]: I1008 22:57:04.148846 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-compactor-0_02de38f3-1c70-4314-8dd7-4b5612c4348f/loki-compactor/0.log" Oct 08 22:57:04 crc kubenswrapper[4739]: I1008 22:57:04.273495 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-distributor-56cd74f89f-5dccl_cd624632-67d1-48e1-8c43-fa58f5d2e5ea/loki-distributor/0.log" Oct 08 22:57:04 crc kubenswrapper[4739]: I1008 22:57:04.318920 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-76cc998948-czjfz_4b1ae118-cd96-4e60-997d-9594acff7531/gateway/0.log" Oct 08 22:57:04 crc kubenswrapper[4739]: I1008 22:57:04.490584 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-76cc998948-nf2wl_0663f463-0160-4cc2-bad3-389baee708da/gateway/0.log" Oct 08 22:57:04 crc kubenswrapper[4739]: I1008 22:57:04.559002 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-index-gateway-0_30b9a3e4-bb8b-4ad4-9012-dc8a9250ebf4/loki-index-gateway/0.log" Oct 08 22:57:04 crc kubenswrapper[4739]: I1008 22:57:04.733425 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-ingester-0_9d41d47c-0875-4283-908d-559995e5069e/loki-ingester/0.log" Oct 08 22:57:04 crc kubenswrapper[4739]: I1008 22:57:04.768008 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-querier-68bbd7984c-65fx4_f8016740-3857-4c88-81a3-6ee47b7e2a75/loki-querier/0.log" Oct 08 22:57:04 crc kubenswrapper[4739]: I1008 22:57:04.920784 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cloudkitty-lokistack-query-frontend-779849886d-v9qwb_137f65de-3030-4c4b-a087-c547dd183105/loki-query-frontend/0.log" Oct 08 22:57:05 crc kubenswrapper[4739]: I1008 22:57:05.279601 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-dzlbg_b87a15bb-7744-4904-91b9-9f8052912033/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:57:05 crc kubenswrapper[4739]: I1008 22:57:05.524717 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-proc-0_5de0557c-aa06-41d3-8d90-76d22496c164/cloudkitty-proc/0.log" Oct 08 22:57:05 crc kubenswrapper[4739]: I1008 22:57:05.540770 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-82kg2_23d9f12a-1fa5-4635-a3f5-5b3ba0e1ae2f/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:57:05 crc kubenswrapper[4739]: I1008 22:57:05.637753 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-grlzd_a3f415ab-75ff-469e-84f4-5d2e9f4053e2/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:57:05 crc kubenswrapper[4739]: I1008 22:57:05.722706 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-nnbsx_b7adc6ab-b111-4d2a-a0f3-a1b50e53df52/init/0.log" Oct 08 22:57:05 crc kubenswrapper[4739]: I1008 22:57:05.869418 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-nnbsx_b7adc6ab-b111-4d2a-a0f3-a1b50e53df52/init/0.log" Oct 08 22:57:05 crc kubenswrapper[4739]: I1008 22:57:05.917415 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-nnbsx_b7adc6ab-b111-4d2a-a0f3-a1b50e53df52/dnsmasq-dns/0.log" Oct 08 22:57:05 crc kubenswrapper[4739]: I1008 22:57:05.937022 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-rjvx9_b62c229c-107a-42de-8501-b52ae4c47f9f/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:57:06 crc kubenswrapper[4739]: I1008 22:57:06.083763 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_66c23737-b27f-4ba2-9291-b2d0f3aa5020/glance-httpd/0.log" Oct 08 22:57:06 crc kubenswrapper[4739]: I1008 22:57:06.101330 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_66c23737-b27f-4ba2-9291-b2d0f3aa5020/glance-log/0.log" Oct 08 22:57:06 crc kubenswrapper[4739]: I1008 22:57:06.196960 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_a00e6724-633b-4d60-9781-206e078a6dca/glance-httpd/0.log" Oct 08 22:57:06 crc kubenswrapper[4739]: I1008 22:57:06.210437 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_a00e6724-633b-4d60-9781-206e078a6dca/glance-log/0.log" Oct 08 22:57:06 crc kubenswrapper[4739]: I1008 22:57:06.271595 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-f6s8s_f2642ecf-dc6d-4f4e-94e7-2f76db914748/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:57:06 crc kubenswrapper[4739]: I1008 22:57:06.498516 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-4kldb_9b1fcab8-e84d-433d-ac57-62a00dc6f557/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:57:06 crc kubenswrapper[4739]: I1008 22:57:06.757126 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-wthng_076ace7f-41ce-4825-9d2f-e49471648888/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:57:06 crc kubenswrapper[4739]: I1008 22:57:06.930919 4739 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_keystone-55468d9c4f-z8pn5_7923886e-2cbf-489b-aabd-aa49c710fbf0/keystone-api/0.log" Oct 08 22:57:07 crc kubenswrapper[4739]: I1008 22:57:07.080265 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dc6d4cfc5-7ks2h_f2a09a54-dd22-4b47-b5bd-49685c152d9f/neutron-httpd/0.log" Oct 08 22:57:07 crc kubenswrapper[4739]: I1008 22:57:07.201733 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dc6d4cfc5-7ks2h_f2a09a54-dd22-4b47-b5bd-49685c152d9f/neutron-api/0.log" Oct 08 22:57:07 crc kubenswrapper[4739]: I1008 22:57:07.207454 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-fkj2f_40321558-aaa1-4ba3-8417-69c969745cfa/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:57:07 crc kubenswrapper[4739]: I1008 22:57:07.760843 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_16fa8c46-856e-465c-bd60-fb13b76e5079/nova-api-log/0.log" Oct 08 22:57:07 crc kubenswrapper[4739]: I1008 22:57:07.800264 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_383c8e08-0c0f-41fb-9574-cfa23aa2aad5/nova-cell0-conductor-conductor/0.log" Oct 08 22:57:08 crc kubenswrapper[4739]: I1008 22:57:08.118605 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_48ac9d79-441d-4277-bf99-a8dc4ec2213c/nova-cell1-conductor-conductor/0.log" Oct 08 22:57:08 crc kubenswrapper[4739]: I1008 22:57:08.194732 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_b2183def-3ac0-434f-bca8-dfd66210d7ab/nova-cell1-novncproxy-novncproxy/0.log" Oct 08 22:57:08 crc kubenswrapper[4739]: I1008 22:57:08.248857 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_16fa8c46-856e-465c-bd60-fb13b76e5079/nova-api-api/0.log" Oct 08 
22:57:08 crc kubenswrapper[4739]: I1008 22:57:08.385465 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-s2cpg_02cc9be0-080a-4ef8-a438-18607a5c7da4/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:57:08 crc kubenswrapper[4739]: I1008 22:57:08.584330 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_99220499-d612-49e9-a7f1-622280a12221/nova-metadata-log/0.log" Oct 08 22:57:08 crc kubenswrapper[4739]: I1008 22:57:08.899618 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e2532902-2058-4c79-b612-fd2737190f3e/mysql-bootstrap/0.log" Oct 08 22:57:08 crc kubenswrapper[4739]: I1008 22:57:08.930663 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_08b3472b-4c6f-4c1a-83b2-6c176dbbcbf3/nova-scheduler-scheduler/0.log" Oct 08 22:57:09 crc kubenswrapper[4739]: I1008 22:57:09.068108 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e2532902-2058-4c79-b612-fd2737190f3e/mysql-bootstrap/0.log" Oct 08 22:57:09 crc kubenswrapper[4739]: I1008 22:57:09.081469 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e2532902-2058-4c79-b612-fd2737190f3e/galera/0.log" Oct 08 22:57:09 crc kubenswrapper[4739]: I1008 22:57:09.298500 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3c6fc5d3-c48a-4d83-97f8-38d56264d769/mysql-bootstrap/0.log" Oct 08 22:57:09 crc kubenswrapper[4739]: I1008 22:57:09.492455 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3c6fc5d3-c48a-4d83-97f8-38d56264d769/galera/0.log" Oct 08 22:57:09 crc kubenswrapper[4739]: I1008 22:57:09.514067 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3c6fc5d3-c48a-4d83-97f8-38d56264d769/mysql-bootstrap/0.log" Oct 
08 22:57:09 crc kubenswrapper[4739]: I1008 22:57:09.677271 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_9e972dc2-2718-4dcd-a49a-9d3199e95d61/openstackclient/0.log" Oct 08 22:57:09 crc kubenswrapper[4739]: I1008 22:57:09.754051 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-hld4g_bc4ea068-4061-435b-8e62-11b14a3e1ec4/openstack-network-exporter/0.log" Oct 08 22:57:09 crc kubenswrapper[4739]: I1008 22:57:09.894601 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_99220499-d612-49e9-a7f1-622280a12221/nova-metadata-metadata/0.log" Oct 08 22:57:09 crc kubenswrapper[4739]: I1008 22:57:09.909686 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-mj9gb_0e7dd9f9-b0ec-4795-9ad5-d4787becd6fa/ovn-controller/0.log" Oct 08 22:57:10 crc kubenswrapper[4739]: I1008 22:57:10.097071 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-92kk7_eacfa01f-eb31-40c2-a163-3356c30772e3/ovsdb-server-init/0.log" Oct 08 22:57:10 crc kubenswrapper[4739]: I1008 22:57:10.315797 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-92kk7_eacfa01f-eb31-40c2-a163-3356c30772e3/ovsdb-server/0.log" Oct 08 22:57:10 crc kubenswrapper[4739]: I1008 22:57:10.327507 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-92kk7_eacfa01f-eb31-40c2-a163-3356c30772e3/ovs-vswitchd/0.log" Oct 08 22:57:10 crc kubenswrapper[4739]: I1008 22:57:10.346752 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-92kk7_eacfa01f-eb31-40c2-a163-3356c30772e3/ovsdb-server-init/0.log" Oct 08 22:57:10 crc kubenswrapper[4739]: I1008 22:57:10.578202 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_9c19423c-cec2-4fbf-b2bf-97a99db03043/openstack-network-exporter/0.log" Oct 08 
22:57:10 crc kubenswrapper[4739]: I1008 22:57:10.580973 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-zh4p8_97d1ee4d-475f-4607-b01d-3d51e6ab179e/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:57:10 crc kubenswrapper[4739]: I1008 22:57:10.624066 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_9c19423c-cec2-4fbf-b2bf-97a99db03043/ovn-northd/0.log" Oct 08 22:57:10 crc kubenswrapper[4739]: I1008 22:57:10.772748 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_00537745-c30b-4fa9-be09-0edb09ff7138/ovsdbserver-nb/0.log" Oct 08 22:57:10 crc kubenswrapper[4739]: I1008 22:57:10.835985 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_00537745-c30b-4fa9-be09-0edb09ff7138/openstack-network-exporter/0.log" Oct 08 22:57:11 crc kubenswrapper[4739]: I1008 22:57:11.016364 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_15d5a814-0c23-4e0f-b750-9f886dc130b6/openstack-network-exporter/0.log" Oct 08 22:57:11 crc kubenswrapper[4739]: I1008 22:57:11.043289 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_15d5a814-0c23-4e0f-b750-9f886dc130b6/ovsdbserver-sb/0.log" Oct 08 22:57:11 crc kubenswrapper[4739]: I1008 22:57:11.208460 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6c88568bb8-rh6ln_4de51af0-00c3-4a08-a13b-819a118cb604/placement-api/0.log" Oct 08 22:57:11 crc kubenswrapper[4739]: I1008 22:57:11.303389 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_0bc93384-c08a-4c7f-9dc4-318126297a8b/init-config-reloader/0.log" Oct 08 22:57:11 crc kubenswrapper[4739]: I1008 22:57:11.351071 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_placement-6c88568bb8-rh6ln_4de51af0-00c3-4a08-a13b-819a118cb604/placement-log/0.log" Oct 08 22:57:11 crc kubenswrapper[4739]: I1008 22:57:11.572677 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_0bc93384-c08a-4c7f-9dc4-318126297a8b/init-config-reloader/0.log" Oct 08 22:57:11 crc kubenswrapper[4739]: I1008 22:57:11.575641 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_0bc93384-c08a-4c7f-9dc4-318126297a8b/prometheus/0.log" Oct 08 22:57:11 crc kubenswrapper[4739]: I1008 22:57:11.584771 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_0bc93384-c08a-4c7f-9dc4-318126297a8b/config-reloader/0.log" Oct 08 22:57:11 crc kubenswrapper[4739]: I1008 22:57:11.616052 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_0bc93384-c08a-4c7f-9dc4-318126297a8b/thanos-sidecar/0.log" Oct 08 22:57:11 crc kubenswrapper[4739]: I1008 22:57:11.775638 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_744c6598-a814-45c7-bf47-5fe0b5b48c5e/setup-container/0.log" Oct 08 22:57:11 crc kubenswrapper[4739]: I1008 22:57:11.951678 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_744c6598-a814-45c7-bf47-5fe0b5b48c5e/setup-container/0.log" Oct 08 22:57:12 crc kubenswrapper[4739]: I1008 22:57:12.005984 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb/setup-container/0.log" Oct 08 22:57:12 crc kubenswrapper[4739]: I1008 22:57:12.052835 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_744c6598-a814-45c7-bf47-5fe0b5b48c5e/rabbitmq/0.log" Oct 08 22:57:12 crc kubenswrapper[4739]: I1008 22:57:12.245161 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb/setup-container/0.log" Oct 08 22:57:12 crc kubenswrapper[4739]: I1008 22:57:12.309520 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-wmv48_8ad88d67-b089-4777-be25-7c61f66c18c7/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:57:12 crc kubenswrapper[4739]: I1008 22:57:12.322321 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f4b843f8-cdaf-4b40-a1c2-b1b29c8cc0eb/rabbitmq/0.log" Oct 08 22:57:12 crc kubenswrapper[4739]: I1008 22:57:12.469572 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-m9wkc_e04c955f-d97c-4cc9-a01e-3d4d2b59de12/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:57:12 crc kubenswrapper[4739]: I1008 22:57:12.544519 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-bhkbr_5536fb34-0051-4845-98e8-050b8870274d/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:57:12 crc kubenswrapper[4739]: I1008 22:57:12.803091 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-p2w6r_3b70364c-a814-4e53-afb8-693faa5063ec/ssh-known-hosts-edpm-deployment/0.log" Oct 08 22:57:12 crc kubenswrapper[4739]: I1008 22:57:12.804007 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-8nvgs_373291e0-4568-47e1-a71f-b2f005e5e557/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:57:13 crc kubenswrapper[4739]: I1008 22:57:13.046594 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5c89ccbcd7-dlxrn_04e3fccb-ef13-4d04-9310-e1aec36adefe/proxy-server/0.log" Oct 08 22:57:13 crc kubenswrapper[4739]: I1008 22:57:13.187841 4739 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_swift-ring-rebalance-djtdv_2b6bdd10-ace2-453a-b0c9-d89051620215/swift-ring-rebalance/0.log" Oct 08 22:57:13 crc kubenswrapper[4739]: I1008 22:57:13.238633 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5c89ccbcd7-dlxrn_04e3fccb-ef13-4d04-9310-e1aec36adefe/proxy-httpd/0.log" Oct 08 22:57:13 crc kubenswrapper[4739]: I1008 22:57:13.265332 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_229f0c98-b6d6-415b-b34a-6ffcd2a0ed52/account-auditor/0.log" Oct 08 22:57:13 crc kubenswrapper[4739]: I1008 22:57:13.457128 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_229f0c98-b6d6-415b-b34a-6ffcd2a0ed52/account-server/0.log" Oct 08 22:57:13 crc kubenswrapper[4739]: I1008 22:57:13.462735 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_229f0c98-b6d6-415b-b34a-6ffcd2a0ed52/account-replicator/0.log" Oct 08 22:57:13 crc kubenswrapper[4739]: I1008 22:57:13.481420 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_229f0c98-b6d6-415b-b34a-6ffcd2a0ed52/container-auditor/0.log" Oct 08 22:57:13 crc kubenswrapper[4739]: I1008 22:57:13.500975 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_229f0c98-b6d6-415b-b34a-6ffcd2a0ed52/account-reaper/0.log" Oct 08 22:57:13 crc kubenswrapper[4739]: I1008 22:57:13.684076 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_229f0c98-b6d6-415b-b34a-6ffcd2a0ed52/container-server/0.log" Oct 08 22:57:13 crc kubenswrapper[4739]: I1008 22:57:13.689527 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_229f0c98-b6d6-415b-b34a-6ffcd2a0ed52/container-replicator/0.log" Oct 08 22:57:13 crc kubenswrapper[4739]: I1008 22:57:13.706730 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_229f0c98-b6d6-415b-b34a-6ffcd2a0ed52/container-updater/0.log" Oct 08 22:57:13 crc kubenswrapper[4739]: I1008 22:57:13.758872 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_229f0c98-b6d6-415b-b34a-6ffcd2a0ed52/object-auditor/0.log" Oct 08 22:57:13 crc kubenswrapper[4739]: I1008 22:57:13.863201 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_229f0c98-b6d6-415b-b34a-6ffcd2a0ed52/object-expirer/0.log" Oct 08 22:57:13 crc kubenswrapper[4739]: I1008 22:57:13.885184 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_229f0c98-b6d6-415b-b34a-6ffcd2a0ed52/object-replicator/0.log" Oct 08 22:57:13 crc kubenswrapper[4739]: I1008 22:57:13.967238 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_229f0c98-b6d6-415b-b34a-6ffcd2a0ed52/object-server/0.log" Oct 08 22:57:14 crc kubenswrapper[4739]: I1008 22:57:14.023005 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_229f0c98-b6d6-415b-b34a-6ffcd2a0ed52/object-updater/0.log" Oct 08 22:57:14 crc kubenswrapper[4739]: I1008 22:57:14.050861 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_229f0c98-b6d6-415b-b34a-6ffcd2a0ed52/rsync/0.log" Oct 08 22:57:14 crc kubenswrapper[4739]: I1008 22:57:14.147745 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_229f0c98-b6d6-415b-b34a-6ffcd2a0ed52/swift-recon-cron/0.log" Oct 08 22:57:14 crc kubenswrapper[4739]: I1008 22:57:14.334170 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-br4sd_76b5c31e-7a34-42b9-9ad1-4f21fc560df3/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:57:14 crc kubenswrapper[4739]: I1008 22:57:14.453597 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_tempest-tests-tempest_def311ef-12ca-4d1a-972f-f2d72707a804/tempest-tests-tempest-tests-runner/0.log" Oct 08 22:57:14 crc kubenswrapper[4739]: I1008 22:57:14.610844 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_cd1babc0-3114-448d-b296-2c2680e08553/test-operator-logs-container/0.log" Oct 08 22:57:14 crc kubenswrapper[4739]: I1008 22:57:14.695630 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-kxrmb_4b344b99-c3a5-4d79-ad85-b8589d6489b0/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 08 22:57:15 crc kubenswrapper[4739]: I1008 22:57:15.824779 4739 scope.go:117] "RemoveContainer" containerID="798afc98cb7bcde18fde2f74c09601ba5107495b2a2a7e64ab302b80311e0904" Oct 08 22:57:15 crc kubenswrapper[4739]: E1008 22:57:15.825305 4739 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwvs2_openshift-machine-config-operator(9707b708-016c-4e06-86db-0332e2ca37db)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" Oct 08 22:57:23 crc kubenswrapper[4739]: I1008 22:57:23.895391 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_b6da1726-555b-4905-b565-611392fb8e67/memcached/0.log" Oct 08 22:57:26 crc kubenswrapper[4739]: I1008 22:57:26.822758 4739 scope.go:117] "RemoveContainer" containerID="798afc98cb7bcde18fde2f74c09601ba5107495b2a2a7e64ab302b80311e0904" Oct 08 22:57:27 crc kubenswrapper[4739]: I1008 22:57:27.946432 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" 
event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerStarted","Data":"61ebb047d55ba2ada03295b962b140741fadb555af57a367f41a3bf5f65633ac"} Oct 08 22:57:42 crc kubenswrapper[4739]: I1008 22:57:42.453772 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-2c9sc_4ecad090-144b-491d-9307-dd0d2db07490/manager/0.log" Oct 08 22:57:42 crc kubenswrapper[4739]: I1008 22:57:42.473914 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-2c9sc_4ecad090-144b-491d-9307-dd0d2db07490/kube-rbac-proxy/0.log" Oct 08 22:57:42 crc kubenswrapper[4739]: I1008 22:57:42.594530 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-zk8hf_5aa8c8d1-9588-4e0f-87e2-b44b072bef76/kube-rbac-proxy/0.log" Oct 08 22:57:42 crc kubenswrapper[4739]: I1008 22:57:42.664503 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-zk8hf_5aa8c8d1-9588-4e0f-87e2-b44b072bef76/manager/0.log" Oct 08 22:57:42 crc kubenswrapper[4739]: I1008 22:57:42.737351 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-qhqv6_17ff0c7d-5595-4d2f-b77d-0f6114746fae/kube-rbac-proxy/0.log" Oct 08 22:57:42 crc kubenswrapper[4739]: I1008 22:57:42.856230 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-qhqv6_17ff0c7d-5595-4d2f-b77d-0f6114746fae/manager/0.log" Oct 08 22:57:42 crc kubenswrapper[4739]: I1008 22:57:42.935121 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt_5bf9c162-c45e-48c6-9415-4f6e218895c0/util/0.log" Oct 08 22:57:43 crc kubenswrapper[4739]: I1008 
22:57:43.119249 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt_5bf9c162-c45e-48c6-9415-4f6e218895c0/pull/0.log" Oct 08 22:57:43 crc kubenswrapper[4739]: I1008 22:57:43.139495 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt_5bf9c162-c45e-48c6-9415-4f6e218895c0/util/0.log" Oct 08 22:57:43 crc kubenswrapper[4739]: I1008 22:57:43.151775 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt_5bf9c162-c45e-48c6-9415-4f6e218895c0/pull/0.log" Oct 08 22:57:43 crc kubenswrapper[4739]: I1008 22:57:43.295662 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt_5bf9c162-c45e-48c6-9415-4f6e218895c0/pull/0.log" Oct 08 22:57:43 crc kubenswrapper[4739]: I1008 22:57:43.333933 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt_5bf9c162-c45e-48c6-9415-4f6e218895c0/extract/0.log" Oct 08 22:57:43 crc kubenswrapper[4739]: I1008 22:57:43.341786 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f3a0c43d4dc9f2926144c62a74a50331fa7eea137e7cbeb4b439a6e364s8gpt_5bf9c162-c45e-48c6-9415-4f6e218895c0/util/0.log" Oct 08 22:57:43 crc kubenswrapper[4739]: I1008 22:57:43.484805 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-wqmv7_8302b913-c934-4911-8c78-72d139019f33/kube-rbac-proxy/0.log" Oct 08 22:57:43 crc kubenswrapper[4739]: I1008 22:57:43.607190 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-xnfkp_fae90c53-9891-4664-8767-98bfab1e021a/kube-rbac-proxy/0.log" Oct 08 22:57:43 crc kubenswrapper[4739]: I1008 22:57:43.644376 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-wqmv7_8302b913-c934-4911-8c78-72d139019f33/manager/0.log" Oct 08 22:57:43 crc kubenswrapper[4739]: I1008 22:57:43.695755 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-xnfkp_fae90c53-9891-4664-8767-98bfab1e021a/manager/0.log" Oct 08 22:57:43 crc kubenswrapper[4739]: I1008 22:57:43.796299 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-8djd9_dcd83df0-3381-4d08-9818-e7f91ba6f77b/kube-rbac-proxy/0.log" Oct 08 22:57:43 crc kubenswrapper[4739]: I1008 22:57:43.824180 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-8djd9_dcd83df0-3381-4d08-9818-e7f91ba6f77b/manager/0.log" Oct 08 22:57:43 crc kubenswrapper[4739]: I1008 22:57:43.973498 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-f7hv4_c11fdec0-87d4-41db-b5d3-66155b578abe/kube-rbac-proxy/0.log" Oct 08 22:57:44 crc kubenswrapper[4739]: I1008 22:57:44.134986 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-f7hv4_c11fdec0-87d4-41db-b5d3-66155b578abe/manager/0.log" Oct 08 22:57:44 crc kubenswrapper[4739]: I1008 22:57:44.281686 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-6kp4x_f038b58d-e69a-481d-a0df-65211386c9da/kube-rbac-proxy/0.log" Oct 08 22:57:44 crc kubenswrapper[4739]: I1008 22:57:44.364100 4739 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-6kp4x_f038b58d-e69a-481d-a0df-65211386c9da/manager/0.log" Oct 08 22:57:44 crc kubenswrapper[4739]: I1008 22:57:44.474646 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-9ls4w_58ceaa51-6704-4f1d-8aa0-2053f1c7c89d/kube-rbac-proxy/0.log" Oct 08 22:57:44 crc kubenswrapper[4739]: I1008 22:57:44.521430 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-9ls4w_58ceaa51-6704-4f1d-8aa0-2053f1c7c89d/manager/0.log" Oct 08 22:57:44 crc kubenswrapper[4739]: I1008 22:57:44.602023 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-l66d4_cf67473b-9a22-492a-844d-552fc946605d/kube-rbac-proxy/0.log" Oct 08 22:57:44 crc kubenswrapper[4739]: I1008 22:57:44.676511 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-l66d4_cf67473b-9a22-492a-844d-552fc946605d/manager/0.log" Oct 08 22:57:44 crc kubenswrapper[4739]: I1008 22:57:44.788113 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-2sv26_102e8f33-2000-4a71-a337-4fa304d59e93/kube-rbac-proxy/0.log" Oct 08 22:57:44 crc kubenswrapper[4739]: I1008 22:57:44.817280 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-2sv26_102e8f33-2000-4a71-a337-4fa304d59e93/manager/0.log" Oct 08 22:57:44 crc kubenswrapper[4739]: I1008 22:57:44.945460 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-fbhjp_7e483c3d-debb-4f41-a968-0d19d337e771/kube-rbac-proxy/0.log" Oct 08 22:57:44 crc 
kubenswrapper[4739]: I1008 22:57:44.971687 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-fbhjp_7e483c3d-debb-4f41-a968-0d19d337e771/manager/0.log" Oct 08 22:57:45 crc kubenswrapper[4739]: I1008 22:57:45.078928 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-8d2bb_23d030c7-6a61-4ba5-9b00-018f7370ea5d/kube-rbac-proxy/0.log" Oct 08 22:57:45 crc kubenswrapper[4739]: I1008 22:57:45.193133 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-8d2bb_23d030c7-6a61-4ba5-9b00-018f7370ea5d/manager/0.log" Oct 08 22:57:45 crc kubenswrapper[4739]: I1008 22:57:45.223934 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-wz5dg_d0a97575-d460-410d-84aa-887e6d809bba/kube-rbac-proxy/0.log" Oct 08 22:57:45 crc kubenswrapper[4739]: I1008 22:57:45.248886 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-wz5dg_d0a97575-d460-410d-84aa-887e6d809bba/manager/0.log" Oct 08 22:57:45 crc kubenswrapper[4739]: I1008 22:57:45.386283 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67_c165e9bc-4624-4227-8a87-835cbfe8a970/kube-rbac-proxy/0.log" Oct 08 22:57:45 crc kubenswrapper[4739]: I1008 22:57:45.471056 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757dz6n67_c165e9bc-4624-4227-8a87-835cbfe8a970/manager/0.log" Oct 08 22:57:45 crc kubenswrapper[4739]: I1008 22:57:45.574392 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5cb8b8594d-rkq5g_effe458d-330b-4b50-9a32-bb44bc0008ca/kube-rbac-proxy/0.log" Oct 08 22:57:45 crc kubenswrapper[4739]: I1008 22:57:45.769323 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7fbb97f6f4-lh6lp_f65a9137-93d0-424e-a839-3429f141ffa7/kube-rbac-proxy/0.log" Oct 08 22:57:45 crc kubenswrapper[4739]: I1008 22:57:45.875762 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7fbb97f6f4-lh6lp_f65a9137-93d0-424e-a839-3429f141ffa7/operator/0.log" Oct 08 22:57:45 crc kubenswrapper[4739]: I1008 22:57:45.996609 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-wfp9g_99e801cf-5f88-48ef-8193-516d2cc2bf14/registry-server/0.log" Oct 08 22:57:46 crc kubenswrapper[4739]: I1008 22:57:46.017366 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6f96f8c84-qn2t5_41df7676-d0f5-47a1-a90c-2bc3bc01e18d/kube-rbac-proxy/0.log" Oct 08 22:57:46 crc kubenswrapper[4739]: I1008 22:57:46.237749 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6f96f8c84-qn2t5_41df7676-d0f5-47a1-a90c-2bc3bc01e18d/manager/0.log" Oct 08 22:57:46 crc kubenswrapper[4739]: I1008 22:57:46.243503 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-9vz2h_e51cba46-23fd-4f5f-819d-c2e0ee77a743/kube-rbac-proxy/0.log" Oct 08 22:57:46 crc kubenswrapper[4739]: I1008 22:57:46.281963 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-9vz2h_e51cba46-23fd-4f5f-819d-c2e0ee77a743/manager/0.log" Oct 08 22:57:46 crc kubenswrapper[4739]: I1008 22:57:46.461616 4739 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-4msd5_0c0df1f2-5ae8-40e5-8aa8-893d2e0081cf/operator/0.log" Oct 08 22:57:46 crc kubenswrapper[4739]: I1008 22:57:46.573936 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-7gjhx_dd2b6037-9b5d-47cb-b057-d33546b8e74c/kube-rbac-proxy/0.log" Oct 08 22:57:46 crc kubenswrapper[4739]: I1008 22:57:46.685538 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-7gjhx_dd2b6037-9b5d-47cb-b057-d33546b8e74c/manager/0.log" Oct 08 22:57:46 crc kubenswrapper[4739]: I1008 22:57:46.720677 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-75d7f5797c-czmc9_aff0dabb-b21e-4507-8a13-1d391b8c4f52/kube-rbac-proxy/0.log" Oct 08 22:57:46 crc kubenswrapper[4739]: I1008 22:57:46.789557 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5cb8b8594d-rkq5g_effe458d-330b-4b50-9a32-bb44bc0008ca/manager/0.log" Oct 08 22:57:46 crc kubenswrapper[4739]: I1008 22:57:46.944099 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-74665f6cdc-c6dtc_18bf5f4d-f183-41c2-b1c9-a965baab8f5d/kube-rbac-proxy/0.log" Oct 08 22:57:47 crc kubenswrapper[4739]: I1008 22:57:47.002646 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-75d7f5797c-czmc9_aff0dabb-b21e-4507-8a13-1d391b8c4f52/manager/0.log" Oct 08 22:57:47 crc kubenswrapper[4739]: I1008 22:57:47.033301 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-74665f6cdc-c6dtc_18bf5f4d-f183-41c2-b1c9-a965baab8f5d/manager/0.log" Oct 08 22:57:47 crc 
kubenswrapper[4739]: I1008 22:57:47.166205 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5dd4499c96-l9vl5_04c34d21-ba2d-4418-83e2-ba162c64cc1e/kube-rbac-proxy/0.log" Oct 08 22:57:47 crc kubenswrapper[4739]: I1008 22:57:47.167416 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5dd4499c96-l9vl5_04c34d21-ba2d-4418-83e2-ba162c64cc1e/manager/0.log" Oct 08 22:58:03 crc kubenswrapper[4739]: I1008 22:58:03.883390 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-rb8dx_a0840280-c534-4e58-9095-a87e9acb799a/control-plane-machine-set-operator/0.log" Oct 08 22:58:04 crc kubenswrapper[4739]: I1008 22:58:04.007928 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8679r_abd1c1de-f12b-48d3-9687-54025a7daa56/kube-rbac-proxy/0.log" Oct 08 22:58:04 crc kubenswrapper[4739]: I1008 22:58:04.047727 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8679r_abd1c1de-f12b-48d3-9687-54025a7daa56/machine-api-operator/0.log" Oct 08 22:58:16 crc kubenswrapper[4739]: I1008 22:58:16.765088 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-69d28_1b95f6e0-c1ca-4d45-82df-49d302f081ec/cert-manager-controller/0.log" Oct 08 22:58:16 crc kubenswrapper[4739]: I1008 22:58:16.962129 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-qqgbk_40bc8c1c-ef2f-4374-b80d-f402929336c3/cert-manager-cainjector/0.log" Oct 08 22:58:17 crc kubenswrapper[4739]: I1008 22:58:17.057325 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-hmtjz_551e5827-2d65-48ab-90a0-6e46341e2292/cert-manager-webhook/0.log" Oct 08 22:58:30 crc kubenswrapper[4739]: I1008 22:58:30.408277 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-fcg68_d26121b1-736e-49d5-9241-bd8e8e7706c5/nmstate-console-plugin/0.log" Oct 08 22:58:30 crc kubenswrapper[4739]: I1008 22:58:30.511083 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-lj8qk_64c2a4a6-267c-484e-b36f-95d7540531ef/nmstate-handler/0.log" Oct 08 22:58:30 crc kubenswrapper[4739]: I1008 22:58:30.594649 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-pz8z2_ad883073-96a0-4558-9517-2f59f2e1472e/nmstate-metrics/0.log" Oct 08 22:58:30 crc kubenswrapper[4739]: I1008 22:58:30.595860 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-pz8z2_ad883073-96a0-4558-9517-2f59f2e1472e/kube-rbac-proxy/0.log" Oct 08 22:58:30 crc kubenswrapper[4739]: I1008 22:58:30.789934 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-mt6zh_d8c93273-ced1-4664-9585-47ce49a29326/nmstate-operator/0.log" Oct 08 22:58:30 crc kubenswrapper[4739]: I1008 22:58:30.794502 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-pjkfn_dd0f36c5-926b-4678-b2d0-342a3f2f1d1f/nmstate-webhook/0.log" Oct 08 22:58:43 crc kubenswrapper[4739]: I1008 22:58:43.074110 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-649bd47b54-74dfs_ab6cc895-0aa3-49a5-bec3-38efa4dd348f/kube-rbac-proxy/0.log" Oct 08 22:58:43 crc kubenswrapper[4739]: I1008 22:58:43.187987 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-649bd47b54-74dfs_ab6cc895-0aa3-49a5-bec3-38efa4dd348f/manager/0.log" Oct 08 22:58:54 crc kubenswrapper[4739]: I1008 22:58:54.877221 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-ps6c2_714c1b10-3e7c-4a8a-a346-8e37f9f476e6/kube-rbac-proxy/0.log" Oct 08 22:58:55 crc kubenswrapper[4739]: I1008 22:58:55.102226 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-ps6c2_714c1b10-3e7c-4a8a-a346-8e37f9f476e6/controller/0.log" Oct 08 22:58:55 crc kubenswrapper[4739]: I1008 22:58:55.118640 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/cp-frr-files/0.log" Oct 08 22:58:55 crc kubenswrapper[4739]: I1008 22:58:55.264698 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/cp-reloader/0.log" Oct 08 22:58:55 crc kubenswrapper[4739]: I1008 22:58:55.301117 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/cp-frr-files/0.log" Oct 08 22:58:55 crc kubenswrapper[4739]: I1008 22:58:55.307865 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/cp-reloader/0.log" Oct 08 22:58:55 crc kubenswrapper[4739]: I1008 22:58:55.308975 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/cp-metrics/0.log" Oct 08 22:58:55 crc kubenswrapper[4739]: I1008 22:58:55.651541 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/cp-frr-files/0.log" Oct 08 22:58:55 crc kubenswrapper[4739]: I1008 22:58:55.692404 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/cp-metrics/0.log" Oct 08 22:58:55 crc kubenswrapper[4739]: I1008 22:58:55.729252 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/cp-reloader/0.log" Oct 08 22:58:55 crc kubenswrapper[4739]: I1008 22:58:55.730027 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/cp-metrics/0.log" Oct 08 22:58:55 crc kubenswrapper[4739]: I1008 22:58:55.880745 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/cp-frr-files/0.log" Oct 08 22:58:55 crc kubenswrapper[4739]: I1008 22:58:55.894869 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/cp-reloader/0.log" Oct 08 22:58:55 crc kubenswrapper[4739]: I1008 22:58:55.906314 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/cp-metrics/0.log" Oct 08 22:58:55 crc kubenswrapper[4739]: I1008 22:58:55.938954 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/controller/0.log" Oct 08 22:58:56 crc kubenswrapper[4739]: I1008 22:58:56.096282 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/kube-rbac-proxy/0.log" Oct 08 22:58:56 crc kubenswrapper[4739]: I1008 22:58:56.102969 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/frr-metrics/0.log" Oct 08 22:58:56 crc kubenswrapper[4739]: I1008 22:58:56.149352 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/kube-rbac-proxy-frr/0.log" Oct 08 22:58:56 crc kubenswrapper[4739]: I1008 22:58:56.332271 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/reloader/0.log" Oct 08 22:58:56 crc kubenswrapper[4739]: I1008 22:58:56.351763 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-rmj6d_d4c4cac2-1e41-4504-8620-7ccda1212854/frr-k8s-webhook-server/0.log" Oct 08 22:58:56 crc kubenswrapper[4739]: I1008 22:58:56.620641 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6cb897566c-v8wrp_091a7a04-1c08-4327-8d95-e63d3b526055/manager/0.log" Oct 08 22:58:56 crc kubenswrapper[4739]: I1008 22:58:56.779722 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5944674dc5-rrrsh_1b3a65cd-e578-4a5b-acfe-47ec21816d80/webhook-server/0.log" Oct 08 22:58:56 crc kubenswrapper[4739]: I1008 22:58:56.847213 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-kn5nb_6f0c8acb-ceae-4aea-861e-396755963f03/kube-rbac-proxy/0.log" Oct 08 22:58:57 crc kubenswrapper[4739]: I1008 22:58:57.416338 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-kn5nb_6f0c8acb-ceae-4aea-861e-396755963f03/speaker/0.log" Oct 08 22:58:57 crc kubenswrapper[4739]: I1008 22:58:57.694898 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-phdnx_545c9b85-f531-4665-ba7c-8997de325d62/frr/0.log" Oct 08 22:59:10 crc kubenswrapper[4739]: I1008 22:59:10.878524 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c_ebd8cd89-f73e-48cc-99b9-59f14f0d9d54/util/0.log" Oct 08 22:59:11 crc kubenswrapper[4739]: 
I1008 22:59:11.028843 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c_ebd8cd89-f73e-48cc-99b9-59f14f0d9d54/pull/0.log" Oct 08 22:59:11 crc kubenswrapper[4739]: I1008 22:59:11.051487 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c_ebd8cd89-f73e-48cc-99b9-59f14f0d9d54/util/0.log" Oct 08 22:59:11 crc kubenswrapper[4739]: I1008 22:59:11.061778 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c_ebd8cd89-f73e-48cc-99b9-59f14f0d9d54/pull/0.log" Oct 08 22:59:11 crc kubenswrapper[4739]: I1008 22:59:11.235308 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c_ebd8cd89-f73e-48cc-99b9-59f14f0d9d54/extract/0.log" Oct 08 22:59:11 crc kubenswrapper[4739]: I1008 22:59:11.238173 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c_ebd8cd89-f73e-48cc-99b9-59f14f0d9d54/util/0.log" Oct 08 22:59:11 crc kubenswrapper[4739]: I1008 22:59:11.252649 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694tnb4c_ebd8cd89-f73e-48cc-99b9-59f14f0d9d54/pull/0.log" Oct 08 22:59:12 crc kubenswrapper[4739]: I1008 22:59:12.219108 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x_722cc9ee-6835-4563-a73d-9312179a7901/util/0.log" Oct 08 22:59:12 crc kubenswrapper[4739]: I1008 22:59:12.391740 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x_722cc9ee-6835-4563-a73d-9312179a7901/pull/0.log" Oct 08 22:59:12 crc kubenswrapper[4739]: I1008 22:59:12.401209 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x_722cc9ee-6835-4563-a73d-9312179a7901/pull/0.log" Oct 08 22:59:12 crc kubenswrapper[4739]: I1008 22:59:12.408259 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x_722cc9ee-6835-4563-a73d-9312179a7901/util/0.log" Oct 08 22:59:12 crc kubenswrapper[4739]: I1008 22:59:12.601636 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x_722cc9ee-6835-4563-a73d-9312179a7901/util/0.log" Oct 08 22:59:12 crc kubenswrapper[4739]: I1008 22:59:12.628248 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x_722cc9ee-6835-4563-a73d-9312179a7901/pull/0.log" Oct 08 22:59:12 crc kubenswrapper[4739]: I1008 22:59:12.650555 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tw94x_722cc9ee-6835-4563-a73d-9312179a7901/extract/0.log" Oct 08 22:59:12 crc kubenswrapper[4739]: I1008 22:59:12.783988 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8_ca4c31bf-494f-4278-97be-ef83f58c5c1b/util/0.log" Oct 08 22:59:12 crc kubenswrapper[4739]: I1008 22:59:12.972485 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8_ca4c31bf-494f-4278-97be-ef83f58c5c1b/util/0.log" Oct 08 
22:59:12 crc kubenswrapper[4739]: I1008 22:59:12.996100 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8_ca4c31bf-494f-4278-97be-ef83f58c5c1b/pull/0.log" Oct 08 22:59:13 crc kubenswrapper[4739]: I1008 22:59:13.019265 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8_ca4c31bf-494f-4278-97be-ef83f58c5c1b/pull/0.log" Oct 08 22:59:13 crc kubenswrapper[4739]: I1008 22:59:13.158011 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8_ca4c31bf-494f-4278-97be-ef83f58c5c1b/util/0.log" Oct 08 22:59:13 crc kubenswrapper[4739]: I1008 22:59:13.161450 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8_ca4c31bf-494f-4278-97be-ef83f58c5c1b/pull/0.log" Oct 08 22:59:13 crc kubenswrapper[4739]: I1008 22:59:13.206777 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24lmr8_ca4c31bf-494f-4278-97be-ef83f58c5c1b/extract/0.log" Oct 08 22:59:13 crc kubenswrapper[4739]: I1008 22:59:13.341469 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf_174d772a-ebc6-46bf-ab5f-02cdc6564283/util/0.log" Oct 08 22:59:13 crc kubenswrapper[4739]: I1008 22:59:13.483623 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf_174d772a-ebc6-46bf-ab5f-02cdc6564283/pull/0.log" Oct 08 22:59:13 crc kubenswrapper[4739]: I1008 22:59:13.487483 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf_174d772a-ebc6-46bf-ab5f-02cdc6564283/util/0.log" Oct 08 22:59:13 crc kubenswrapper[4739]: I1008 22:59:13.516078 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf_174d772a-ebc6-46bf-ab5f-02cdc6564283/pull/0.log" Oct 08 22:59:13 crc kubenswrapper[4739]: I1008 22:59:13.690611 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf_174d772a-ebc6-46bf-ab5f-02cdc6564283/pull/0.log" Oct 08 22:59:13 crc kubenswrapper[4739]: I1008 22:59:13.697898 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf_174d772a-ebc6-46bf-ab5f-02cdc6564283/util/0.log" Oct 08 22:59:13 crc kubenswrapper[4739]: I1008 22:59:13.778164 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qlrzs_9243f10e-b903-4d49-9ef7-d447cf6459fd/extract-utilities/0.log" Oct 08 22:59:13 crc kubenswrapper[4739]: I1008 22:59:13.779798 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5zjvf_174d772a-ebc6-46bf-ab5f-02cdc6564283/extract/0.log" Oct 08 22:59:13 crc kubenswrapper[4739]: I1008 22:59:13.913952 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qlrzs_9243f10e-b903-4d49-9ef7-d447cf6459fd/extract-utilities/0.log" Oct 08 22:59:13 crc kubenswrapper[4739]: I1008 22:59:13.923541 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qlrzs_9243f10e-b903-4d49-9ef7-d447cf6459fd/extract-content/0.log" Oct 08 22:59:13 crc kubenswrapper[4739]: I1008 22:59:13.962938 4739 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qlrzs_9243f10e-b903-4d49-9ef7-d447cf6459fd/extract-content/0.log" Oct 08 22:59:14 crc kubenswrapper[4739]: I1008 22:59:14.165643 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qlrzs_9243f10e-b903-4d49-9ef7-d447cf6459fd/extract-content/0.log" Oct 08 22:59:14 crc kubenswrapper[4739]: I1008 22:59:14.195370 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qlrzs_9243f10e-b903-4d49-9ef7-d447cf6459fd/extract-utilities/0.log" Oct 08 22:59:14 crc kubenswrapper[4739]: I1008 22:59:14.239210 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mmlrg_cf066fed-0185-4563-9992-0474c1761110/extract-utilities/0.log" Oct 08 22:59:14 crc kubenswrapper[4739]: I1008 22:59:14.391410 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mmlrg_cf066fed-0185-4563-9992-0474c1761110/extract-utilities/0.log" Oct 08 22:59:14 crc kubenswrapper[4739]: I1008 22:59:14.453129 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mmlrg_cf066fed-0185-4563-9992-0474c1761110/extract-content/0.log" Oct 08 22:59:14 crc kubenswrapper[4739]: I1008 22:59:14.505761 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mmlrg_cf066fed-0185-4563-9992-0474c1761110/extract-content/0.log" Oct 08 22:59:14 crc kubenswrapper[4739]: I1008 22:59:14.649320 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mmlrg_cf066fed-0185-4563-9992-0474c1761110/extract-utilities/0.log" Oct 08 22:59:14 crc kubenswrapper[4739]: I1008 22:59:14.700063 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-mmlrg_cf066fed-0185-4563-9992-0474c1761110/extract-content/0.log" Oct 08 22:59:14 crc kubenswrapper[4739]: I1008 22:59:14.812261 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qlrzs_9243f10e-b903-4d49-9ef7-d447cf6459fd/registry-server/0.log" Oct 08 22:59:14 crc kubenswrapper[4739]: I1008 22:59:14.925500 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2_491673c0-7c8a-4f56-95c4-c06e79a87512/util/0.log" Oct 08 22:59:15 crc kubenswrapper[4739]: I1008 22:59:15.101682 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2_491673c0-7c8a-4f56-95c4-c06e79a87512/pull/0.log" Oct 08 22:59:15 crc kubenswrapper[4739]: I1008 22:59:15.103529 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2_491673c0-7c8a-4f56-95c4-c06e79a87512/pull/0.log" Oct 08 22:59:15 crc kubenswrapper[4739]: I1008 22:59:15.104932 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2_491673c0-7c8a-4f56-95c4-c06e79a87512/util/0.log" Oct 08 22:59:15 crc kubenswrapper[4739]: I1008 22:59:15.304531 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2_491673c0-7c8a-4f56-95c4-c06e79a87512/util/0.log" Oct 08 22:59:15 crc kubenswrapper[4739]: I1008 22:59:15.359349 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2_491673c0-7c8a-4f56-95c4-c06e79a87512/extract/0.log" Oct 08 22:59:15 crc kubenswrapper[4739]: I1008 
22:59:15.364750 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5nhc2_491673c0-7c8a-4f56-95c4-c06e79a87512/pull/0.log" Oct 08 22:59:15 crc kubenswrapper[4739]: I1008 22:59:15.437644 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mmlrg_cf066fed-0185-4563-9992-0474c1761110/registry-server/0.log" Oct 08 22:59:15 crc kubenswrapper[4739]: I1008 22:59:15.541805 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-x54q2_6122ee5d-8e01-4fb7-b6bf-fc6d4ebcf669/marketplace-operator/0.log" Oct 08 22:59:15 crc kubenswrapper[4739]: I1008 22:59:15.644546 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lbrj8_0921b50a-3ca2-4f07-a060-63d6078eac48/extract-utilities/0.log" Oct 08 22:59:15 crc kubenswrapper[4739]: I1008 22:59:15.725447 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lbrj8_0921b50a-3ca2-4f07-a060-63d6078eac48/extract-utilities/0.log" Oct 08 22:59:15 crc kubenswrapper[4739]: I1008 22:59:15.757535 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lbrj8_0921b50a-3ca2-4f07-a060-63d6078eac48/extract-content/0.log" Oct 08 22:59:15 crc kubenswrapper[4739]: I1008 22:59:15.770211 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lbrj8_0921b50a-3ca2-4f07-a060-63d6078eac48/extract-content/0.log" Oct 08 22:59:15 crc kubenswrapper[4739]: I1008 22:59:15.962258 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lbrj8_0921b50a-3ca2-4f07-a060-63d6078eac48/extract-utilities/0.log" Oct 08 22:59:15 crc kubenswrapper[4739]: I1008 22:59:15.962888 4739 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-lbrj8_0921b50a-3ca2-4f07-a060-63d6078eac48/extract-content/0.log" Oct 08 22:59:16 crc kubenswrapper[4739]: I1008 22:59:16.044654 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jhjjp_e4a5ca24-fef9-4cca-99c2-eb2c255ee795/extract-utilities/0.log" Oct 08 22:59:16 crc kubenswrapper[4739]: I1008 22:59:16.085367 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lbrj8_0921b50a-3ca2-4f07-a060-63d6078eac48/registry-server/0.log" Oct 08 22:59:16 crc kubenswrapper[4739]: I1008 22:59:16.177453 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jhjjp_e4a5ca24-fef9-4cca-99c2-eb2c255ee795/extract-utilities/0.log" Oct 08 22:59:16 crc kubenswrapper[4739]: I1008 22:59:16.213282 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jhjjp_e4a5ca24-fef9-4cca-99c2-eb2c255ee795/extract-content/0.log" Oct 08 22:59:16 crc kubenswrapper[4739]: I1008 22:59:16.214653 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jhjjp_e4a5ca24-fef9-4cca-99c2-eb2c255ee795/extract-content/0.log" Oct 08 22:59:16 crc kubenswrapper[4739]: I1008 22:59:16.375124 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jhjjp_e4a5ca24-fef9-4cca-99c2-eb2c255ee795/extract-content/0.log" Oct 08 22:59:16 crc kubenswrapper[4739]: I1008 22:59:16.384297 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jhjjp_e4a5ca24-fef9-4cca-99c2-eb2c255ee795/extract-utilities/0.log" Oct 08 22:59:16 crc kubenswrapper[4739]: I1008 22:59:16.907312 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jhjjp_e4a5ca24-fef9-4cca-99c2-eb2c255ee795/registry-server/0.log" Oct 08 
22:59:29 crc kubenswrapper[4739]: I1008 22:59:29.019062 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-7fkfg_bbafdc6e-b606-4274-aebb-eb1d38bf693e/prometheus-operator/0.log" Oct 08 22:59:29 crc kubenswrapper[4739]: I1008 22:59:29.127846 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-64c8f55db4-5gwgs_711de91c-2cc4-4161-ac30-0de8e68283d5/prometheus-operator-admission-webhook/0.log" Oct 08 22:59:29 crc kubenswrapper[4739]: I1008 22:59:29.189851 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-64c8f55db4-wpszd_80783ef0-8acc-4d99-bc23-6c6c7fbf2ee2/prometheus-operator-admission-webhook/0.log" Oct 08 22:59:29 crc kubenswrapper[4739]: I1008 22:59:29.305025 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-6kpbp_4c13b744-b744-49f3-8ba5-241ab69fdab9/operator/0.log" Oct 08 22:59:29 crc kubenswrapper[4739]: I1008 22:59:29.349824 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-bpxcp_5c19c761-fcc4-474d-9d87-7c2e07755190/perses-operator/0.log" Oct 08 22:59:43 crc kubenswrapper[4739]: I1008 22:59:43.249561 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-649bd47b54-74dfs_ab6cc895-0aa3-49a5-bec3-38efa4dd348f/kube-rbac-proxy/0.log" Oct 08 22:59:43 crc kubenswrapper[4739]: I1008 22:59:43.434525 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-649bd47b54-74dfs_ab6cc895-0aa3-49a5-bec3-38efa4dd348f/manager/0.log" Oct 08 22:59:51 crc kubenswrapper[4739]: I1008 22:59:51.766704 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:59:51 crc kubenswrapper[4739]: I1008 22:59:51.767303 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 23:00:00 crc kubenswrapper[4739]: I1008 23:00:00.144742 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332740-qgddr"] Oct 08 23:00:00 crc kubenswrapper[4739]: E1008 23:00:00.145646 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="091d8ecb-934d-4c4c-ae6c-5ab8da44d75b" containerName="extract-content" Oct 08 23:00:00 crc kubenswrapper[4739]: I1008 23:00:00.145663 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="091d8ecb-934d-4c4c-ae6c-5ab8da44d75b" containerName="extract-content" Oct 08 23:00:00 crc kubenswrapper[4739]: E1008 23:00:00.145691 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38b98de0-5ba8-4c4e-aca5-538d9ca1ad95" containerName="container-00" Oct 08 23:00:00 crc kubenswrapper[4739]: I1008 23:00:00.145701 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="38b98de0-5ba8-4c4e-aca5-538d9ca1ad95" containerName="container-00" Oct 08 23:00:00 crc kubenswrapper[4739]: E1008 23:00:00.145718 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="091d8ecb-934d-4c4c-ae6c-5ab8da44d75b" containerName="extract-utilities" Oct 08 23:00:00 crc kubenswrapper[4739]: I1008 23:00:00.145726 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="091d8ecb-934d-4c4c-ae6c-5ab8da44d75b" containerName="extract-utilities" Oct 08 23:00:00 crc 
kubenswrapper[4739]: E1008 23:00:00.145746 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="091d8ecb-934d-4c4c-ae6c-5ab8da44d75b" containerName="registry-server" Oct 08 23:00:00 crc kubenswrapper[4739]: I1008 23:00:00.145753 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="091d8ecb-934d-4c4c-ae6c-5ab8da44d75b" containerName="registry-server" Oct 08 23:00:00 crc kubenswrapper[4739]: I1008 23:00:00.146011 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="091d8ecb-934d-4c4c-ae6c-5ab8da44d75b" containerName="registry-server" Oct 08 23:00:00 crc kubenswrapper[4739]: I1008 23:00:00.146037 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="38b98de0-5ba8-4c4e-aca5-538d9ca1ad95" containerName="container-00" Oct 08 23:00:00 crc kubenswrapper[4739]: I1008 23:00:00.147086 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-qgddr" Oct 08 23:00:00 crc kubenswrapper[4739]: I1008 23:00:00.150365 4739 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 08 23:00:00 crc kubenswrapper[4739]: I1008 23:00:00.151023 4739 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 08 23:00:00 crc kubenswrapper[4739]: I1008 23:00:00.199315 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332740-qgddr"] Oct 08 23:00:00 crc kubenswrapper[4739]: I1008 23:00:00.352896 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nmq7\" (UniqueName: \"kubernetes.io/projected/e3b0d429-eb9b-46fe-a592-d73d84a1a9df-kube-api-access-5nmq7\") pod \"collect-profiles-29332740-qgddr\" (UID: \"e3b0d429-eb9b-46fe-a592-d73d84a1a9df\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-qgddr" Oct 08 23:00:00 crc kubenswrapper[4739]: I1008 23:00:00.353267 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3b0d429-eb9b-46fe-a592-d73d84a1a9df-secret-volume\") pod \"collect-profiles-29332740-qgddr\" (UID: \"e3b0d429-eb9b-46fe-a592-d73d84a1a9df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-qgddr" Oct 08 23:00:00 crc kubenswrapper[4739]: I1008 23:00:00.353538 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3b0d429-eb9b-46fe-a592-d73d84a1a9df-config-volume\") pod \"collect-profiles-29332740-qgddr\" (UID: \"e3b0d429-eb9b-46fe-a592-d73d84a1a9df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-qgddr" Oct 08 23:00:00 crc kubenswrapper[4739]: I1008 23:00:00.455304 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3b0d429-eb9b-46fe-a592-d73d84a1a9df-config-volume\") pod \"collect-profiles-29332740-qgddr\" (UID: \"e3b0d429-eb9b-46fe-a592-d73d84a1a9df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-qgddr" Oct 08 23:00:00 crc kubenswrapper[4739]: I1008 23:00:00.455982 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nmq7\" (UniqueName: \"kubernetes.io/projected/e3b0d429-eb9b-46fe-a592-d73d84a1a9df-kube-api-access-5nmq7\") pod \"collect-profiles-29332740-qgddr\" (UID: \"e3b0d429-eb9b-46fe-a592-d73d84a1a9df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-qgddr" Oct 08 23:00:00 crc kubenswrapper[4739]: I1008 23:00:00.456103 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e3b0d429-eb9b-46fe-a592-d73d84a1a9df-secret-volume\") pod \"collect-profiles-29332740-qgddr\" (UID: \"e3b0d429-eb9b-46fe-a592-d73d84a1a9df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-qgddr" Oct 08 23:00:00 crc kubenswrapper[4739]: I1008 23:00:00.456619 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3b0d429-eb9b-46fe-a592-d73d84a1a9df-config-volume\") pod \"collect-profiles-29332740-qgddr\" (UID: \"e3b0d429-eb9b-46fe-a592-d73d84a1a9df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-qgddr" Oct 08 23:00:00 crc kubenswrapper[4739]: I1008 23:00:00.473259 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3b0d429-eb9b-46fe-a592-d73d84a1a9df-secret-volume\") pod \"collect-profiles-29332740-qgddr\" (UID: \"e3b0d429-eb9b-46fe-a592-d73d84a1a9df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-qgddr" Oct 08 23:00:00 crc kubenswrapper[4739]: I1008 23:00:00.489587 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nmq7\" (UniqueName: \"kubernetes.io/projected/e3b0d429-eb9b-46fe-a592-d73d84a1a9df-kube-api-access-5nmq7\") pod \"collect-profiles-29332740-qgddr\" (UID: \"e3b0d429-eb9b-46fe-a592-d73d84a1a9df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-qgddr" Oct 08 23:00:00 crc kubenswrapper[4739]: I1008 23:00:00.770485 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-qgddr" Oct 08 23:00:01 crc kubenswrapper[4739]: I1008 23:00:01.388516 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332740-qgddr"] Oct 08 23:00:01 crc kubenswrapper[4739]: I1008 23:00:01.453895 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-qgddr" event={"ID":"e3b0d429-eb9b-46fe-a592-d73d84a1a9df","Type":"ContainerStarted","Data":"e949aebda20d94ba94a59a5b6335bcf0d2eaf8d348105c2195a3f8ed98ecb49d"} Oct 08 23:00:02 crc kubenswrapper[4739]: I1008 23:00:02.471229 4739 generic.go:334] "Generic (PLEG): container finished" podID="e3b0d429-eb9b-46fe-a592-d73d84a1a9df" containerID="a2a27727928074c353ea824da6079e211b7a9a8904d7df15daeeb02bb81b20cf" exitCode=0 Oct 08 23:00:02 crc kubenswrapper[4739]: I1008 23:00:02.471446 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-qgddr" event={"ID":"e3b0d429-eb9b-46fe-a592-d73d84a1a9df","Type":"ContainerDied","Data":"a2a27727928074c353ea824da6079e211b7a9a8904d7df15daeeb02bb81b20cf"} Oct 08 23:00:04 crc kubenswrapper[4739]: I1008 23:00:04.061524 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-qgddr" Oct 08 23:00:04 crc kubenswrapper[4739]: I1008 23:00:04.246755 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nmq7\" (UniqueName: \"kubernetes.io/projected/e3b0d429-eb9b-46fe-a592-d73d84a1a9df-kube-api-access-5nmq7\") pod \"e3b0d429-eb9b-46fe-a592-d73d84a1a9df\" (UID: \"e3b0d429-eb9b-46fe-a592-d73d84a1a9df\") " Oct 08 23:00:04 crc kubenswrapper[4739]: I1008 23:00:04.247060 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3b0d429-eb9b-46fe-a592-d73d84a1a9df-config-volume\") pod \"e3b0d429-eb9b-46fe-a592-d73d84a1a9df\" (UID: \"e3b0d429-eb9b-46fe-a592-d73d84a1a9df\") " Oct 08 23:00:04 crc kubenswrapper[4739]: I1008 23:00:04.247117 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3b0d429-eb9b-46fe-a592-d73d84a1a9df-secret-volume\") pod \"e3b0d429-eb9b-46fe-a592-d73d84a1a9df\" (UID: \"e3b0d429-eb9b-46fe-a592-d73d84a1a9df\") " Oct 08 23:00:04 crc kubenswrapper[4739]: I1008 23:00:04.247511 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3b0d429-eb9b-46fe-a592-d73d84a1a9df-config-volume" (OuterVolumeSpecName: "config-volume") pod "e3b0d429-eb9b-46fe-a592-d73d84a1a9df" (UID: "e3b0d429-eb9b-46fe-a592-d73d84a1a9df"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 23:00:04 crc kubenswrapper[4739]: I1008 23:00:04.252351 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3b0d429-eb9b-46fe-a592-d73d84a1a9df-kube-api-access-5nmq7" (OuterVolumeSpecName: "kube-api-access-5nmq7") pod "e3b0d429-eb9b-46fe-a592-d73d84a1a9df" (UID: "e3b0d429-eb9b-46fe-a592-d73d84a1a9df"). 
InnerVolumeSpecName "kube-api-access-5nmq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 23:00:04 crc kubenswrapper[4739]: I1008 23:00:04.253244 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3b0d429-eb9b-46fe-a592-d73d84a1a9df-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e3b0d429-eb9b-46fe-a592-d73d84a1a9df" (UID: "e3b0d429-eb9b-46fe-a592-d73d84a1a9df"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 23:00:04 crc kubenswrapper[4739]: I1008 23:00:04.348175 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nmq7\" (UniqueName: \"kubernetes.io/projected/e3b0d429-eb9b-46fe-a592-d73d84a1a9df-kube-api-access-5nmq7\") on node \"crc\" DevicePath \"\"" Oct 08 23:00:04 crc kubenswrapper[4739]: I1008 23:00:04.348212 4739 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3b0d429-eb9b-46fe-a592-d73d84a1a9df-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 23:00:04 crc kubenswrapper[4739]: I1008 23:00:04.348221 4739 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3b0d429-eb9b-46fe-a592-d73d84a1a9df-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 23:00:04 crc kubenswrapper[4739]: I1008 23:00:04.487897 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-qgddr" event={"ID":"e3b0d429-eb9b-46fe-a592-d73d84a1a9df","Type":"ContainerDied","Data":"e949aebda20d94ba94a59a5b6335bcf0d2eaf8d348105c2195a3f8ed98ecb49d"} Oct 08 23:00:04 crc kubenswrapper[4739]: I1008 23:00:04.487937 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e949aebda20d94ba94a59a5b6335bcf0d2eaf8d348105c2195a3f8ed98ecb49d" Oct 08 23:00:04 crc kubenswrapper[4739]: I1008 23:00:04.488009 4739 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-qgddr" Oct 08 23:00:05 crc kubenswrapper[4739]: I1008 23:00:05.162319 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332695-7wxxq"] Oct 08 23:00:05 crc kubenswrapper[4739]: I1008 23:00:05.171953 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332695-7wxxq"] Oct 08 23:00:05 crc kubenswrapper[4739]: I1008 23:00:05.833529 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c83ca3a-410d-4587-a721-2642fa984c8b" path="/var/lib/kubelet/pods/2c83ca3a-410d-4587-a721-2642fa984c8b/volumes" Oct 08 23:00:21 crc kubenswrapper[4739]: I1008 23:00:21.765679 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 23:00:21 crc kubenswrapper[4739]: I1008 23:00:21.766376 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 23:00:34 crc kubenswrapper[4739]: I1008 23:00:34.943705 4739 scope.go:117] "RemoveContainer" containerID="26c8a09d0824092f6762f38092a887c1d0cdbfc55c5fa6bf5c8f8679e3431ac6" Oct 08 23:00:51 crc kubenswrapper[4739]: I1008 23:00:51.766593 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 23:00:51 crc kubenswrapper[4739]: I1008 23:00:51.767308 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 23:00:51 crc kubenswrapper[4739]: I1008 23:00:51.767365 4739 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" Oct 08 23:00:51 crc kubenswrapper[4739]: I1008 23:00:51.768403 4739 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"61ebb047d55ba2ada03295b962b140741fadb555af57a367f41a3bf5f65633ac"} pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 23:00:51 crc kubenswrapper[4739]: I1008 23:00:51.768470 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" containerID="cri-o://61ebb047d55ba2ada03295b962b140741fadb555af57a367f41a3bf5f65633ac" gracePeriod=600 Oct 08 23:00:52 crc kubenswrapper[4739]: I1008 23:00:52.017023 4739 generic.go:334] "Generic (PLEG): container finished" podID="9707b708-016c-4e06-86db-0332e2ca37db" containerID="61ebb047d55ba2ada03295b962b140741fadb555af57a367f41a3bf5f65633ac" exitCode=0 Oct 08 23:00:52 crc kubenswrapper[4739]: I1008 23:00:52.017088 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" 
event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerDied","Data":"61ebb047d55ba2ada03295b962b140741fadb555af57a367f41a3bf5f65633ac"} Oct 08 23:00:52 crc kubenswrapper[4739]: I1008 23:00:52.017126 4739 scope.go:117] "RemoveContainer" containerID="798afc98cb7bcde18fde2f74c09601ba5107495b2a2a7e64ab302b80311e0904" Oct 08 23:00:53 crc kubenswrapper[4739]: I1008 23:00:53.032279 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" event={"ID":"9707b708-016c-4e06-86db-0332e2ca37db","Type":"ContainerStarted","Data":"656f34ce02f3d867f9c0e0b2dce0e7f48055d989c3a87997a8794cb64f57e040"} Oct 08 23:01:00 crc kubenswrapper[4739]: I1008 23:01:00.143691 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29332741-86l92"] Oct 08 23:01:00 crc kubenswrapper[4739]: E1008 23:01:00.144445 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b0d429-eb9b-46fe-a592-d73d84a1a9df" containerName="collect-profiles" Oct 08 23:01:00 crc kubenswrapper[4739]: I1008 23:01:00.144458 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b0d429-eb9b-46fe-a592-d73d84a1a9df" containerName="collect-profiles" Oct 08 23:01:00 crc kubenswrapper[4739]: I1008 23:01:00.144686 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3b0d429-eb9b-46fe-a592-d73d84a1a9df" containerName="collect-profiles" Oct 08 23:01:00 crc kubenswrapper[4739]: I1008 23:01:00.145657 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29332741-86l92" Oct 08 23:01:00 crc kubenswrapper[4739]: I1008 23:01:00.160306 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29332741-86l92"] Oct 08 23:01:00 crc kubenswrapper[4739]: I1008 23:01:00.254984 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f2f8012-79a5-42e1-b3a8-8c579f108f3f-combined-ca-bundle\") pod \"keystone-cron-29332741-86l92\" (UID: \"0f2f8012-79a5-42e1-b3a8-8c579f108f3f\") " pod="openstack/keystone-cron-29332741-86l92" Oct 08 23:01:00 crc kubenswrapper[4739]: I1008 23:01:00.255426 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2czxp\" (UniqueName: \"kubernetes.io/projected/0f2f8012-79a5-42e1-b3a8-8c579f108f3f-kube-api-access-2czxp\") pod \"keystone-cron-29332741-86l92\" (UID: \"0f2f8012-79a5-42e1-b3a8-8c579f108f3f\") " pod="openstack/keystone-cron-29332741-86l92" Oct 08 23:01:00 crc kubenswrapper[4739]: I1008 23:01:00.255498 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0f2f8012-79a5-42e1-b3a8-8c579f108f3f-fernet-keys\") pod \"keystone-cron-29332741-86l92\" (UID: \"0f2f8012-79a5-42e1-b3a8-8c579f108f3f\") " pod="openstack/keystone-cron-29332741-86l92" Oct 08 23:01:00 crc kubenswrapper[4739]: I1008 23:01:00.255537 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f2f8012-79a5-42e1-b3a8-8c579f108f3f-config-data\") pod \"keystone-cron-29332741-86l92\" (UID: \"0f2f8012-79a5-42e1-b3a8-8c579f108f3f\") " pod="openstack/keystone-cron-29332741-86l92" Oct 08 23:01:00 crc kubenswrapper[4739]: I1008 23:01:00.357847 4739 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f2f8012-79a5-42e1-b3a8-8c579f108f3f-config-data\") pod \"keystone-cron-29332741-86l92\" (UID: \"0f2f8012-79a5-42e1-b3a8-8c579f108f3f\") " pod="openstack/keystone-cron-29332741-86l92" Oct 08 23:01:00 crc kubenswrapper[4739]: I1008 23:01:00.358008 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f2f8012-79a5-42e1-b3a8-8c579f108f3f-combined-ca-bundle\") pod \"keystone-cron-29332741-86l92\" (UID: \"0f2f8012-79a5-42e1-b3a8-8c579f108f3f\") " pod="openstack/keystone-cron-29332741-86l92" Oct 08 23:01:00 crc kubenswrapper[4739]: I1008 23:01:00.358062 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2czxp\" (UniqueName: \"kubernetes.io/projected/0f2f8012-79a5-42e1-b3a8-8c579f108f3f-kube-api-access-2czxp\") pod \"keystone-cron-29332741-86l92\" (UID: \"0f2f8012-79a5-42e1-b3a8-8c579f108f3f\") " pod="openstack/keystone-cron-29332741-86l92" Oct 08 23:01:00 crc kubenswrapper[4739]: I1008 23:01:00.358104 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0f2f8012-79a5-42e1-b3a8-8c579f108f3f-fernet-keys\") pod \"keystone-cron-29332741-86l92\" (UID: \"0f2f8012-79a5-42e1-b3a8-8c579f108f3f\") " pod="openstack/keystone-cron-29332741-86l92" Oct 08 23:01:00 crc kubenswrapper[4739]: I1008 23:01:00.365069 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f2f8012-79a5-42e1-b3a8-8c579f108f3f-config-data\") pod \"keystone-cron-29332741-86l92\" (UID: \"0f2f8012-79a5-42e1-b3a8-8c579f108f3f\") " pod="openstack/keystone-cron-29332741-86l92" Oct 08 23:01:00 crc kubenswrapper[4739]: I1008 23:01:00.366230 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/0f2f8012-79a5-42e1-b3a8-8c579f108f3f-fernet-keys\") pod \"keystone-cron-29332741-86l92\" (UID: \"0f2f8012-79a5-42e1-b3a8-8c579f108f3f\") " pod="openstack/keystone-cron-29332741-86l92" Oct 08 23:01:00 crc kubenswrapper[4739]: I1008 23:01:00.370042 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f2f8012-79a5-42e1-b3a8-8c579f108f3f-combined-ca-bundle\") pod \"keystone-cron-29332741-86l92\" (UID: \"0f2f8012-79a5-42e1-b3a8-8c579f108f3f\") " pod="openstack/keystone-cron-29332741-86l92" Oct 08 23:01:00 crc kubenswrapper[4739]: I1008 23:01:00.376506 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2czxp\" (UniqueName: \"kubernetes.io/projected/0f2f8012-79a5-42e1-b3a8-8c579f108f3f-kube-api-access-2czxp\") pod \"keystone-cron-29332741-86l92\" (UID: \"0f2f8012-79a5-42e1-b3a8-8c579f108f3f\") " pod="openstack/keystone-cron-29332741-86l92" Oct 08 23:01:00 crc kubenswrapper[4739]: I1008 23:01:00.479917 4739 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29332741-86l92" Oct 08 23:01:00 crc kubenswrapper[4739]: I1008 23:01:00.968666 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29332741-86l92"] Oct 08 23:01:01 crc kubenswrapper[4739]: I1008 23:01:01.124670 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29332741-86l92" event={"ID":"0f2f8012-79a5-42e1-b3a8-8c579f108f3f","Type":"ContainerStarted","Data":"b3bf5dcfedd4c9dec8898ec762a43ae65848eeb415cef1f252504e16d6dee6cb"} Oct 08 23:01:02 crc kubenswrapper[4739]: I1008 23:01:02.140350 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29332741-86l92" event={"ID":"0f2f8012-79a5-42e1-b3a8-8c579f108f3f","Type":"ContainerStarted","Data":"fd5e07d97a602d1d1ac63b0aabea818ac3c3049f900aebcc29617c60f552e25e"} Oct 08 23:01:02 crc kubenswrapper[4739]: I1008 23:01:02.165589 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29332741-86l92" podStartSLOduration=2.165567054 podStartE2EDuration="2.165567054s" podCreationTimestamp="2025-10-08 23:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 23:01:02.161013493 +0000 UTC m=+4361.986399243" watchObservedRunningTime="2025-10-08 23:01:02.165567054 +0000 UTC m=+4361.990952804" Oct 08 23:01:05 crc kubenswrapper[4739]: I1008 23:01:05.170788 4739 generic.go:334] "Generic (PLEG): container finished" podID="0f2f8012-79a5-42e1-b3a8-8c579f108f3f" containerID="fd5e07d97a602d1d1ac63b0aabea818ac3c3049f900aebcc29617c60f552e25e" exitCode=0 Oct 08 23:01:05 crc kubenswrapper[4739]: I1008 23:01:05.171560 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29332741-86l92" 
event={"ID":"0f2f8012-79a5-42e1-b3a8-8c579f108f3f","Type":"ContainerDied","Data":"fd5e07d97a602d1d1ac63b0aabea818ac3c3049f900aebcc29617c60f552e25e"} Oct 08 23:01:06 crc kubenswrapper[4739]: I1008 23:01:06.658573 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29332741-86l92" Oct 08 23:01:06 crc kubenswrapper[4739]: I1008 23:01:06.790739 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2czxp\" (UniqueName: \"kubernetes.io/projected/0f2f8012-79a5-42e1-b3a8-8c579f108f3f-kube-api-access-2czxp\") pod \"0f2f8012-79a5-42e1-b3a8-8c579f108f3f\" (UID: \"0f2f8012-79a5-42e1-b3a8-8c579f108f3f\") " Oct 08 23:01:06 crc kubenswrapper[4739]: I1008 23:01:06.790784 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0f2f8012-79a5-42e1-b3a8-8c579f108f3f-fernet-keys\") pod \"0f2f8012-79a5-42e1-b3a8-8c579f108f3f\" (UID: \"0f2f8012-79a5-42e1-b3a8-8c579f108f3f\") " Oct 08 23:01:06 crc kubenswrapper[4739]: I1008 23:01:06.790953 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f2f8012-79a5-42e1-b3a8-8c579f108f3f-config-data\") pod \"0f2f8012-79a5-42e1-b3a8-8c579f108f3f\" (UID: \"0f2f8012-79a5-42e1-b3a8-8c579f108f3f\") " Oct 08 23:01:06 crc kubenswrapper[4739]: I1008 23:01:06.791084 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f2f8012-79a5-42e1-b3a8-8c579f108f3f-combined-ca-bundle\") pod \"0f2f8012-79a5-42e1-b3a8-8c579f108f3f\" (UID: \"0f2f8012-79a5-42e1-b3a8-8c579f108f3f\") " Oct 08 23:01:06 crc kubenswrapper[4739]: I1008 23:01:06.796474 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f2f8012-79a5-42e1-b3a8-8c579f108f3f-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "0f2f8012-79a5-42e1-b3a8-8c579f108f3f" (UID: "0f2f8012-79a5-42e1-b3a8-8c579f108f3f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 23:01:06 crc kubenswrapper[4739]: I1008 23:01:06.797026 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f2f8012-79a5-42e1-b3a8-8c579f108f3f-kube-api-access-2czxp" (OuterVolumeSpecName: "kube-api-access-2czxp") pod "0f2f8012-79a5-42e1-b3a8-8c579f108f3f" (UID: "0f2f8012-79a5-42e1-b3a8-8c579f108f3f"). InnerVolumeSpecName "kube-api-access-2czxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 23:01:06 crc kubenswrapper[4739]: I1008 23:01:06.823085 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f2f8012-79a5-42e1-b3a8-8c579f108f3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f2f8012-79a5-42e1-b3a8-8c579f108f3f" (UID: "0f2f8012-79a5-42e1-b3a8-8c579f108f3f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 23:01:06 crc kubenswrapper[4739]: I1008 23:01:06.855402 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f2f8012-79a5-42e1-b3a8-8c579f108f3f-config-data" (OuterVolumeSpecName: "config-data") pod "0f2f8012-79a5-42e1-b3a8-8c579f108f3f" (UID: "0f2f8012-79a5-42e1-b3a8-8c579f108f3f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 23:01:06 crc kubenswrapper[4739]: I1008 23:01:06.892982 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2czxp\" (UniqueName: \"kubernetes.io/projected/0f2f8012-79a5-42e1-b3a8-8c579f108f3f-kube-api-access-2czxp\") on node \"crc\" DevicePath \"\"" Oct 08 23:01:06 crc kubenswrapper[4739]: I1008 23:01:06.893014 4739 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0f2f8012-79a5-42e1-b3a8-8c579f108f3f-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 08 23:01:06 crc kubenswrapper[4739]: I1008 23:01:06.893024 4739 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f2f8012-79a5-42e1-b3a8-8c579f108f3f-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 23:01:06 crc kubenswrapper[4739]: I1008 23:01:06.893032 4739 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f2f8012-79a5-42e1-b3a8-8c579f108f3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 23:01:07 crc kubenswrapper[4739]: I1008 23:01:07.190707 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29332741-86l92" event={"ID":"0f2f8012-79a5-42e1-b3a8-8c579f108f3f","Type":"ContainerDied","Data":"b3bf5dcfedd4c9dec8898ec762a43ae65848eeb415cef1f252504e16d6dee6cb"} Oct 08 23:01:07 crc kubenswrapper[4739]: I1008 23:01:07.190753 4739 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3bf5dcfedd4c9dec8898ec762a43ae65848eeb415cef1f252504e16d6dee6cb" Oct 08 23:01:07 crc kubenswrapper[4739]: I1008 23:01:07.190823 4739 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29332741-86l92" Oct 08 23:01:17 crc kubenswrapper[4739]: I1008 23:01:17.300885 4739 generic.go:334] "Generic (PLEG): container finished" podID="a42172b8-81cc-43e2-9733-25b845571bf9" containerID="72933523cc3fde577b36eeeb6f81691b68093fd7babea27d9c72f4f7567f2837" exitCode=0 Oct 08 23:01:17 crc kubenswrapper[4739]: I1008 23:01:17.301050 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7xwsg/must-gather-jh2q7" event={"ID":"a42172b8-81cc-43e2-9733-25b845571bf9","Type":"ContainerDied","Data":"72933523cc3fde577b36eeeb6f81691b68093fd7babea27d9c72f4f7567f2837"} Oct 08 23:01:17 crc kubenswrapper[4739]: I1008 23:01:17.303573 4739 scope.go:117] "RemoveContainer" containerID="72933523cc3fde577b36eeeb6f81691b68093fd7babea27d9c72f4f7567f2837" Oct 08 23:01:18 crc kubenswrapper[4739]: I1008 23:01:18.048478 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7xwsg_must-gather-jh2q7_a42172b8-81cc-43e2-9733-25b845571bf9/gather/0.log" Oct 08 23:01:28 crc kubenswrapper[4739]: I1008 23:01:28.168860 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7xwsg/must-gather-jh2q7"] Oct 08 23:01:28 crc kubenswrapper[4739]: I1008 23:01:28.169547 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-7xwsg/must-gather-jh2q7" podUID="a42172b8-81cc-43e2-9733-25b845571bf9" containerName="copy" containerID="cri-o://91f517e18e4050a09cab149d9deda946f3572e23f8134f1adf603701a2d02355" gracePeriod=2 Oct 08 23:01:28 crc kubenswrapper[4739]: I1008 23:01:28.180415 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7xwsg/must-gather-jh2q7"] Oct 08 23:01:28 crc kubenswrapper[4739]: I1008 23:01:28.418664 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7xwsg_must-gather-jh2q7_a42172b8-81cc-43e2-9733-25b845571bf9/copy/0.log" Oct 08 23:01:28 crc 
kubenswrapper[4739]: I1008 23:01:28.419611 4739 generic.go:334] "Generic (PLEG): container finished" podID="a42172b8-81cc-43e2-9733-25b845571bf9" containerID="91f517e18e4050a09cab149d9deda946f3572e23f8134f1adf603701a2d02355" exitCode=143 Oct 08 23:01:28 crc kubenswrapper[4739]: I1008 23:01:28.746912 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7xwsg_must-gather-jh2q7_a42172b8-81cc-43e2-9733-25b845571bf9/copy/0.log" Oct 08 23:01:28 crc kubenswrapper[4739]: I1008 23:01:28.747284 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7xwsg/must-gather-jh2q7" Oct 08 23:01:28 crc kubenswrapper[4739]: I1008 23:01:28.891258 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a42172b8-81cc-43e2-9733-25b845571bf9-must-gather-output\") pod \"a42172b8-81cc-43e2-9733-25b845571bf9\" (UID: \"a42172b8-81cc-43e2-9733-25b845571bf9\") " Oct 08 23:01:28 crc kubenswrapper[4739]: I1008 23:01:28.891479 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpvbw\" (UniqueName: \"kubernetes.io/projected/a42172b8-81cc-43e2-9733-25b845571bf9-kube-api-access-hpvbw\") pod \"a42172b8-81cc-43e2-9733-25b845571bf9\" (UID: \"a42172b8-81cc-43e2-9733-25b845571bf9\") " Oct 08 23:01:28 crc kubenswrapper[4739]: I1008 23:01:28.897998 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a42172b8-81cc-43e2-9733-25b845571bf9-kube-api-access-hpvbw" (OuterVolumeSpecName: "kube-api-access-hpvbw") pod "a42172b8-81cc-43e2-9733-25b845571bf9" (UID: "a42172b8-81cc-43e2-9733-25b845571bf9"). InnerVolumeSpecName "kube-api-access-hpvbw". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 23:01:28 crc kubenswrapper[4739]: I1008 23:01:28.994510 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpvbw\" (UniqueName: \"kubernetes.io/projected/a42172b8-81cc-43e2-9733-25b845571bf9-kube-api-access-hpvbw\") on node \"crc\" DevicePath \"\""
Oct 08 23:01:29 crc kubenswrapper[4739]: I1008 23:01:29.046352 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a42172b8-81cc-43e2-9733-25b845571bf9-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "a42172b8-81cc-43e2-9733-25b845571bf9" (UID: "a42172b8-81cc-43e2-9733-25b845571bf9"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 23:01:29 crc kubenswrapper[4739]: I1008 23:01:29.096969 4739 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a42172b8-81cc-43e2-9733-25b845571bf9-must-gather-output\") on node \"crc\" DevicePath \"\""
Oct 08 23:01:29 crc kubenswrapper[4739]: I1008 23:01:29.445194 4739 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7xwsg_must-gather-jh2q7_a42172b8-81cc-43e2-9733-25b845571bf9/copy/0.log"
Oct 08 23:01:29 crc kubenswrapper[4739]: I1008 23:01:29.446540 4739 scope.go:117] "RemoveContainer" containerID="91f517e18e4050a09cab149d9deda946f3572e23f8134f1adf603701a2d02355"
Oct 08 23:01:29 crc kubenswrapper[4739]: I1008 23:01:29.446640 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7xwsg/must-gather-jh2q7"
Oct 08 23:01:29 crc kubenswrapper[4739]: I1008 23:01:29.473530 4739 scope.go:117] "RemoveContainer" containerID="72933523cc3fde577b36eeeb6f81691b68093fd7babea27d9c72f4f7567f2837"
Oct 08 23:01:29 crc kubenswrapper[4739]: I1008 23:01:29.836563 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a42172b8-81cc-43e2-9733-25b845571bf9" path="/var/lib/kubelet/pods/a42172b8-81cc-43e2-9733-25b845571bf9/volumes"
Oct 08 23:01:43 crc kubenswrapper[4739]: I1008 23:01:43.044952 4739 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-njpjw"]
Oct 08 23:01:43 crc kubenswrapper[4739]: E1008 23:01:43.045878 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a42172b8-81cc-43e2-9733-25b845571bf9" containerName="gather"
Oct 08 23:01:43 crc kubenswrapper[4739]: I1008 23:01:43.045890 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="a42172b8-81cc-43e2-9733-25b845571bf9" containerName="gather"
Oct 08 23:01:43 crc kubenswrapper[4739]: E1008 23:01:43.045904 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f2f8012-79a5-42e1-b3a8-8c579f108f3f" containerName="keystone-cron"
Oct 08 23:01:43 crc kubenswrapper[4739]: I1008 23:01:43.045910 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f2f8012-79a5-42e1-b3a8-8c579f108f3f" containerName="keystone-cron"
Oct 08 23:01:43 crc kubenswrapper[4739]: E1008 23:01:43.045925 4739 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a42172b8-81cc-43e2-9733-25b845571bf9" containerName="copy"
Oct 08 23:01:43 crc kubenswrapper[4739]: I1008 23:01:43.045931 4739 state_mem.go:107] "Deleted CPUSet assignment" podUID="a42172b8-81cc-43e2-9733-25b845571bf9" containerName="copy"
Oct 08 23:01:43 crc kubenswrapper[4739]: I1008 23:01:43.046114 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f2f8012-79a5-42e1-b3a8-8c579f108f3f" containerName="keystone-cron"
Oct 08 23:01:43 crc kubenswrapper[4739]: I1008 23:01:43.046130 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="a42172b8-81cc-43e2-9733-25b845571bf9" containerName="gather"
Oct 08 23:01:43 crc kubenswrapper[4739]: I1008 23:01:43.046165 4739 memory_manager.go:354] "RemoveStaleState removing state" podUID="a42172b8-81cc-43e2-9733-25b845571bf9" containerName="copy"
Oct 08 23:01:43 crc kubenswrapper[4739]: I1008 23:01:43.047699 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-njpjw"
Oct 08 23:01:43 crc kubenswrapper[4739]: I1008 23:01:43.058201 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-njpjw"]
Oct 08 23:01:43 crc kubenswrapper[4739]: I1008 23:01:43.198451 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05e40c28-ea21-49f3-bba4-ddd097d5d4d7-catalog-content\") pod \"community-operators-njpjw\" (UID: \"05e40c28-ea21-49f3-bba4-ddd097d5d4d7\") " pod="openshift-marketplace/community-operators-njpjw"
Oct 08 23:01:43 crc kubenswrapper[4739]: I1008 23:01:43.198548 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05e40c28-ea21-49f3-bba4-ddd097d5d4d7-utilities\") pod \"community-operators-njpjw\" (UID: \"05e40c28-ea21-49f3-bba4-ddd097d5d4d7\") " pod="openshift-marketplace/community-operators-njpjw"
Oct 08 23:01:43 crc kubenswrapper[4739]: I1008 23:01:43.198617 4739 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4vkj\" (UniqueName: \"kubernetes.io/projected/05e40c28-ea21-49f3-bba4-ddd097d5d4d7-kube-api-access-q4vkj\") pod \"community-operators-njpjw\" (UID: \"05e40c28-ea21-49f3-bba4-ddd097d5d4d7\") " pod="openshift-marketplace/community-operators-njpjw"
Oct 08 23:01:43 crc kubenswrapper[4739]: I1008 23:01:43.300561 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05e40c28-ea21-49f3-bba4-ddd097d5d4d7-catalog-content\") pod \"community-operators-njpjw\" (UID: \"05e40c28-ea21-49f3-bba4-ddd097d5d4d7\") " pod="openshift-marketplace/community-operators-njpjw"
Oct 08 23:01:43 crc kubenswrapper[4739]: I1008 23:01:43.300623 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05e40c28-ea21-49f3-bba4-ddd097d5d4d7-utilities\") pod \"community-operators-njpjw\" (UID: \"05e40c28-ea21-49f3-bba4-ddd097d5d4d7\") " pod="openshift-marketplace/community-operators-njpjw"
Oct 08 23:01:43 crc kubenswrapper[4739]: I1008 23:01:43.300660 4739 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4vkj\" (UniqueName: \"kubernetes.io/projected/05e40c28-ea21-49f3-bba4-ddd097d5d4d7-kube-api-access-q4vkj\") pod \"community-operators-njpjw\" (UID: \"05e40c28-ea21-49f3-bba4-ddd097d5d4d7\") " pod="openshift-marketplace/community-operators-njpjw"
Oct 08 23:01:43 crc kubenswrapper[4739]: I1008 23:01:43.301025 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05e40c28-ea21-49f3-bba4-ddd097d5d4d7-catalog-content\") pod \"community-operators-njpjw\" (UID: \"05e40c28-ea21-49f3-bba4-ddd097d5d4d7\") " pod="openshift-marketplace/community-operators-njpjw"
Oct 08 23:01:43 crc kubenswrapper[4739]: I1008 23:01:43.301048 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05e40c28-ea21-49f3-bba4-ddd097d5d4d7-utilities\") pod \"community-operators-njpjw\" (UID: \"05e40c28-ea21-49f3-bba4-ddd097d5d4d7\") " pod="openshift-marketplace/community-operators-njpjw"
Oct 08 23:01:43 crc kubenswrapper[4739]: I1008 23:01:43.320233 4739 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4vkj\" (UniqueName: \"kubernetes.io/projected/05e40c28-ea21-49f3-bba4-ddd097d5d4d7-kube-api-access-q4vkj\") pod \"community-operators-njpjw\" (UID: \"05e40c28-ea21-49f3-bba4-ddd097d5d4d7\") " pod="openshift-marketplace/community-operators-njpjw"
Oct 08 23:01:43 crc kubenswrapper[4739]: I1008 23:01:43.375967 4739 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-njpjw"
Oct 08 23:01:43 crc kubenswrapper[4739]: I1008 23:01:43.929522 4739 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-njpjw"]
Oct 08 23:01:44 crc kubenswrapper[4739]: I1008 23:01:44.612771 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-njpjw" event={"ID":"05e40c28-ea21-49f3-bba4-ddd097d5d4d7","Type":"ContainerStarted","Data":"81406dae86d0cda600b50120d98bb61e0f20a7de843af613f444a3b8838bc58b"}
Oct 08 23:01:45 crc kubenswrapper[4739]: I1008 23:01:45.626366 4739 generic.go:334] "Generic (PLEG): container finished" podID="05e40c28-ea21-49f3-bba4-ddd097d5d4d7" containerID="5c7cc85e8430af221b63cbd0758762cb47124279807410f5695bea1150d36ce5" exitCode=0
Oct 08 23:01:45 crc kubenswrapper[4739]: I1008 23:01:45.626492 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-njpjw" event={"ID":"05e40c28-ea21-49f3-bba4-ddd097d5d4d7","Type":"ContainerDied","Data":"5c7cc85e8430af221b63cbd0758762cb47124279807410f5695bea1150d36ce5"}
Oct 08 23:01:45 crc kubenswrapper[4739]: I1008 23:01:45.629533 4739 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 08 23:01:47 crc kubenswrapper[4739]: I1008 23:01:47.648768 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-njpjw" event={"ID":"05e40c28-ea21-49f3-bba4-ddd097d5d4d7","Type":"ContainerStarted","Data":"74f085657017be54d90121d9e628fd29091b0fd073c1d0f45769c7d9803e65ae"}
Oct 08 23:01:48 crc kubenswrapper[4739]: I1008 23:01:48.658750 4739 generic.go:334] "Generic (PLEG): container finished" podID="05e40c28-ea21-49f3-bba4-ddd097d5d4d7" containerID="74f085657017be54d90121d9e628fd29091b0fd073c1d0f45769c7d9803e65ae" exitCode=0
Oct 08 23:01:48 crc kubenswrapper[4739]: I1008 23:01:48.659066 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-njpjw" event={"ID":"05e40c28-ea21-49f3-bba4-ddd097d5d4d7","Type":"ContainerDied","Data":"74f085657017be54d90121d9e628fd29091b0fd073c1d0f45769c7d9803e65ae"}
Oct 08 23:01:49 crc kubenswrapper[4739]: I1008 23:01:49.673103 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-njpjw" event={"ID":"05e40c28-ea21-49f3-bba4-ddd097d5d4d7","Type":"ContainerStarted","Data":"e7c3fb02eb8dca054957d95e7ced567a645cd71feb5ea8be015684cf47746780"}
Oct 08 23:01:49 crc kubenswrapper[4739]: I1008 23:01:49.716330 4739 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-njpjw" podStartSLOduration=3.119129019 podStartE2EDuration="6.716302346s" podCreationTimestamp="2025-10-08 23:01:43 +0000 UTC" firstStartedPulling="2025-10-08 23:01:45.62926722 +0000 UTC m=+4405.454652970" lastFinishedPulling="2025-10-08 23:01:49.226440547 +0000 UTC m=+4409.051826297" observedRunningTime="2025-10-08 23:01:49.702815975 +0000 UTC m=+4409.528201745" watchObservedRunningTime="2025-10-08 23:01:49.716302346 +0000 UTC m=+4409.541688106"
Oct 08 23:01:53 crc kubenswrapper[4739]: I1008 23:01:53.377001 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-njpjw"
Oct 08 23:01:53 crc kubenswrapper[4739]: I1008 23:01:53.377624 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-njpjw"
Oct 08 23:01:53 crc kubenswrapper[4739]: I1008 23:01:53.478108 4739 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-njpjw"
Oct 08 23:02:03 crc kubenswrapper[4739]: I1008 23:02:03.460964 4739 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-njpjw"
Oct 08 23:02:03 crc kubenswrapper[4739]: I1008 23:02:03.545818 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-njpjw"]
Oct 08 23:02:03 crc kubenswrapper[4739]: I1008 23:02:03.840175 4739 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-njpjw" podUID="05e40c28-ea21-49f3-bba4-ddd097d5d4d7" containerName="registry-server" containerID="cri-o://e7c3fb02eb8dca054957d95e7ced567a645cd71feb5ea8be015684cf47746780" gracePeriod=2
Oct 08 23:02:04 crc kubenswrapper[4739]: I1008 23:02:04.348789 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-njpjw"
Oct 08 23:02:04 crc kubenswrapper[4739]: I1008 23:02:04.399957 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4vkj\" (UniqueName: \"kubernetes.io/projected/05e40c28-ea21-49f3-bba4-ddd097d5d4d7-kube-api-access-q4vkj\") pod \"05e40c28-ea21-49f3-bba4-ddd097d5d4d7\" (UID: \"05e40c28-ea21-49f3-bba4-ddd097d5d4d7\") "
Oct 08 23:02:04 crc kubenswrapper[4739]: I1008 23:02:04.400133 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05e40c28-ea21-49f3-bba4-ddd097d5d4d7-utilities\") pod \"05e40c28-ea21-49f3-bba4-ddd097d5d4d7\" (UID: \"05e40c28-ea21-49f3-bba4-ddd097d5d4d7\") "
Oct 08 23:02:04 crc kubenswrapper[4739]: I1008 23:02:04.400373 4739 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05e40c28-ea21-49f3-bba4-ddd097d5d4d7-catalog-content\") pod \"05e40c28-ea21-49f3-bba4-ddd097d5d4d7\" (UID: \"05e40c28-ea21-49f3-bba4-ddd097d5d4d7\") "
Oct 08 23:02:04 crc kubenswrapper[4739]: I1008 23:02:04.401052 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05e40c28-ea21-49f3-bba4-ddd097d5d4d7-utilities" (OuterVolumeSpecName: "utilities") pod "05e40c28-ea21-49f3-bba4-ddd097d5d4d7" (UID: "05e40c28-ea21-49f3-bba4-ddd097d5d4d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 23:02:04 crc kubenswrapper[4739]: I1008 23:02:04.410786 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05e40c28-ea21-49f3-bba4-ddd097d5d4d7-kube-api-access-q4vkj" (OuterVolumeSpecName: "kube-api-access-q4vkj") pod "05e40c28-ea21-49f3-bba4-ddd097d5d4d7" (UID: "05e40c28-ea21-49f3-bba4-ddd097d5d4d7"). InnerVolumeSpecName "kube-api-access-q4vkj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 23:02:04 crc kubenswrapper[4739]: I1008 23:02:04.454933 4739 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05e40c28-ea21-49f3-bba4-ddd097d5d4d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "05e40c28-ea21-49f3-bba4-ddd097d5d4d7" (UID: "05e40c28-ea21-49f3-bba4-ddd097d5d4d7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 23:02:04 crc kubenswrapper[4739]: I1008 23:02:04.502575 4739 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05e40c28-ea21-49f3-bba4-ddd097d5d4d7-utilities\") on node \"crc\" DevicePath \"\""
Oct 08 23:02:04 crc kubenswrapper[4739]: I1008 23:02:04.502616 4739 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05e40c28-ea21-49f3-bba4-ddd097d5d4d7-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 08 23:02:04 crc kubenswrapper[4739]: I1008 23:02:04.502631 4739 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4vkj\" (UniqueName: \"kubernetes.io/projected/05e40c28-ea21-49f3-bba4-ddd097d5d4d7-kube-api-access-q4vkj\") on node \"crc\" DevicePath \"\""
Oct 08 23:02:04 crc kubenswrapper[4739]: I1008 23:02:04.853892 4739 generic.go:334] "Generic (PLEG): container finished" podID="05e40c28-ea21-49f3-bba4-ddd097d5d4d7" containerID="e7c3fb02eb8dca054957d95e7ced567a645cd71feb5ea8be015684cf47746780" exitCode=0
Oct 08 23:02:04 crc kubenswrapper[4739]: I1008 23:02:04.853951 4739 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-njpjw"
Oct 08 23:02:04 crc kubenswrapper[4739]: I1008 23:02:04.853942 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-njpjw" event={"ID":"05e40c28-ea21-49f3-bba4-ddd097d5d4d7","Type":"ContainerDied","Data":"e7c3fb02eb8dca054957d95e7ced567a645cd71feb5ea8be015684cf47746780"}
Oct 08 23:02:04 crc kubenswrapper[4739]: I1008 23:02:04.854066 4739 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-njpjw" event={"ID":"05e40c28-ea21-49f3-bba4-ddd097d5d4d7","Type":"ContainerDied","Data":"81406dae86d0cda600b50120d98bb61e0f20a7de843af613f444a3b8838bc58b"}
Oct 08 23:02:04 crc kubenswrapper[4739]: I1008 23:02:04.854159 4739 scope.go:117] "RemoveContainer" containerID="e7c3fb02eb8dca054957d95e7ced567a645cd71feb5ea8be015684cf47746780"
Oct 08 23:02:04 crc kubenswrapper[4739]: I1008 23:02:04.878466 4739 scope.go:117] "RemoveContainer" containerID="74f085657017be54d90121d9e628fd29091b0fd073c1d0f45769c7d9803e65ae"
Oct 08 23:02:04 crc kubenswrapper[4739]: I1008 23:02:04.909934 4739 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-njpjw"]
Oct 08 23:02:04 crc kubenswrapper[4739]: I1008 23:02:04.920184 4739 scope.go:117] "RemoveContainer" containerID="5c7cc85e8430af221b63cbd0758762cb47124279807410f5695bea1150d36ce5"
Oct 08 23:02:04 crc kubenswrapper[4739]: I1008 23:02:04.920333 4739 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-njpjw"]
Oct 08 23:02:04 crc kubenswrapper[4739]: I1008 23:02:04.949397 4739 scope.go:117] "RemoveContainer" containerID="e7c3fb02eb8dca054957d95e7ced567a645cd71feb5ea8be015684cf47746780"
Oct 08 23:02:04 crc kubenswrapper[4739]: E1008 23:02:04.949974 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7c3fb02eb8dca054957d95e7ced567a645cd71feb5ea8be015684cf47746780\": container with ID starting with e7c3fb02eb8dca054957d95e7ced567a645cd71feb5ea8be015684cf47746780 not found: ID does not exist" containerID="e7c3fb02eb8dca054957d95e7ced567a645cd71feb5ea8be015684cf47746780"
Oct 08 23:02:04 crc kubenswrapper[4739]: I1008 23:02:04.950018 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7c3fb02eb8dca054957d95e7ced567a645cd71feb5ea8be015684cf47746780"} err="failed to get container status \"e7c3fb02eb8dca054957d95e7ced567a645cd71feb5ea8be015684cf47746780\": rpc error: code = NotFound desc = could not find container \"e7c3fb02eb8dca054957d95e7ced567a645cd71feb5ea8be015684cf47746780\": container with ID starting with e7c3fb02eb8dca054957d95e7ced567a645cd71feb5ea8be015684cf47746780 not found: ID does not exist"
Oct 08 23:02:04 crc kubenswrapper[4739]: I1008 23:02:04.950056 4739 scope.go:117] "RemoveContainer" containerID="74f085657017be54d90121d9e628fd29091b0fd073c1d0f45769c7d9803e65ae"
Oct 08 23:02:04 crc kubenswrapper[4739]: E1008 23:02:04.950421 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74f085657017be54d90121d9e628fd29091b0fd073c1d0f45769c7d9803e65ae\": container with ID starting with 74f085657017be54d90121d9e628fd29091b0fd073c1d0f45769c7d9803e65ae not found: ID does not exist" containerID="74f085657017be54d90121d9e628fd29091b0fd073c1d0f45769c7d9803e65ae"
Oct 08 23:02:04 crc kubenswrapper[4739]: I1008 23:02:04.950459 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74f085657017be54d90121d9e628fd29091b0fd073c1d0f45769c7d9803e65ae"} err="failed to get container status \"74f085657017be54d90121d9e628fd29091b0fd073c1d0f45769c7d9803e65ae\": rpc error: code = NotFound desc = could not find container \"74f085657017be54d90121d9e628fd29091b0fd073c1d0f45769c7d9803e65ae\": container with ID starting with 74f085657017be54d90121d9e628fd29091b0fd073c1d0f45769c7d9803e65ae not found: ID does not exist"
Oct 08 23:02:04 crc kubenswrapper[4739]: I1008 23:02:04.950486 4739 scope.go:117] "RemoveContainer" containerID="5c7cc85e8430af221b63cbd0758762cb47124279807410f5695bea1150d36ce5"
Oct 08 23:02:04 crc kubenswrapper[4739]: E1008 23:02:04.950796 4739 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c7cc85e8430af221b63cbd0758762cb47124279807410f5695bea1150d36ce5\": container with ID starting with 5c7cc85e8430af221b63cbd0758762cb47124279807410f5695bea1150d36ce5 not found: ID does not exist" containerID="5c7cc85e8430af221b63cbd0758762cb47124279807410f5695bea1150d36ce5"
Oct 08 23:02:04 crc kubenswrapper[4739]: I1008 23:02:04.950845 4739 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c7cc85e8430af221b63cbd0758762cb47124279807410f5695bea1150d36ce5"} err="failed to get container status \"5c7cc85e8430af221b63cbd0758762cb47124279807410f5695bea1150d36ce5\": rpc error: code = NotFound desc = could not find container \"5c7cc85e8430af221b63cbd0758762cb47124279807410f5695bea1150d36ce5\": container with ID starting with 5c7cc85e8430af221b63cbd0758762cb47124279807410f5695bea1150d36ce5 not found: ID does not exist"
Oct 08 23:02:05 crc kubenswrapper[4739]: I1008 23:02:05.833298 4739 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05e40c28-ea21-49f3-bba4-ddd097d5d4d7" path="/var/lib/kubelet/pods/05e40c28-ea21-49f3-bba4-ddd097d5d4d7/volumes"
Oct 08 23:02:35 crc kubenswrapper[4739]: I1008 23:02:35.080120 4739 scope.go:117] "RemoveContainer" containerID="07836bf39f714842b0ee7ce311ee0da88ce783fd64988b65d5ba17dfa1a2eedb"
Oct 08 23:03:21 crc kubenswrapper[4739]: I1008 23:03:21.766911 4739 patch_prober.go:28] interesting pod/machine-config-daemon-dwvs2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 08 23:03:21 crc kubenswrapper[4739]: I1008 23:03:21.770135 4739 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwvs2" podUID="9707b708-016c-4e06-86db-0332e2ca37db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"